Copy data from Azure Data Lake Store to Azure Analysis Services database

  • Question

  • Hi,

    Is there any way to copy data from Azure Data Lake to a database in Azure Analysis Services using Azure Data Factory? I am trying to use a copy activity for this task, but I don't know how to specify the Analysis Services database as the destination of the output dataset.

    Thanks

    Jai



    • Edited by i_JaiSoni Wednesday, November 30, 2016 12:37 PM
    Wednesday, November 30, 2016 12:35 PM

All replies

  • technically you cannot load/push data into Azure Analysis Services; you always have to run a Process/Refresh command on the Analysis Services server itself, which then refreshes the data from predefined/existing sources

    for ADF, there is no connector [yet]
    if there were one, you could only trigger a Process/Refresh command against Azure Analysis Services from there

    for the time being, a possible workaround might be to use a custom .NET activity which runs the command on AAS
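
    as an illustration, such a Process/Refresh command for a tabular 1200 model can be expressed as a TMSL script roughly like this (the database name is just a placeholder, not from this thread):

    {
      "refresh": {
        "type": "full",
        "objects": [
          { "database": "MyAasDatabase" }
        ]
      }
    }

    you would execute this script against the AAS server, e.g. from SSMS or from code via AMO's Server.Execute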

    hth,
    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Wednesday, November 30, 2016 8:12 PM
  • You could store the data in Azure SQL Database and connect Analysis Services to it.
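    As a sketch of what such a staging output dataset could look like in ADF (all names here are placeholders, assuming an existing Azure SQL linked service):

    {
      "name": "AzureSqlOutput",
      "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": "AzureSqlLinkedService",
        "typeProperties": {
          "tableName": "StagingTable"
        },
        "availability": {
          "frequency": "Day",
          "interval": 1
        }
      }
    }

    Analysis Services would then use that SQL database as its data source.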
    Thursday, December 1, 2016 3:03 PM
  • For data stores not in the Data Factory connector support list, you can write a custom activity to access your data.

    Here is the doc: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-use-custom-activities
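
    For illustration, a custom .NET activity is referenced from a pipeline roughly like this (the assembly, entry point, and linked service names are placeholders; the activity runs on an Azure Batch or HDInsight compute linked service, as described in the doc):

    {
      "name": "RunCustomActivityPipeline",
      "properties": {
        "activities": [
          {
            "name": "MyCustomActivity",
            "type": "DotNetActivity",
            "linkedServiceName": "AzureBatchLinkedService",
            "typeProperties": {
              "assemblyName": "MyDotNetActivity.dll",
              "entryPoint": "MyNamespace.MyDotNetActivity",
              "packageLinkedService": "AzureStorageLinkedService",
              "packageFile": "customactivitycontainer/MyDotNetActivity.zip"
            },
            "policy": {
              "timeout": "01:00:00",
              "retry": 2
            }
          }
        ],
        "start": "2016-12-01T00:00:00Z",
        "end": "2016-12-02T00:00:00Z"
      }
    }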

    Thanks,

    Charles

    Monday, December 5, 2016 6:22 AM
  • Thanks, Gerhard Brueckl. I'll look into it.
    Monday, December 5, 2016 12:27 PM
  • Thanks, Alexander.

    Yes, I can. But I want to perform some transformations on the data using U-SQL. That's why I am using Data Lake Store.

    Monday, December 5, 2016 12:28 PM
  • Thanks, Charles.

    I didn't know that we could do that using a custom activity. I'll try it with a custom activity and will let you know.

    -Jai

    Monday, December 5, 2016 12:29 PM
  • when building custom .NET activities, this tool might also be interesting in order to speed up your development:

    https://github.com/gbrueckl/Azure.DataFactory.CustomActivityDebugger

    it allows you to debug your activity locally without having to upload everything after each change

    hth,
    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Monday, December 5, 2016 2:34 PM
  • What type of data store should I select for the Analysis Services database to create a linked service or output dataset?

    How do I create an output dataset for it?

    When I create a new dataset or linked service, it asks me to select the type of data store, like Azure Blob Storage, Azure DocumentDB, Azure SQL Data Warehouse, etc. But I don't see anything for SSAS.

    Please help.

    Thanks

    -Jai


    • Edited by i_JaiSoni Wednesday, December 7, 2016 12:59 PM
    Wednesday, December 7, 2016 12:01 PM
  • both have to be of type custom and cannot be created with the wizard - here are some examples:

    {
      "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-10-01/Microsoft.DataFactory.LinkedService.json",
      "name": "MyCustomFTP",
      "properties": {
        "type": "CustomDataSource",
        "typeProperties": {
          "host": "<config>",
          "username": "<config>",
          "password": "<config>",
          "enableSSL": true
        }
      }
    }

    {
      "$schema": "http://datafactories.schema.management.azure.com/schemas/2015-10-01/Microsoft.DataFactory.Table.json",
      "name": "MyAdfDataset",
      "properties": {
        "type": "CustomDataset",
        "linkedServiceName": "MyCustomFTP",
        "typeProperties": {
          "datetimeFormat": "dd-MMM-yyyy HH:mm:ss",
          "locale": "en-US"
        },
        "availability": {
          "frequency": "Day",
          "interval": 1,
          "style": "EndOfInterval"
        },
        "external": true
      }
    }

    hth,
    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Monday, December 12, 2016 8:18 AM