On-prem SSAS to Data Lake Store?

  • Question

  • What would be the best way to maintain a data flow from an on-premises SSAS instance to Azure Data Lake Store? Unfortunately, because there is an application on top of SSAS that writes back to the cubes, the extract has to originate from Analysis Services itself, not from the sources SSAS is built from. Any good ideas on how to manage this data flow, especially incremental loads?
    Monday, June 10, 2019 2:41 PM

All replies

  • Could you let us know a bit more about how the application writes back to the cube?

    Thanks, Himanshu

    Thursday, June 20, 2019 2:30 AM
  • Hi,

    You can refer to this article on writing the SQL or MDX: https://www.sqlshack.com/effectively-extract-data-from-olap-cube-by-relying-upon-tsql/
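    A minimal T-SQL sketch of that approach, assuming a linked server (here named SSAS_LINK, a placeholder) has already been configured against the Analysis Services instance; the cube name and dimension/measure identifiers below are illustrative, not from this thread:

```sql
-- Run an MDX query against the cube through a linked server and
-- surface the result as a relational rowset that SSIS can consume.
-- SSAS_LINK, [Adventure Works], and the member names are placeholders.
SELECT *
FROM OPENQUERY(SSAS_LINK,
    'SELECT [Measures].[Sales Amount] ON COLUMNS,
            [Date].[Calendar].[Date].MEMBERS ON ROWS
     FROM   [Adventure Works]
     WHERE  ([Date].[Calendar Year].&[2019])');
```

    Slicing on a date member in the MDX WHERE clause, as above, is one simple way to scope each extract to a recent period for incremental loads.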

    You can use this approach to load data to data lake store:

    1. Create an SSIS package with an Execute SQL Task to extract summary data from the SSAS cube and load it to a folder location.


    2. Then use Azure Data Factory to copy the data to Azure Data Lake Store with the ADF Copy activity.
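    Step 2 can be sketched as a Copy activity definition along these lines; the dataset names are placeholders, and the assumption is that the SSIS extract lands as files that an integration runtime can read:

```json
{
  "name": "CopyCubeExtractToLake",
  "type": "Copy",
  "inputs":  [ { "referenceName": "OnPremExtractFiles", "type": "DatasetReference" } ],
  "outputs": [ { "referenceName": "LakeStoreFolder", "type": "DatasetReference" } ],
  "typeProperties": {
    "source": { "type": "FileSystemSource", "recursive": true },
    "sink":   { "type": "AzureDataLakeStoreSink", "copyBehavior": "PreserveHierarchy" }
  }
}
```

    For incremental loads, the pipeline could be triggered on a schedule so that each run picks up only the files produced by the latest SSIS extract.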

    Let me know in case you need more details.


    Regards,
    Sriharsh.
    Blog: https://sriharshadari.com/, Please mark "Propose As Answer" if my answer helped



    • Edited by Srihad Tuesday, July 16, 2019 2:14 AM
    Thursday, July 11, 2019 9:42 PM