Azure SSAS memory

  • Question

  • Hi

    I am checking the pricing for Azure SSAS and see that the S0 tier offers 10 GB of capacity, which I believe is RAM. My understanding is that any SSAS model deployed on Azure stores its entire dataset in RAM in compressed form.

    Can someone let me know what this 10 GB capacity is equivalent to in relational database size? (I mean, if my database is 1 TB, how large would it be in compressed form on Azure SSAS?)

    


    Monday, September 30, 2019 11:28 AM

Answers

  • Hello,

    Apologies for the delayed response.

    In one documented example, 1 TB of source data fit into a 350 GB Tabular model.

    From that write-up: 1 TB of TPC-DS source data fit into a 350 GB Tabular model. This is not a sensational compression ratio, but the TPC-DS tables are rather wide and not optimized for column-based compression. Still, smaller models are easier to handle, so I looked for low-hanging fruit to reduce the model size and optimize the data import.

    For more details, refer to “Building an Azure Analysis Services Model on Top of Azure Blob Storage”.
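
    As a rough illustration of that data point, here is a minimal Python sketch for sizing estimates. The ~2.9:1 ratio comes from the TPC-DS example above, and the 20% processing/query headroom is an assumption, not a fixed rule; real compression varies widely with column cardinality and data types.

        # Rough sizing sketch based on the TPC-DS data point above
        # (1 TB source -> ~350 GB model, i.e. roughly a 2.9:1 ratio).

        def estimate_model_size_gb(source_gb: float,
                                   compression_ratio: float = 1024 / 350) -> float:
            """Estimate the in-memory Tabular model size for a given source size."""
            return source_gb / compression_ratio

        def fits_in_tier(model_gb: float, tier_capacity_gb: float,
                         headroom: float = 0.2) -> bool:
            """Check whether the model fits a tier's memory, keeping some
            headroom for processing and query workspace (assumed at 20%)."""
            return model_gb <= tier_capacity_gb * (1 - headroom)

        source_gb = 1024  # a 1 TB relational database
        model_gb = estimate_model_size_gb(source_gb)
        print(f"Estimated model size: {model_gb:.0f} GB")         # ~350 GB
        print("Fits in S0 (10 GB)?", fits_in_tier(model_gb, 10))  # False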

    Hope this helps.

    Monday, October 7, 2019 11:05 AM

All replies

  • Hello,

    How is data stored in Azure Analysis Services?

    Azure Analysis Services uses Azure Blob storage to persist data and metadata for Analysis Services databases. Data files within Blob storage are encrypted using Azure Blob Server-Side Encryption (SSE). When using DirectQuery mode, only metadata is stored; the actual data is accessed over an encrypted protocol from the data source at query time.

    For more details, refer to “Azure Analysis Services – Your data is secure”.
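
    To make the "only metadata" point concrete, here is a heavily abbreviated sketch of a DirectQuery model definition (TMSL expressed as a Python dict; the database, table, and data source names are hypothetical). Everything in it is metadata: the partition points at a source query, and no rows are persisted.

        import json

        # Abbreviated DirectQuery model definition (TMSL) with hypothetical
        # names. The partition holds only a source query; data stays in the
        # source and is fetched at query time.
        directquery_model = {
            "createOrReplace": {
                "object": {"database": "SalesModel"},
                "database": {
                    "name": "SalesModel",
                    "compatibilityLevel": 1400,
                    "model": {
                        "defaultMode": "directQuery",  # no imported data
                        "tables": [{
                            "name": "FactSales",
                            "partitions": [{
                                "name": "FactSales",
                                "source": {
                                    "type": "query",
                                    "query": "SELECT * FROM dbo.FactSales",
                                    "dataSource": "SqlSource"
                                }
                            }]
                        }]
                    }
                }
            }
        }
        print(json.dumps(directquery_model, indent=2))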

    Analysis Services pre-allocates a modest amount of memory at startup so requests can be handled immediately. Additional memory is allocated as query and processing workloads increase. By specifying configuration settings, you can control the thresholds at which memory is released.

    For more details, refer to “Memory properties”.
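
    As an illustration of the convention those properties use, here is a small Python sketch. Per the “Memory properties” documentation, a value of 100 or below is read as a percentage of total memory, while a larger value is read as an absolute byte count. The 65 and 80 values shown are typical documented defaults and are assumed here, so check your instance's actual settings.

        # Resolve an Analysis Services memory property to bytes.
        # Convention: values <= 100 are percentages of total memory;
        # values above 100 are absolute byte counts.

        def memory_limit_bytes(setting: float, total_memory_bytes: int) -> float:
            if setting <= 100:
                return total_memory_bytes * (setting / 100.0)
            return setting  # already an absolute byte count

        total = 10 * 1024**3  # e.g. the 10 GB of an S0 instance

        # Assumed typical defaults: cleanup starts at LowMemoryLimit and
        # becomes more aggressive at TotalMemoryLimit.
        for name, value in [("LowMemoryLimit", 65), ("TotalMemoryLimit", 80)]:
            gb = memory_limit_bytes(value, total) / 1024**3
            print(f"{name} = {value} -> threshold at {gb:.1f} GB")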

    You can use the Azure Pricing Calculator to compare each tier's capacity against your estimated compressed model size.

    Hope this helps.

    ----------------------------------------------------------------------------------------

    Do click on "Mark as Answer" and upvote the post that helps you; this can be beneficial to other community members.

    Tuesday, October 1, 2019 6:33 AM
  • Thanks for your reply, but the threads you linked still did not answer my question: if I have a 1 TB relational database and I create a Tabular model based on all the dimension and fact tables from that database, what would the size of the Tabular model be?
    Tuesday, October 1, 2019 7:41 AM
  • Hello,

    Just checking in to see if the above answer helped. If it answers your query, do click “Mark as Answer” and upvote it. And if you have any further questions, do let us know.

    Friday, October 11, 2019 8:51 AM
  • Hello,

    Following up to see if the above suggestion was helpful. If you have any further questions, do let us know.

    Monday, October 14, 2019 8:25 AM