SSAS Tabular Processing - How to control memory during processing?

  • Question

  • Hello All,

    I have a partition in an SSAS Tabular model that is 20 GB in size. While this partition is being processed, memory usage climbs to 120 GB.

    Is there any option to cap the processing memory at 50 GB so that the remaining memory stays available for browsing?

    Processing can take more time for us, but it should not occupy all of the server's memory.

    Appreciate your suggestions.

    Thanks,


    Saturday, May 9, 2020 9:02 PM

All replies

  • Hi,

    I don't think there is an option to control the cube processing memory limit directly. There are three memory settings for SSAS Tabular (a sketch of where they are set follows the list).

    • Low memory defaults to 65% of the physical memory on the server. When the low memory limit is exceeded, the SSAS cleaner begins to remove things from cache and looks for opportunities to free up memory.
    • High (or total) memory defaults to 80% of the physical memory. When the total memory limit is exceeded, the cleaner more aggressively removes items from cache and does whatever it can to reduce memory use to below the configured limit.
    • VertiPaq (technically the xVelocity in-memory storage engine) memory defaults to 60% of the physical memory and is further defined by SSAS paging policies. In Tabular mode only, this setting defines the memory limits specific to the in-memory storage engine.
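
    For reference, these limits surface as instance-wide server properties. Below is a rough sketch of the relevant entries as they appear in SSMS (Analysis Server Properties, with "Show Advanced" enabled) or in msmdsrv.ini; values of 100 or less are read as a percentage of physical memory, larger values as bytes. HardMemoryLimit and VertiPaqPagingPolicy are included for completeness; the defaults shown are the documented ones.

        Memory\LowMemoryLimit          65
        Memory\TotalMemoryLimit        80
        Memory\HardMemoryLimit          0    (0 = midway between TotalMemoryLimit and physical RAM)
        Memory\VertiPaqMemoryLimit     60
        Memory\VertiPaqPagingPolicy     1    (1 = allow paging of model data to the OS page file)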

    It looks like you need to fine-tune your cube processing. Here is a white paper with some steps to improve SSAS performance and reduce memory consumption:

    https://www.sentryone.com/white-papers/3-hacks-to-improve-ssas-memory

    Regards

    Ram


    http://uk.linkedin.com/in/ramjaddu

    Saturday, May 9, 2020 9:44 PM
  • So the short answer is that there is no direct setting to control how much memory is used during processing. Therefore you will probably need to change your partition design to achieve this.

    Processing is done transactionally, which means the engine keeps the current copy of your partition online, processes a second copy alongside it, and then switches the two over. So it will need at least a second 20 GB plus some working space. I don't know why it's using 120 GB in your case; that seems quite excessive, but it probably depends on how many rows you have in your partition and how wide those rows are. By default SSAS will read and compress the rows in 8-million-row segments. So if your partition only has around 8-10 million rows, but they are really wide with lots of large, highly unique columns, that might explain this sort of memory usage.

    Probably the simplest fix would be to create two smaller partitions and then set maxParallelism to 1 when you process; that should hopefully roughly halve the peak memory usage (see the TMSL sketch below).
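
    A minimal TMSL sketch of that approach, assuming the table has already been split into two partitions (the database, table, and partition names here are hypothetical); paste it into an XMLA query window in SSMS and execute:

        {
          "sequence": {
            "maxParallelism": 1,
            "operations": [
              {
                "refresh": {
                  "type": "full",
                  "objects": [
                    { "database": "MyTabularDB", "table": "FactSales", "partition": "FactSales_Part1" },
                    { "database": "MyTabularDB", "table": "FactSales", "partition": "FactSales_Part2" }
                  ]
                }
              }
            ]
          }
        }

    With maxParallelism set to 1 the two refreshes run one after the other, so only one partition's worth of rows is being read and compressed at any time. The trade-off is a longer total processing window, which you said was acceptable.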


    http://darren.gosbell.com - please mark correct answers

    Sunday, May 10, 2020 2:04 AM
  • The memory settings of SSAS are server-wide; we cannot set them for particular partitions.

    I was also wondering why it gets to 120 GB, since processing should only need one extra copy of the partition plus some working space.

    Maybe we could look at decreasing the model or partition size.

    See if these tips help: Reduce the Size of an Analysis Services Tabular Model – Part 1



    Monday, May 11, 2020 9:41 AM