TotalMemoryLimit

    Question

  • Hi everyone.

    I have a question about this configuration value.

    I know that it refers to the total physical memory available in the system. So if I have, for example, 32 GB of memory and I set this value to 80, I suppose the limit is 25.6 GB. My doubt is: is 25.6 GB the maximum amount of memory that AS will use before it begins to free shrinkable memory, or is 25.6 GB the maximum memory used by the whole system before AS begins to free shrinkable memory?

    I have a system in which AS is using 12 GB of memory, and TotalMemoryLimit and LowMemoryLimit are both well above 12 GB. Besides, the system has 15 GB of free memory, but AS is continuously reducing the shrinkable memory, and I think this behavior is hurting my AS performance.

    I always see high Evictions/sec performance counter values.

    Thanks.




    Wednesday, February 10, 2010 1:11 PM


All replies

  • Hi there,

    Here is some further information that may help. It is my understanding that Analysis Services will consume up to 25.6 GB of memory, but not past total available memory. So if you have only 24 GB of free memory, it will hit that limit and start paging.

    http://greg.blogs.sqlsentry.net/2009/06/analysis-services-memory-limits-part-ii.html

    Performance Counters and MDX Studio might help with performance investigations.

    cheers,
    Andrew
    Andrew Sears, T4G Limited, http://www.performancepointing.com
    Wednesday, February 10, 2010 7:05 PM
  • It is my understanding that Analysis Services will consume up to 25.6 GB of memory, but not past total available memory. 
    That's not quite correct: once it gets past TotalMemoryLimit it goes into "panic" mode and aggressively cleans shrinkable memory (so it may temporarily go over the 25.6 GB limit, but it will do all it can to get back under that limit). The memory limits all refer to the memory used by SSAS; it does not look to see whether other processes have grabbed large amounts of RAM. If you did have other processes consuming large amounts of RAM, you would want to lower the memory settings for SSAS to avoid paging to disk.



    The following is from http://msdn.microsoft.com/en-us/library/cc966526.aspx

    Memory

    SSAS has a special memory “cleaner” background thread that constantly determines if it needs to clean up memory. The cleaner looks at the amount of memory used. The following basic processes are used by the cleaner to control amount of physical memory used by Analysis Server:

    • If the memory used is above the value set in the TotalMemoryLimit property, the cleaner cleans up to this value.  

    • If the memory used is under the value set in the LowMemoryLimit property, the cleaner does nothing.

    • If the memory used is between the values set in the LowMemoryLimit and the TotalMemoryLimit properties, the cleaner cleans memory on a need-to-use basis.

    If the value specified in any of these properties is between 0 and 100, the value is treated by SSAS as a percentage of total physical memory. If the value specified is greater than 100, the value is treated by SSAS as an absolute memory value (in bytes). Note that when Analysis Server is processing, if it requires additional memory above the value specified in TotalMemoryLimit, it will try to reserve that amount, regardless of the TotalMemoryLimit value.
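
    To make the rules above concrete, here is a minimal Python sketch (an illustration of the behaviour quoted above, not SSAS code; the function names are invented for this example):

        def effective_limit(setting, total_physical_bytes):
            """Interpret a LowMemoryLimit / TotalMemoryLimit setting.
            Values from 0 to 100 are a percentage of total physical memory;
            values above 100 are an absolute number of bytes."""
            if 0 <= setting <= 100:
                return total_physical_bytes * setting / 100.0
            return setting

        def cleaner_action(memory_used, low_limit, total_limit):
            """Rough model of the cleaner thread's decision described above."""
            if memory_used > total_limit:
                return "clean aggressively back down to TotalMemoryLimit"
            if memory_used < low_limit:
                return "do nothing"
            return "clean shrinkable memory on a need-to-use basis"

        # Numbers from this thread: 32 GB of RAM, LowMemoryLimit = 70, TotalMemoryLimit = 80
        total_ram = 32 * 1024 ** 3
        low = effective_limit(70, total_ram)    # ~22.4 GB
        high = effective_limit(80, total_ram)   # ~25.6 GB
        print(cleaner_action(12 * 1024 ** 3, low, high))  # 12 GB in use -> "do nothing"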


    http://geekswithblogs.net/darrengosbell - please mark correct answers
    Thursday, February 11, 2010 1:53 AM
    Moderator
  • Thank you, Darren.

    That resolves my doubt about this memory setting.

    But I am still confused about the behavior of my AS under my configuration.

    System memory: 32 GB
    SQL Server (database engine): 5 GB max
    No other services or applications consuming memory.

    LowMemoryLimit: 70 (I thought it referred to the global memory of the system).
    TotalMemoryLimit: 80

    With this configuration, I supposed that AS would not try to free memory, because it will never reach 70% of 32 GB = 22.4 GB.

    But if I look at the Evictions/sec counter, it always shows high values (an average of 30 or 40, with peaks of 200). It seems that every time there are Inserts/sec there are also Evictions/sec, as if AS did not have enough space for the data it needs to load into the cache, even though it has not reached the LowMemoryLimit and there is plenty of free physical memory in the system.

    The only unusual value I am observing is about 23 GB of system cache shown in Task Manager.

    Any ideas?

    Thanks again.

    Thursday, February 11, 2010 8:11 AM
  • Sounds like you might be hitting an issue with the file system cache. It was never really a problem on 32-bit systems, as the cache was inherently limited to 2 GB, but on large 64-bit systems it can grow quite large. Chris Webb has an article with some good links here http://cwebbbi.spaces.live.com/blog/cns!7B84B0F2C239489A!4316.entry and Greg Gonzalez has some good background reading here http://greg.blogs.sqlsentry.net/2010/02/analysis-services-and-windows-file.html
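
    If you want to confirm whether the file system cache on that server is running without a hard cap, the Windows API GetSystemFileCacheSize reports the cache's configured working-set limits and whether hard limits are enabled. Here is a minimal Python/ctypes sketch of such a check (a diagnostic illustration only, run on the SSAS machine itself; it is not something SSAS exposes):

        import ctypes
        from ctypes import wintypes

        kernel32 = ctypes.WinDLL("kernel32", use_last_error=True)

        min_size = ctypes.c_size_t()
        max_size = ctypes.c_size_t()
        flags = wintypes.DWORD()

        # GetSystemFileCacheSize returns the minimum/maximum working-set size of
        # the system file cache plus flags saying whether those limits are hard.
        # With no hard maximum, the cache on a 64-bit box is free to grow and
        # compete with SSAS for physical memory.
        if not kernel32.GetSystemFileCacheSize(ctypes.byref(min_size),
                                               ctypes.byref(max_size),
                                               ctypes.byref(flags)):
            raise ctypes.WinError(ctypes.get_last_error())

        print("file cache min:", min_size.value,
              "max:", max_size.value,
              "flags:", flags.value)

    A flags value of 0 indicates no hard limits are enforced, which would be consistent with the cache growing to the 23 GB you are seeing in Task Manager; the articles linked above give the background on this behaviour.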
    http://geekswithblogs.net/darrengosbell - please mark correct answers
    • Marked as answer by jvinualzar Friday, February 12, 2010 8:18 AM
    Thursday, February 11, 2010 12:00 PM
    Moderator
  • Thank you, Darren.

    I will look in that direction and report back.

    Friday, February 12, 2010 8:18 AM