SSAS 2008 R2 - memory usage

  • Question

  • Server exists solely for SSAS ... has 400 GB of memory, yet MSMDSRV never uses more than 20 GB. Do I need to set the PreAllocate amount to take advantage of the available server memory?

    Friday, August 16, 2013 1:19 PM
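    For context: PreAllocate is specified in the <Memory> section of msmdsrv.ini, alongside the other limits shown later in this thread, and is given as a percentage of total physical memory to grab at service startup. A minimal sketch of what enabling it might look like (40 is an arbitrary illustrative value, not a recommendation):

    <Memory>
      <PreAllocate>40</PreAllocate> <!-- illustrative: reserve roughly 40% of physical RAM at startup -->
    </Memory>

    The change requires an instance restart, and the service account generally needs the "Lock pages in memory" privilege for the preallocated memory to stay pinned.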


All replies

  • wow - 400 GB of memory is a lot /jealous

    what are the SSAS instance-level memory configuration values set at?  


    BI Developer and lover of data (Blog | Twitter)

    Friday, August 16, 2013 5:45 PM
  • Here are the entries from msmdsrv.ini ... 

    <Memory>
      <MemoryHeapType>2</MemoryHeapType>
      <HeapTypeForObjects>0</HeapTypeForObjects>
      <HardMemoryLimit>0</HardMemoryLimit>
      <TotalMemoryLimit>80</TotalMemoryLimit>
      <LowMemoryLimit>65</LowMemoryLimit>
      <MidMemoryPrice>10</MidMemoryPrice>
      <HighMemoryPrice>1000</HighMemoryPrice>
      <VirtualMemoryLimit>80</VirtualMemoryLimit>
      <SessionMemoryLimit>50</SessionMemoryLimit>
      <MinimumAllocatedMemory>25</MinimumAllocatedMemory>
      <WaitCountIfHighMemory>10</WaitCountIfHighMemory>
      <DefaultPagesCountToReuse>2</DefaultPagesCountToReuse>
      <HandleIA64AlignmentFaults>0</HandleIA64AlignmentFaults>
      <PreAllocate>0</PreAllocate>
      <PagePoolRestrictNumaNode>0</PagePoolRestrictNumaNode>
    </Memory>

    Friday, August 16, 2013 5:50 PM
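    For reference, since all of these values are 100 or less, they resolve as percentages of total physical memory. On a 400 GB box that works out roughly as: LowMemoryLimit 65 → the cleaner starts trimming caches around 260 GB; TotalMemoryLimit 80 → ~320 GB; and HardMemoryLimit 0 → the documented default of midway between TotalMemoryLimit and physical memory, so ~360 GB. Usage of 20 GB is nowhere near any of those thresholds, so none of the limits are in play here.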
  • looks like default settings...

    how big is your SSAS database(s)?

    also, check out this blog post.


    BI Developer and lover of data (Blog | Twitter)

    • Proposed as answer by Darren Gosbell (MVP) Saturday, August 17, 2013 11:29 AM
    • Marked as answer by Elvis Long Monday, August 26, 2013 4:40 AM
    Friday, August 16, 2013 6:14 PM
  • Primary cube is 25 GB in size ... we have about 200 users.

    Friday, August 16, 2013 6:18 PM
  • are there any other SSAS databases on there, or just the one database with one 25 GB cube?

    if you're getting 20 GB of a 25 GB cube in memory (give or take connection/query-processing overhead), I think you're probably doing alright.  Check out that blog post I linked above.



    BI Developer and lover of data (Blog | Twitter)

    Friday, August 16, 2013 6:22 PM
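    One way to see where that ~20 GB is actually going: SSAS 2008 exposes dynamic management views that can be run from an MDX query window in SSMS. A sketch using the per-object memory rowset (columns include the object path plus shrinkable and non-shrinkable memory):

    SELECT * FROM $SYSTEM.DISCOVER_OBJECT_MEMORY_USAGE

    This breaks the instance's memory down by object, making it easy to see how much belongs to the cube's caches versus sessions and other overhead.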
  • thank you kindly ... it makes a certain sense, though I was under the impression that with 200 users requesting different result sets, the memory usage would be rather larger than the cube size.

    Friday, August 16, 2013 6:50 PM
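    On that last point: the caches do grow with query variety, but 200 users scanning the same 25 GB of data largely share the same cached results, and the cleaner starts evicting once usage crosses LowMemoryLimit anyway. SSAS also reads its data files through the Windows system file cache, so some cube data may be cached outside the MSMDSRV process entirely. To watch usage against the configured limits, the instance's Performance Monitor counters (shown here for a default instance; named instances appear under MSOLAP$InstanceName) are:

    \MSAS 2008:Memory\Memory Usage KB
    \MSAS 2008:Memory\Memory Limit Low KB
    \MSAS 2008:Memory\Memory Limit High KB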