Processing measure group: memory error: "The operation cannot be completed because the memory quota estimate exceeds the available system memory"

    Question

  • Hi,

    I'm stuck with this problem. Until last week, the cube processed without any problem; since then, I have been getting this error. I have searched different forums and tried some suggestions, like changing memory limit properties, but it only got worse, so I reset all properties to their defaults.

    I am running SQL Server + MS-AS 2005 SP2 on a server with 4 GB of memory. This is a dedicated server; nothing else is running on it.

    The fact table has about 14 million records, several dimensions, and 2 measure groups. I have no problem processing the dimensions, but when I try to process the cube, or its measure groups separately, the error persists. I have changed the data source view and replaced the fact table with a named query. Even with a 'WHERE datepart( year , fact_date ) >= 2009' clause to reduce the record count to about 5 million, I'm still getting the error.

    I don't understand what is wrong; the cube has processed fine for about 2 years.

    As I said, I have found many issues like this reported on different websites, and I have tried changing some properties, but this still does not solve the problem.

    Could the MS-AS settings be corrupt somewhere? Would it be a good idea to re-install MS-AS 2005 + SP1 + SP2? Or could there be another cause?

    I would really appreciate any kind of help, because I have been stuck on this problem for 5 days...

    Thanks a Lot

    Frank

    Tuesday, August 31, 2010 3:12 PM

Answers

  • Hi Frank,

     

    I just checked a box with AS 2005, and msmdsrv.ini is available at <drive>\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Config; the .ini file does have the Preallocate property.

    Are you also considering the Performance counters mentioned earlier? They have helped me in the past, instead of making changes to Preallocate:

    MSAS 2005:Memory / Memory Usage KB

    Memory \ Available KBytes

    MSAS 2005:Memory / Memory Limit High KB

    MSAS 2005:Memory / Memory Limit Low KB

    Memory \ Cache Bytes

    Memory \ Cache Bytes Peak 

     

    Regards 'Dotun

    Wednesday, September 1, 2010 12:55 PM

All replies

  • Hi Frank,

     

    It seems you may need to focus on the Preallocate setting, based on the error message. I found this blog that might be helpful: http://blogs.msdn.com/b/karang/archive/2009/11/01/analysis-services-preallocate-memory-setting-insight.aspx

    Do you get the same error if you reduce the fact table to around 10,000 records, using another table, just to test whether the error has been caused by the growth in fact records over the years?

    Regards, 'Dotun.

    Tuesday, August 31, 2010 4:19 PM
  • Hello Dotun

    Thanks for your reply.

    Where do I set the Preallocate memory property ?

    Thanks

    Frank

     

    Wednesday, September 1, 2010 7:13 AM
  • Hi Frank,

    You need to go to the msmdsrv.ini file located at <drive>\SQL Server\MSAS10.MSSQLSERVER\OLAP\Config to edit the Preallocate property.

     

    <Memory>

    <MemoryHeapType>1</MemoryHeapType>

    <HeapTypeForObjects>1</HeapTypeForObjects>

    <HardMemoryLimit>0</HardMemoryLimit>

    <TotalMemoryLimit>80</TotalMemoryLimit>

    <LowMemoryLimit>65</LowMemoryLimit>

    <MidMemoryPrice>10</MidMemoryPrice>

    <HighMemoryPrice>1000</HighMemoryPrice>

    <VirtualMemoryLimit>80</VirtualMemoryLimit>

    <SessionMemoryLimit>50</SessionMemoryLimit>

    <MinimumAllocatedMemory>25</MinimumAllocatedMemory>

    <WaitCountIfHighMemory>10</WaitCountIfHighMemory>

    <DefaultPagesCountToReuse>2</DefaultPagesCountToReuse>

    <HandleIA64AlignmentFaults>0</HandleIA64AlignmentFaults>

    <PreAllocate>0</PreAllocate>

    <PagePoolRestrictNumaNode>0</PagePoolRestrictNumaNode>

    </Memory>

     

    Please ensure you take a backup of your msmdsrv.ini file before you make changes.
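    The backup step above can be scripted. Below is a minimal sketch (not from the thread) that copies an .ini file to a sibling .ini.bak before you edit it; the path is passed in as an argument, since the Config folder differs between AS versions and instances.

```python
# Minimal sketch: back up msmdsrv.ini before editing it.
# The caller supplies the path, e.g.
#   C:\Program Files\Microsoft SQL Server\MSSQL.2\OLAP\Config\msmdsrv.ini
import shutil
from pathlib import Path

def backup_config(ini_path):
    """Copy an .ini file to a sibling .ini.bak and return the backup path."""
    src = Path(ini_path)
    dst = src.parent / (src.name + ".bak")  # msmdsrv.ini -> msmdsrv.ini.bak
    shutil.copy2(src, dst)                  # copy2 preserves timestamps
    return dst
```

    If an edit goes wrong, restoring is just copying the .bak file back over the original and restarting the Analysis Services service.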

     

    Also, it might be worthwhile to analyse the following Performance counters, if you haven't already, to gain more insight into the status/performance of your SSAS server.

    MSAS 2008:Memory / Memory Usage KB

    Memory \ Available KBytes

    MSAS 2008:Memory / Memory Limit High KB

    MSAS 2008:Memory / Memory Limit Low KB

    Memory \ Cache Bytes

    Memory \ Cache Bytes Peak 
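    One way to log the counters above is with the built-in typeperf tool. The sketch below only builds the command string (run it on the Windows server); the "MSAS 2005:Memory" object name is an assumption for a default AS 2005 instance -- on AS 2008 the object is "MSAS 2008:Memory", so adjust to your version.

```python
# Sketch: build a typeperf command line that samples the SSAS and OS memory
# counters to a CSV file every 15 seconds. Counter paths use the standard
# "\Object\Counter" syntax; the MSAS object name depends on the AS version.
COUNTERS = [
    r"\MSAS 2005:Memory\Memory Usage KB",
    r"\MSAS 2005:Memory\Memory Limit High KB",
    r"\MSAS 2005:Memory\Memory Limit Low KB",
    r"\Memory\Available KBytes",
    r"\Memory\Cache Bytes",
    r"\Memory\Cache Bytes Peak",
]

def typeperf_command(counters, interval_sec=15, out_csv="ssas_memory.csv"):
    """Return a typeperf command line that logs the counters to a CSV file."""
    quoted = " ".join('"%s"' % c for c in counters)
    return f"typeperf {quoted} -si {interval_sec} -o {out_csv}"
```

    Start the log before kicking off processing, and compare Memory Usage KB against the Memory Limit High/Low KB values while the measure group processes.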

     

    Regards, 'Dotun

    Wednesday, September 1, 2010 9:14 AM
  • Hi Dotun,

    Thanks again for your help, I appreciate it.

    Maybe I was not clear in my question, but I'm running AS 2005. I cannot find the pre-allocation property, nor do I have an msmdsrv.ini file on the system.

    Regards

    Frank

     

    Wednesday, September 1, 2010 12:39 PM
  • Hi Dotun

    Sorry for my late answer, but sometimes I have to test things during 'down time' :)

    OK, I rebooted the server: no solution.

    Then I changed the PreAllocate property to 2 GB: no solution.

    Changed it back to 0.

    Finally, I re-indexed the data source (my data warehouse, SQL Server, via a maintenance task) and launched a full process of the entire MS-AS database (limited to records > 2008): unbelievable, but this succeeded without a memory error.

    Now I will gradually reprocess with more years of data and see how far I get...

    I think the memory issue is unpredictable...

    Thanks anyway

    Frank

    Wednesday, September 1, 2010 6:20 PM
  • Hi All

    Processing worked for a few days, but then the problems started again (memory errors, processing stuck in an endless loop, ...). After many days of trial and error, I decided to import the complete database into a new Visual Studio BI project, so I had the complete structure in a new project. After backing up, I deleted the MS-AS database and deployed the new project to the server (same database name). Then I started to process dimensions and cubes, and everything worked fine. For a few days now, processing (run from SQL Agent) has been working fine.

    I hope it will stay this way...

    Next time I have problems with processing, I will create a new project again, delete the database, and redeploy.

    Of course, I understand this is not always the right way, but what else could I do? I have lost DAYS trying everything I could... nothing helped.

    thanks

    frank

    Wednesday, September 22, 2010 6:34 PM
    This error generally occurs not because you have too many rows of data in the fact table, but because the dimensions are too large. AS has to pull the hash tables for the granularity attributes into memory, and when they use up a lot of memory, it ends up running low on memory for the processing buffers.

    Another possibility is that the aggregations are taking up a lot of memory. You could try running ProcessData followed by ProcessIndexes, which might help reduce the memory requirements...
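    The two-step processing suggested above can be issued as two XMLA Process commands: ProcessData loads the fact data without building aggregations, and ProcessIndexes then builds the aggregations and indexes. The sketch below generates the two commands; the database and cube IDs are placeholders, and XMLA like this can be run from a SQL Agent Analysis Services Command step.

```python
# Sketch: generate the ProcessData and ProcessIndexes XMLA commands for a
# cube, in the order they should be run. IDs are placeholders to fill in.
XMLA_TEMPLATE = """\
<Process xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <DatabaseID>{db}</DatabaseID>
    <CubeID>{cube}</CubeID>
  </Object>
  <Type>{process_type}</Type>
</Process>"""

def two_step_process(db, cube):
    """Return the ProcessData and ProcessIndexes commands, in run order."""
    return [XMLA_TEMPLATE.format(db=db, cube=cube, process_type=t)
            for t in ("ProcessData", "ProcessIndexes")]
```

    Splitting the work this way means the data load and the aggregation build each get the processing buffers to themselves, instead of competing for memory in a single ProcessFull.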

    HTH


    -- This posting is provided "AS IS" with no warranties, and confers no rights
    Wednesday, September 22, 2010 7:51 PM