way to get beyond 4GB string store limit?

    Question

  • I swear I heard some speaker at the BI conference (Cristian? I can't remember) say that there was some connection string setting or something (maybe an msmdsrv.ini setting) that would allow you to get beyond the 4GB limit on string stores. He said to ask about it in the forums. Did I hear incorrectly? Or is there such a setting?
    Monday, May 21, 2007 10:45 PM
    Moderator

All replies

  • There are two ways a user can run into the 4GB string store limit:

    1) exceeding the limit in one of the temporary string stores created during processing;

    2) exceeding the limit in one of the final string stores, such as the attribute key/name string stores.

    In the first case, there is a server configuration property, PropertyBufferRecordLimitEnabled, inside msmdsrv.ini which, when set to 1, can reduce the maximum number of records inside a temporary processing buffer, indirectly limiting the number of strings added to the temporary string store associated with that buffer. That may help when an attribute has many directly related attributes with string keys. But if your problem falls into the second category, the configuration property doesn't help, and I am not aware of any other configuration property that can address the 4GB limit of a single string store.
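    For reference, the entry in msmdsrv.ini is an XML element. A minimal sketch of the relevant fragment (the element name comes from the answer above; its exact position in the file's hierarchy is an assumption here and may differ by version, so search the file for the element name):

        <ConfigurationSettings>
          <OLAP>
            <Process>
              <!-- 1 = cap the record count of temporary processing buffers, indirectly
                   limiting the strings added to each temporary string store.
                   Location under OLAP/Process is assumed, not confirmed. -->
              <PropertyBufferRecordLimitEnabled>1</PropertyBufferRecordLimitEnabled>
            </Process>
          </OLAP>
        </ConfigurationSettings>

    The service reads msmdsrv.ini at startup, so a restart of Analysis Services is needed for the change to take effect.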
    Tuesday, May 22, 2007 1:54 AM
  • I recall asking this question at the BI Conference (not sure if this was at the same session as where Greg heard it). The scenario which was described in a number of posts (can't find any of them right now!) was of a large MOLAP fact dimension, created to support drillthrough. So would this translate to limits on the number of rows in a MOLAP fact dimension, beyond which one would have to resort to ROLAP?
    Tuesday, May 22, 2007 4:27 AM
    Moderator
  • Jeffrey-

     

    Concerning the 4GB size of the final string store, which file extensions are the ones we should be worrying about? Just .ksstore, .asstore, .bsstore, and .string.data?

    Wednesday, July 25, 2007 4:23 PM
    Moderator
  • Yes, those files are subject to the 4GB limit.

     

    Jeffrey

    Thursday, August 16, 2007 10:44 PM
  • Page 341 of the Melomed/Gorbach/Berger/Bateman book says that the limit is 8GB. I checked with the authors and they confirmed that 8GB was the limit (which, with each character being two-byte Unicode, means 4 billion characters).

     

    Just wanted to confirm with you, Jeffrey, that the above statement matches your understanding.

     

    Monday, February 11, 2008 10:35 PM
    Moderator
  • My two cents on the 4 GB string limit:

    For a large dimension, we can move some string attributes into separate named queries and create 1:1 relationships with the main dimension, so that SSAS creates several separate string store files. And if a single attribute by itself exceeds 4 GB, here's a crazy idea: use a CLR function to compress the string column, and create a named calculation in the dimension to decompress the data. :)
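    A minimal sketch of such a SQL CLR pair (the function names and the choice of GZip are illustrative, not from the original suggestion):

        using System.Data.SqlTypes;
        using System.IO;
        using System.IO.Compression;
        using System.Text;
        using Microsoft.SqlServer.Server;

        public static class StringCompression
        {
            // Compress an NVARCHAR value into VARBINARY for storage in the dimension table.
            [SqlFunction(IsDeterministic = true, IsPrecise = true)]
            public static SqlBytes CompressString(SqlString input)
            {
                if (input.IsNull) return SqlBytes.Null;
                byte[] raw = Encoding.Unicode.GetBytes(input.Value);
                using (MemoryStream ms = new MemoryStream())
                {
                    using (GZipStream gz = new GZipStream(ms, CompressionMode.Compress))
                        gz.Write(raw, 0, raw.Length);
                    return new SqlBytes(ms.ToArray());
                }
            }

            // Inverse function, callable from the named calculation on the dimension.
            [SqlFunction(IsDeterministic = true, IsPrecise = true)]
            public static SqlString DecompressString(SqlBytes input)
            {
                if (input.IsNull) return SqlString.Null;
                using (GZipStream gz = new GZipStream(input.Stream, CompressionMode.Decompress))
                using (MemoryStream outMs = new MemoryStream())
                {
                    byte[] buffer = new byte[4096];
                    int read;
                    while ((read = gz.Read(buffer, 0, buffer.Length)) > 0)
                        outMs.Write(buffer, 0, read);
                    return new SqlString(Encoding.Unicode.GetString(outMs.ToArray()));
                }
            }
        }

    The named calculation in the DSV would then call dbo.DecompressString(CompressedColumn), so the decompressed text is what users see.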

    Thursday, February 14, 2008 12:27 PM
  • Is there a way to check the current size of the string store files via T-SQL or SSIS?

    I would like to set up a job that checks the sizes of these files, telling me when to do a full process on my cube before it reaches the 4 GB limit.
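    I imagine a small piece of C#, for example in an SSIS Script Task or a scheduled console app, could scan the SSAS data folder directly. Something like this sketch, assuming a default SSAS 2008 data directory (adjust the path and threshold for your instance):

        using System;
        using System.IO;

        class StringStoreCheck
        {
            static void Main()
            {
                // Assumed default data directory; check your instance's DataDir setting.
                string dataDir = @"C:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Data";
                string[] patterns = { "*.asstore", "*.ksstore", "*.bsstore", "*.string.data" };
                const long warnBytes = 3L * 1024 * 1024 * 1024; // warn at 3 GB, ahead of the 4 GB limit

                foreach (string pattern in patterns)
                    foreach (string file in Directory.GetFiles(dataDir, pattern, SearchOption.AllDirectories))
                    {
                        long size = new FileInfo(file).Length;
                        if (size >= warnBytes)
                            Console.WriteLine("{0:N0} bytes  {1}", size, file);
                    }
            }
        }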

     

    Friday, November 21, 2008 2:15 PM
  • I believe the info above about the limit being 8GB is incorrect. I have it on good authority from Akshai Mirchandani of the product team that the limit is 4GB (which means 2 billion Unicode characters). If anybody has tested this and can confirm the real limit of the file size on disk, please post back.
    http://artisconsulting.com/Blog/GregGalloway
    Wednesday, April 21, 2010 5:20 PM
    Moderator
  • Hi,

    I have a cube dimension with 29 million members and one column with a varchar(900) datatype. I'm using SSAS 2008 on a 64-bit server with 32GB of memory, and this dimension has started hitting the 4GB string store limit on that column. So I can confirm that it's a real limit on the .asstore, .ksstore, etc. files. I watched these files grow from 0 to 3.97GB, after which processing failed with the same old "File system error: A FileStore error from WriteFile occurred" error.

    I tried:

    - Changing PropertyBufferRecordLimitEnabled to 1, but it didn't work, since my case falls under the second category in Jeffrey's answer.

    - Changing the default ByAttribute processing option to ByTable, but it didn't work, since the 4GB limit is being hit by one attribute alone.

    - A ROLAP dimension is not a solution for me. I have SP1 for SQL Server 2008, which I assume would include any patch from Microsoft if they had fixed this after SQL 2005.

    Does anybody know a solution to this? I know it's a very old topic, but I'm not sure why this 4GB limit exists.

    Any help appreciated,

    Amey

     

     

    Thursday, April 29, 2010 5:53 PM
  • The only solution I know of is to convert the dimension to ROLAP. This is a very frustrating issue with very large dimensions in SSAS.
    Friday, July 16, 2010 5:47 PM
  • Is this still an issue? I'm using SQL Server 2008 R2 and am getting the same type of error. Has a workaround ever been discovered?
    Tuesday, February 08, 2011 2:59 PM
  • I have been told that this issue is going to be fixed in the next release of SQL Server. It really is frustrating, because I always have to turn off drillthrough actions when I use ROLAP dimensions.
    Tuesday, February 08, 2011 3:10 PM
  • ROLAP dimensions are not an option. We have data that gets changed throughout the day and the cube needs to be reprocessed every night, making sure to check foreign key attributes that may have changed from the day before.

    The only answer I can come up with is breaking this one very large cube into smaller units based on our clients (which would be hundreds of little cubes). Is that feasible?

    Tuesday, February 08, 2011 4:13 PM
  • Hi All,

    Is there any way to resolve this limitation in SQL Server 2011 "Denali" CTP2?

    Regards,

    Gopal

     


    Tuesday, July 12, 2011 7:45 AM
  • Gopal-

    If you mean CTP3 of Denali (the forthcoming version of SQL Server), which was just publicly released, then the answer is yes. You have to set a property on the dimension or on the distinct count measure group to enable it, though:
    http://technet.microsoft.com/en-us/library/gg471589(SQL.110).aspx
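    For reference, the property is StringStoresCompatibilityLevel, available on dimensions and on partitions of distinct count measure groups. In the object's ASSL definition the fragment looks roughly like this (a sketch; 1050 is the classic 4GB-limited format, 1100 the new larger format):

        <Dimension>
          ...
          <StringStoresCompatibilityLevel>1100</StringStoresCompatibilityLevel>
        </Dimension>

    Changing the value requires reprocessing the affected objects before the larger stores are used.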


    http://artisconsulting.com/Blogs/GregGalloway
    Friday, July 22, 2011 11:41 AM
    Moderator
  • Hello,

    This issue, with the corresponding error [File system error: While attempting to read information from disk, a read error occurred for physical file:......], recently appeared during the processing of a large dimension after I changed the Memory\LowMemoryLimit and Memory\TotalMemoryLimit properties for SSAS to 25 and 30, respectively (the default values were 65 and 80). The change was necessary because of memory contention with the primary SQL Server 2008 R2 instance.

    Does anyone know if there is a correlation? If so, could someone please suggest an explanation and/or workaround?

    Many thanks,

    Ben Lezin

    Tuesday, February 21, 2012 6:18 PM
  • If the attribute in question is just the integer surrogate key of a large dimension, and users don't actually need to view that column in your cube, another workaround (in addition to upgrading to AS2012 as mentioned above) is described here:
    http://social.msdn.microsoft.com/Forums/en-US/sqlanalysisservices/thread/cea941d6-74c0-4538-9463-d2fd9e168e02

    http://artisconsulting.com/Blogs/GregGalloway

    Monday, July 23, 2012 6:24 PM
    Moderator