Denali SSAS StringStoresCompatibilityLevel 1100 is not working in Denali CTP3

    Question


  • Hi Folks,

    Denali's StringStoresCompatibilityLevel=1100 is not able to get past the 4 GB string store limit.
    My data is 160 million rows with 30 attributes.
    Are there any other settings required to make this new compatibility mode work?
    Or is this functionality not present in Denali CTP3?
    Or is there any hotfix available to get past the 4 GB string store limit in SQL Server 2008 R2?

    Base thread for your reference: http://social.technet.microsoft.com/Forums/en-US/sqlanalysisservices/thread/42d1c7cd-8d22-4850-8716-728956460105

    Kindly suggest…


    Thanks in advance,
    Gopal


    Business intelligence
    Tuesday, October 11, 2011 5:43 AM

Answers

  • Hi Karan,

    Thanks for the reply. Yes, Haidong shared the root cause of this issue, and it will be fixed via a hotfix for 2008 R2.


    The fact.data logical file corruption issue is not related to StringStoresCompatibilityLevel 1100.

    Thanks,

    Gopal


    Business intelligence
    Thursday, October 20, 2011 10:39 AM

All replies

  • Did you also change the compatibility level of your database? See: http://raphaelmsbithoughts.wordpress.com/2011/07/22/denali-ctp3-analysis-services-new-feature-scalable-string-storage/
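    For reference, a minimal XMLA sketch of that change (a sketch only: the database ID/name are placeholders, and it assumes the usual Alter command with the ddl200 namespace declared):

    <Alter AllowCreate="true" ObjectExpansion="ObjectProperties"
           xmlns="http://schemas.microsoft.com/analysisservices/2003/engine"
           xmlns:ddl200="http://schemas.microsoft.com/analysisservices/2010/engine/200">
      <Object>
        <DatabaseID>TestDB</DatabaseID>
      </Object>
      <ObjectDefinition>
        <Database>
          <ID>TestDB</ID>
          <Name>TestDB</Name>
          <!-- 1100 enables the Denali scalable string stores at the database level -->
          <ddl200:CompatibilityLevel>1100</ddl200:CompatibilityLevel>
        </Database>
      </ObjectDefinition>
    </Alter>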
    http://darren.gosbell.com - please mark correct answers
    Wednesday, October 12, 2011 5:31 AM
    Moderator
  • Thanks for the reply, Darren.

    In this case I had processed a single dimension, without any hierarchies, facts, or measure groups.

    The compatibility level is already set at the dimension level as well as the database level.

    Here is the script of my database with compatibility set to 1100 for your perusal; am I missing something?

    This project is very critical and has excellent scope, but we are blocked by this limitation and cannot even break down the source data, as the users require enterprise-level reporting.

    Any help much appreciated...

    <Dimensions>
      <Dimension>
        <!-- Per-dimension setting (ddl300 namespace, new in Denali) -->
        <ddl300:StringStoresCompatibilityLevel>1100</ddl300:StringStoresCompatibilityLevel>
      </Dimension>
    </Dimensions>

    <Database>
      <!-- Database-level compatibility setting (ddl200 namespace) -->
      <ddl200:CompatibilityLevel>1100</ddl200:CompatibilityLevel>
    </Database>

    Thanks and Regards,

    Gopal

    Wednesday, October 12, 2011 6:32 AM
  • Can you post the error message that you are getting?
    http://darren.gosbell.com - please mark correct answers
    Wednesday, October 12, 2011 6:56 AM
    Moderator
  • Error Message:

     File system error: The following file is corrupted: Physical file: \\?\C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Data\TestDB.0.db\TestDim.0.dim\3.ORDER ID.fact.data. Logical file.

    I got the same error when processing this dimension using SQL Server 2008 R2 SP1 as well.

    Thanks,

    Gopal


    Business intelligence
    Wednesday, October 12, 2011 7:18 AM
  •  > I got the same error when processing this dimension using SQL Server 2008 R2 SP1 as well.

    Then this error may not be related to the compatibility setting.

    The .data file is a partition file, so it's hard to be sure if it's directly related to this setting. It would be worth logging this issue and the steps to reproduce it at http://connect.microsoft.com/sql


    http://darren.gosbell.com - please mark correct answers

    Thursday, October 13, 2011 5:38 AM
    Moderator
  • Hi Gopal,

    Did you try to clean up the data directory of TestDB.0.db and redeploy the solution?

    thanks,
    Jerry

    Friday, October 14, 2011 7:53 AM
    Moderator
  • Yes, I have done that several times, but it didn't help.
    Business intelligence
    Friday, October 14, 2011 9:38 AM
  • Hi Gopal!

    Reading your post carefully, I noticed something: in the base thread you mention that Denali is a no-go for your customer since it is not RTM. So are you trying to set a compatibility level of 1100 on a SQL Server 2008 R2 instance?

    If this is the case, there is no chance that it will work.

    If not, can you specify the operating system version and the file system type (NTFS, etc.) you're using on your server?

    Regards,

    Raphael

    Monday, October 17, 2011 10:16 AM
  • Thanks Raphael,

    I am testing compatibility level 1100 on Denali only, not on SQL Server 2008 R2. I was just asking whether it is possible to bring the 1100 compatibility mode to 2008 R2 as a hotfix. I know that would be difficult, as it would require a lot of code to be moved from Denali to 2008 R2.

    My machine OS details for your perusal.

    OS: 64 bit Windows 2008 R2 Enterprise

    File system: NTFS

    RAM: 16 GB

    HD: 465 GB, with 192 GB free; the dimension's relational DB is 94 GB

    Thanks for help!

    Gopal


    Business intelligence
    Monday, October 17, 2011 1:05 PM
  • Hi Gopal,

    I doubt that you will ever be able to bring the new scalable string storage feature back to a previous version of SQL Server...

    Even if you open an MS case, they will surely forward you to your local retailer to buy a SQL Server 2012 licence ^^

    You mentioned that Denali is not an option; did you know it was announced last week that it will go RTM by H1 2012?

    I asked for the server configuration to see whether any file size limitation other than the SSAS one might apply, but it seems that's not the case.

    Monday, October 17, 2011 1:35 PM
  • Thanks for the update on the Denali RTM release date. We will split the data into smaller chunks to build smaller cubes until we get the Denali RTM, but I am not sure how well that works for enterprise analysis.

    Note: Is there any way to create virtual cubes in SSAS like the Cognos/SAP virtual cubes? Those can merge child cubes into master cubes with the same granularity of data. I was exploring the measure group concepts in SSAS, but they don't cover the virtual cube scenario.

    I heard that there was a virtual cube concept in SSAS 2000; could you please share some info on it, and on why Microsoft eliminated that concept in the subsequent releases (2005/2008, etc.)?

    Thanks and regards,

    Gopal


    Business intelligence
    Tuesday, October 18, 2011 1:32 PM
  • Hi Gopal,

    There was indeed a virtual cube concept in SSAS 2000. Back then, a cube could have only one fact table, and the virtual cube mechanism was there to let you "merge" data coming from several fact tables into one place, almost like a view in a relational database.

    Starting with SQL Server 2005, SSAS lets you handle several fact tables in one cube, so the virtual cube concept became unnecessary. Actually, the way it works now enables more functional scenarios than in the past, as you can, among other features, specify the relationships between the fact tables (regular, many-to-many, etc.) through the dimensions.

    However, if required, you can still get something like a virtual cube by linking measure groups from other cubes. Doing so, you do not duplicate the storage required for your data, but it comes with a performance cost (performance tests should be done here). I do not know how Cognos works, but I hope this helps you.
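    For illustration, a rough ASSL sketch of a linked measure group defined through a MeasureGroupBinding (the data source, cube, and measure group IDs here are hypothetical):

    <MeasureGroup>
      <ID>Linked Sales</ID>
      <Name>Linked Sales</Name>
      <!-- The binding points at a measure group that lives in another cube -->
      <Source xsi:type="MeasureGroupBinding">
        <DataSourceID>Source AS Database</DataSourceID>
        <CubeID>Sales Cube</CubeID>
        <MeasureGroupID>Sales</MeasureGroupID>
      </Source>
    </MeasureGroup>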

    Best regards,

    Raphael

    Wednesday, October 19, 2011 7:48 AM
  • I don't think it's related to the string store; something else is causing the issue. I think we have already involved the Product Team in this case.

    I think Haidong will have more updates for you.

    Karan

    Wednesday, October 19, 2011 10:30 AM

  • StringStoresCompatibilityLevel=1100 is working absolutely fine in the Denali CTP3 version.

    The issue that was blocking compatibility mode 1100 from working is the number of non-key attributes that depend directly on the key attribute.

    So reduce the number of non-key attributes that depend directly on the key attribute by maintaining the attribute relationships properly (see the sketch after the error message); that will help avoid the following error:

     File system error: The following file is corrupted: Physical file: \\?\C:\Program Files\Microsoft SQL Server\MSAS11.MSSQLSERVER\OLAP\Data\TestDB.0.db\TestDim.0.dim\3.ORDER ID.fact.data. Logical file. 
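    For illustration, a rough ASSL sketch of what maintaining the relationships properly means: chain attributes off each other instead of relating every attribute directly to the key. Order ID is taken from the error message above; City and State are hypothetical names:

    <!-- Key attribute: only City relates to it directly -->
    <Attribute>
      <ID>Order ID</ID>
      <Usage>Key</Usage>
      <AttributeRelationships>
        <AttributeRelationship>
          <AttributeID>City</AttributeID>
        </AttributeRelationship>
      </AttributeRelationships>
    </Attribute>

    <!-- City: State hangs off City rather than off the key -->
    <Attribute>
      <ID>City</ID>
      <AttributeRelationships>
        <AttributeRelationship>
          <AttributeID>State</AttributeID>
        </AttributeRelationship>
      </AttributeRelationships>
    </Attribute>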

    I successfully processed 172 million records with 48 attributes in a single dimension.

    SQL Server Denali is great…

    But I am waiting for the RTM version; if anybody has any idea about the Denali RTM release date, please share it with us.

    Note: So Microsoft needs to provide a fix for the limitation on the number of non-key attributes that depend directly on the key attribute.

    Thanks,

    Gopal


    Business intelligence

    Tuesday, December 06, 2011 12:12 PM
  • Gopal, that's great; thanks for sharing your feedback. I will try to repro the issue and see if we can provide a repro to the product team.

    Thanks again for your time and for sharing this... stay in touch.

    Hope you like the Denali Virtual Labs.

    Karan

    Tuesday, December 06, 2011 5:17 PM