ProcessAdd file system error

  • Question

  • Occasionally, when attempting to process a cube dimension on SSAS 2008 using ProcessAdd (the general shape of such a command is sketched after this post), the following error is returned:

    <Error ErrorCode="3238133838" Description="File system error: While attempting to read information from disk, a read error occurred for physical file: \\?\M:\MountPoints\CubeData\16\SSAS_TempDir\APIDW\MSMDCacheRowsetBadRows_1708_22838_xhw0o.tmp, logical file: ." Source="Microsoft SQL Server 2008 Analysis Services" HelpFile="" />

    This dimension is pretty big with around 10 million records. Any idea what could be causing this?
    We are running Windows Server 2003 datacenter edition.

    Wednesday, March 18, 2009 3:48 PM
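
    For reference, a dimension ProcessAdd command sent to SSAS has roughly the shape sketched below; the database and dimension IDs are placeholders, not the actual objects from this thread:

    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Process>
        <Object>
          <DatabaseID>MyDatabase</DatabaseID>
          <DimensionID>My Dimension</DimensionID>
        </Object>
        <Type>ProcessAdd</Type>
      </Process>
    </Batch>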

All replies

  • We are getting the same error since testing 2008 on our development server:
    File system error: While attempting to read information from disk, a read error occurred for physical file: \\?\D:\Microsoft SQL Server\MSAS10.MSSQLSERVER\OLAP\Temp\MSMDCacheRowset_928_1328_063fq.tmp

    The dimension is big, around 4 million entries; a Full Process works fine, but ProcessAdd only works every second time.

    We have SSAS 2008 Developer Edition (10.0.1787.0) on Windows Server 2008 Standard.
    WernerC
    Thursday, March 19, 2009 7:55 AM
  • Have you found any type of resolution for this issue?  I am currently experiencing this with a dimension that has around 15.5MM rows and it errors out when processing the Key column with:

    File system error: While attempting to read information from disk, a read error occurred for physical file: \\?\T:\Program Files\Microsoft SQL Server\MSAS10.MSSQLSERVER2\OLAP\Temp\MSMDCacheRowsetBadRows_6792_93_7jfq1.tmp, logical file: . Errors in the OLAP storage engine: An error occurred while the 'Dimension Key' attribute of the 'Dimension' dimension from the 'SSAS' database was being processed.

    I bumped up the ExternalCommandTimeout property for the SSAS instance to see if that would resolve it, but it still errors out at the exact same spot.  I have tried ProcessFull and that does not work.  As of right now we are unable to process the dimension at all.

    I was looking at this posting, but nothing in it seems related from what I can tell, and the dimension is not at the 4 GB limit for the attribute being processed.

    HELP! I cannot process a large dimension.

    We are using SQL Server 2008 Enterprise Edition on Windows Server 2008 and we are on CU3 with SSAS.

    Dan English's BI Blog
    _____________________________________________________
    Please mark posts as answer or helpful when they are.
    Wednesday, April 29, 2009 5:50 PM
  • We resolved the issue by setting the ProcessingGroup property of the dimension to ByTable instead of ByAttribute (a sketch of where this property sits in the dimension definition follows this post).  We have 32 GB of RAM on the server and are not running into any memory issues.  Once this setting was changed we were able to process the dimension without any issues, and processing time improved by about 20%.

    More information on this setting can be found here:


    Using ProcessingGroup Dimension property option ByTable vs. ByAttribute may error with string keys

    Using ByAttribute or ByTable Processing Group Property with Analysis Services 2005
    Dan English's BI Blog
    _____________________________________________________
    Please mark posts as answer or helpful when they are.
    Thursday, April 30, 2009 3:10 PM
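
    For reference, ProcessingGroup is a property of the Dimension object in the ASSL definition (it also shows up in the BIDS properties window for the dimension). A minimal sketch of the relevant fragment, with placeholder IDs, looks like this:

    <Dimension xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <ID>My Dimension</ID>
      <Name>My Dimension</Name>
      <ProcessingGroup>ByTable</ProcessingGroup>
      <!-- attributes, hierarchies, and the rest of the definition omitted -->
    </Dimension>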
  • Hi Dan, from what I've learned, the ByTable option consumes more memory than ByAttribute, so if you have a dimension that contains lots of string attributes, ByTable may not be a good option. You also said your server has 32 GB; with a large dimension, the ByTable option can take up a lot of memory, and 32 GB may well not be enough.

    And to angel_ny: did you specify the data source view when you ran ProcessAdd? If not, I suspect you just added all the members of that dimension again, so its size doubled too (a rough sketch of a ProcessAdd command with an out-of-line data source view follows this post).


    My Tech blog: http://www.imkevinyang.com/
    Tuesday, April 13, 2010 2:18 AM
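
    To make the point about the data source view concrete: when ProcessAdd is run against a dimension, the Process command can carry an out-of-line DataSourceView (or out-of-line bindings) whose named query returns only the newly added rows, so existing members are not read in again. Roughly, with placeholder IDs and an abbreviated schema section, the command looks like this:

    <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
      <Parallel>
        <Process>
          <Object>
            <DatabaseID>MyDatabase</DatabaseID>
            <DimensionID>My Dimension</DimensionID>
          </Object>
          <Type>ProcessAdd</Type>
          <!-- Out-of-line DSV: redefines the dimension's source table as a
               named query that returns only the rows added since the last process -->
          <DataSourceView>
            <ID>MyDSV</ID>
            <Name>MyDSV</Name>
            <DataSourceID>MyDataSource</DataSourceID>
            <Schema>
              <!-- xs:schema describing a named query such as
                   SELECT * FROM DimTable WHERE LoadDate > @LastProcessed -->
            </Schema>
          </DataSourceView>
        </Process>
      </Parallel>
    </Batch>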
  • Hi Angel-NY, were you able to solve this issue? Did you try changing the ProcessingGroup property, and did it work? Or did you find any other solution? I too have a big dimension, around 10 million records, but it's not hitting the 4 GB limit so far.

    Tuesday, September 21, 2010 6:29 PM
  • This was a bug. I opened a case with MS and got it resolved. This problem is fixed in CU8 for SQL Server 2008. As an interim solution I had to change the processing option to ProcessUpdate.

    Cheers!

    Nagy

    Tuesday, September 21, 2010 7:11 PM
  • Thanks Nagy. Actually, we just installed SQL Server 2008 with SP1 on our new servers, so I think the SP1 we installed already included CU8. Do you know a way to check which CU the SQL Server instance is running?
    Tuesday, September 21, 2010 7:51 PM
  • Amey,

    CU8 was released after SP1; I am sure about that. You can check the Analysis Services version from SSMS (an XMLA alternative is sketched after this post).  Currently we are running CU9, and the version is 10.0.2789.0.

    Cheers,

    Nagy

    Tuesday, September 21, 2010 8:15 PM
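
    If you prefer not to dig through the SSMS dialogs, the build number can also be retrieved with an XMLA Discover request (for example from an XMLA query window in SSMS); the DBMSVersion property returns the same 10.0.x build number Nagy quotes:

    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <RequestType>DISCOVER_PROPERTIES</RequestType>
      <Restrictions>
        <RestrictionList>
          <PropertyName>DBMSVersion</PropertyName>
        </RestrictionList>
      </Restrictions>
      <Properties>
        <PropertyList />
      </Properties>
    </Discover>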
  • Thanks Nagy. I'll check which CU we have on our servers and see if updating it makes a difference.
    Tuesday, September 21, 2010 10:19 PM
  • Thanks Nagy. It worked after installing CU8.

    Monday, September 27, 2010 8:52 PM