Updating the PowerPivot table partially

  • Question

  • Hi,

    Here is my problem. I have a PowerPivot table with 1,035,748 rows and 65 columns (including 7 calculated columns).

    When I try to update, I get the following message (translated from the original French):

    "Memory error: Allocation failure: Not enough storage is available to process this command.

    The operation has been cancelled.

    Out of line object 'DataSourceView', referring to ID(s) 'Temp_DSV', has been specified but has not been used."

    I tried to update on a more powerful computer (4 GB RAM, 64-bit Windows) and the result was still the same.

    My question is: can we update only part of the PowerPivot table? My database contains data for 2011, 2012, and so on, but when I refresh I don't really need to update all of it; updating 2012 alone would be enough for me.

    I think, if this is possible, importing additional data or updating part of the overall database will no longer be a problem.

    Thanks for your help,

    David

    Friday, November 23, 2012 10:41 AM

Answers

  • An easier way to get your table metadata: check out this blog post from Chris Webb.

    You should be able to run this query and get the column list, with data types and sizes, fairly easily.

    select * from $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
    order by dimension_name
    It's even easier to run from DAX Studio (get it here), which will add a sheet to your workbook with the results. 
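
    If the goal is to spot the columns using the most memory, a variant of the same DMV can sort by dictionary size (a sketch, assuming the standard DISCOVER_STORAGE_TABLE_COLUMNS schema; verify the column names against your version):

    select dimension_name, attribute_name, datatype, dictionary_size
    from $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS
    where dictionary_size > 0
    order by dictionary_size desc

    The biggest dictionaries usually belong to high-cardinality columns, which are the first candidates to drop from the import.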

    Brent Greenwood, MS, MCITP, CBIP
    // Please mark correct answers and helpful posts //
    http://brentgreenwood.blogspot.com



    Tuesday, November 27, 2012 4:00 PM
    Answerer

All replies

  • In PowerPivot, processing is equivalent to a truncate / reload, so there is no partial processing for now.  You could accomplish this with partitioning in SSAS Tabular if you have a SQL 2012 Analysis Services instance you could promote your PowerPivot model to.

    It's surprising that a million rows won't process, though.  Here are some other ideas to try to get it to work.  You can trim down your source query to make the dataset smaller for your machine to process.  Take a look at all of your columns and get rid of any non-essential ones.  Ideally, remove fields that don't have a lot of repeated values, if possible (i.e. columns that you won't use to slice by or calculate on).  Also, if you can push any of your calculated columns down to your source query instead of doing them in DAX, that would be another way to trim the data down and minimize the processing load; see the sketch below.
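
    For example, a minimal sketch of pushing a calculated column down into the source query (generic SQL; the table and column names are hypothetical placeholders, not from David's model):

    select
        OrderID,
        OrderDate,
        SalesAmount,
        CostAmount,
        SalesAmount - CostAmount as Margin   -- computed here instead of in DAX
    from Sales
    where Year(OrderDate) >= 2011            -- also trims years you don't need

    Here Margin replaces a DAX calculated column such as =[SalesAmount]-[CostAmount], so PowerPivot only stores the precomputed result instead of evaluating the expression during processing.  (Access SQL doesn't support inline comments, so drop them if you paste this into an Access query.)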

    Hope that helps. 


    Brent Greenwood, MS, MCITP, CBIP
    // Please mark correct answers and helpful posts //
    http://brentgreenwood.blogspot.com


    Friday, November 23, 2012 3:24 PM
    Answerer
  • Hi Brent,

    Thank you for your help. I did some research and found that, indeed, PowerPivot can't reload just part of a data source.

    I will check my database and see if I can delete some columns, but I don't think I can...

    I'm also very surprised that 1 million rows generates an error, because I have read web articles that talk about 150 million rows in PowerPivot!

    I have done updates every 10 days for a year (approximately 30,000 rows each time) and it worked until now; I don't understand why the last 30,000 rows out of 1 million fail to import.

    If you have any ideas to identify why we get this error, please help us.


    Thanks

    Friday, November 23, 2012 4:58 PM
  • In addition to trimming and condensing your source query, I'd suggest adding a "TOP 1000" to your source query as a test, just to make sure the real issue is the volume of data; see the sketch below.
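
    For example (the table name is a hypothetical placeholder; TOP works in both Access SQL and T-SQL):

    select top 1000 *
    from YourSourceTable

    If 1,000 rows import cleanly, raise the TOP value in steps to see where processing starts to fail.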

    Also, if you need help rewriting your calculated columns into your SQL source query, feel free to post them here and I'd be happy to help. 


    Brent Greenwood, MS, MCITP, CBIP
    // Please mark correct answers and helpful posts //
    http://brentgreenwood.blogspot.com


    Friday, November 23, 2012 6:53 PM
    Answerer
  • I tried to delete some columns and import (with calculation in manual mode), but the result remained the same.

    I also tried to update with mapping (to see the progress of the import) and saw that 1,040,000 rows were imported; the error message appears just before "data integration". It seems that the volume of rows is not the problem; it's the "integration" of the rows into PowerPivot.

    For information, my PowerPivot model is in an Excel workbook and the data source is an Access database.

    Monday, November 26, 2012 10:46 AM
  • Sounds like your hardware just can't handle the load, but maybe there's more to it.

    Can you share the following?
    - an image of the model (is it just the one table?)
    - data types, count, and cardinality of the columns (see the sketch after this list for a cardinality check)
    - the definitions of the calculated columns
    - any other moving parts in the model
    - the hardware specs you are trying to process on (32-bit? OS? RAM? CPUs?)
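
    For the cardinality, something like this against the source works (generic SQL with hypothetical table and column names; Access has no COUNT(DISTINCT), hence the subquery form):

    select count(*) as Cardinality
    from (select distinct CustomerID from Sales) as t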

    That info should provide something to work with.  If it's easier to share the workbook with a subset of the data on your SkyDrive, I'd be happy to take a look.

    Also, some of the column metadata above can be gotten from the DMVs.  Here's an XMLA script that provides a dump including column data types. Perhaps you could copy the column metadata here, or share the output via SkyDrive.  Just update <CATALOG_NAME> with your database name.

    <Discover xmlns="urn:schemas-microsoft-com:xml-analysis">
      <RequestType>DISCOVER_CSDL_METADATA</RequestType>
      <Restrictions>
        <RestrictionList>
          <CATALOG_NAME>YourDatabaseNameHERE</CATALOG_NAME>
        </RestrictionList>
      </Restrictions>
      <Properties>
        <PropertyList>
          <FORMAT>Tabular</FORMAT>
        </PropertyList>
      </Properties>
    </Discover>


    Brent Greenwood, MS, MCITP, CBIP
    // Please mark correct answers and helpful posts //
    http://brentgreenwood.blogspot.com



    Tuesday, November 27, 2012 2:24 PM
    Answerer
  • OK Brent, I'll look into this and come back with more details. Many thanks!

    Tuesday, November 27, 2012 4:13 PM
  • Hi David,

    Did you solve your issue? How?

    I have the same problem: importing approximately 3,000,000 rows gives the same error message.

    Any ideas?

    Thanks a lot

    Friday, March 4, 2016 2:18 PM