Data refresh in SharePoint fails when the workbook has too much data

  • Question

  • Hello, I am running SharePoint 2013 and I have a Power Pivot workbook in Excel 2013. If the workbook is around 70 MB, the automated data refresh in SharePoint runs without issues, but if the workbook is bigger than that (so far I have only tried pulling around 160 MB), I get

    Call to Excel Services returned an error

    This is what I see in the logs with respect to that workbook when it failed:

    Process: w3wp.exe (0x1590)

    Area: PowerPivot Service

    Category: Data refresh

    Level: High

    There are 3 messages with the same timestamp for these parameters:

    The following exception occured during datarefresh on xxx workbook

    EXCEPTION: System.InvalidOperationException: Call to Excel Services returned an error. ---> Microsoft.AnalysisServices.SPClient.Interfaces.ExcelServicesException: ECS failed with non-zero return status. First error is name='ExternalDataRefreshFailed'; message='We were unable to refresh one or more data connections in this workbook.  The following connections failed to refresh:    ThisWorkbookDataModel  '; severity='Error'     at Microsoft.AnalysisServices.SPClient.ExcelApi.ValidateStatus(Status[] status)     at Microsoft.AnalysisServices.SPClient.ExcelApi.Call[T](String fileUrl, ExcelServiceCall`1 serviceCall)     --- End of inner exception stack trace ---     at Microsoft.AnalysisServices.SPClient.ExcelApi.Call[T](String fileUrl, ExcelServiceCall`1 serviceCall)     at Microsoft.AnalysisSe... 

    ...rvices.SPClient.ExcelApi.Call(String fileUrl, ExcelServiceCall serviceCall, String methodName, Object[] parameters)     at Microsoft.AnalysisServices.SPAddin.DataRefresh.DataRefreshService.ProcessingJob(Object parameters) 

    Any idea what might be causing this?


    • Edited by kizofilax135 Monday, June 9, 2014 5:29 PM editing format
    Monday, June 9, 2014 5:28 PM


  • While I don't know exactly what that error message means, there are a few things you can try to fix an import problem on a model that was previously working fine.

    Workbooks can grow dramatically (sometimes 10x or more) beyond their stored size while being updated (processed), which can run into file-size limits or memory limits on the server.

    Some steps you can try if this is memory related:

    1. Get rid of columns you don't actually need in your analysis.  If something isn't directly involved in your calculations or reports, don't import it.  The fewer columns, the better.

    2. Don't import unnecessary high-cardinality columns.  These can be things like the primary key of a large table, or multiple columns that have unique values for each row.

    3. Reduce the number of calculated columns.  Complex calculated columns are one of the main culprits in eating up memory during processing.  You can either try to write a more efficient formula in your calculated column or, better yet, create the calculated column in your underlying data source so you can simply import the field without having Power Pivot try to calculate it.

    4. This last one can make a huge memory difference.  Round numeric values on import.  I had a model with about 4 million rows in the fact table.  One of the columns had numbers with sometimes up to 8 or 9 decimals.  I rounded it to 2 decimals in my import query and the total file size was less than 1/3 of the original.  Rounding makes a huge difference in Power Pivot compression, which is tied to my 2nd suggestion above.
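    To see why suggestions 2 and 4 matter, here is a small sketch (numbers are illustrative, not from this thread) of how rounding slashes the distinct-value count of a column. Power Pivot's columnar engine dictionary-encodes each column, so fewer distinct values means better compression:

    ```python
    import random

    random.seed(42)
    # Simulated fact-table column: 100,000 measurements with up to 9 decimals.
    raw = [round(random.uniform(0, 100), 9) for _ in range(100_000)]

    # With 9 decimals, almost every value is unique (high cardinality).
    distinct_raw = len(set(raw))

    # Rounding to 2 decimals collapses the column into at most 10,001
    # possible values (0.00 through 100.00), so the dictionary stays tiny.
    distinct_rounded = len(set(round(v, 2) for v in raw))

    print("distinct values at 9 decimals:", distinct_raw)
    print("distinct values at 2 decimals:", distinct_rounded)
    ```

    The same cardinality argument explains why skipping primary-key columns (suggestion 2) helps: a unique-per-row column cannot be dictionary-compressed at all.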

    Another issue might be a problem in your underlying data that doesn't show up with the smaller data set.  For example, if one of your fields is used in a relationship and a single row has a null value in that field, you will get an error.  I have also seen errors when Power Pivot expects one format in a field but a row comes in with another.  For example, if Power Pivot treats a field as numeric because the smaller data set only had numbers in it, and then you expand to a larger data set where a single row has text for some reason, that could be a problem.  One row out of tens of millions can cause an error in these situations.
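    A quick pre-import check along these lines can be sketched as follows (the column names and sample rows are hypothetical, just to illustrate the two failure modes above):

    ```python
    # Hypothetical pre-import validation: flag null relationship keys and
    # values whose type disagrees with the column's expected numeric type.
    rows = [
        {"CustomerID": 101,  "Amount": 12.5},
        {"CustomerID": None, "Amount": 3.0},    # null key -> relationship error
        {"CustomerID": 103,  "Amount": "N/A"},  # text in a numeric column
    ]

    # Rows whose relationship key is null.
    null_keys = [i for i, r in enumerate(rows) if r["CustomerID"] is None]

    # Rows whose Amount is not a number (a single one can break the refresh).
    bad_types = [i for i, r in enumerate(rows)
                 if not isinstance(r["Amount"], (int, float))]

    print("rows with null keys:", null_keys)
    print("rows with non-numeric Amount:", bad_types)
    ```

    Running a check like this against the full 160 MB source, rather than just the small sample, is how you catch the one bad row before Power Pivot does.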

    Thursday, June 12, 2014 10:24 PM