Pitfalls of a single complex PowerPivot model (no SSAS)

  • Question

  • We don't use SQL Server; we use PostgreSQL for our data warehouse and only use PowerPivot as the end-user interface. We have two primary concerns: 1) responsiveness of the model for end users -- it has to be usable -- and 2) BI maintenance headaches when you have multiple models needing to be updated and maintained.

    We've opted for an optimised single-model approach (currently four fact tables and about ten dimensions, but it's likely to grow a bit, so fairly complex). This holds all our measures (of which there are very many). Currently there are 10 sheets with one pivot per sheet. As you can imagine, this is very slow. So we've adopted a "shredding" methodology: we maintain this single model, but for distribution we shred it into workbooks of 1-2 closely related sheets each. This has improved performance significantly for end users and has solved our maintenance problem.
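
    For illustration only -- a minimal sketch of how that shredding step could be scripted, assuming Python with xlwings and a local Excel installation (so Excel itself does the copying and the embedded data model survives). The paths and sheet names below are placeholders, not real ones:

        import xlwings as xw

        # Hypothetical master workbook and the pivot sheets each shredded copy keeps.
        MASTER = r"C:\BI\master_model.xlsx"
        DISTRIBUTIONS = {
            r"C:\BI\dist\sales.xlsx":   ["Sales Pivot", "Sales Trend"],
            r"C:\BI\dist\finance.xlsx": ["Finance Pivot"],
        }

        app = xw.App(visible=False)
        app.display_alerts = False          # suppress Excel's "delete sheet?" prompt
        try:
            for target, keep_sheets in DISTRIBUTIONS.items():
                book = app.books.open(MASTER)
                # Drop every sheet that does not belong in this distribution copy.
                for sheet in list(book.sheets):
                    if sheet.name not in keep_sheets:
                        sheet.delete()
                book.save(target)
                book.close()
        finally:
            app.quit()

    Note that the shredded copies still carry the full data model; the end-user speed-up comes from each workbook holding only one or two pivots, which matches the approach described above.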

    The main problem now is that for us in BI, working with the master non-shredded file with all the pivots is slow and likely to become slower as time goes by. Any suggestions from others who are *NOT* using the full Microsoft stack (e.g. Analysis Services) on how to balance model maintainability with end-user usability?

    Thanks

    Thursday, June 18, 2015 6:25 AM

Answers

  • Your approach is the only way that I can think of to handle it. I think that's why Microsoft makes Power Pivot free with Excel, because they know that they can make money when people get into your situation.

    I'm curious: do you just deploy the Excel files to a shared Windows directory and have users consume them by opening Excel on their local computer?
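
    If it is a shared directory, a minimal sketch of that kind of deployment -- copying the shredded workbooks from a build folder onto the share -- could be as simple as the following Python; both folder paths are invented:

        import shutil
        from pathlib import Path

        # Hypothetical locations: where the shredded workbooks are built,
        # and the Windows share that users open them from.
        BUILD_DIR = Path(r"C:\BI\dist")
        SHARE_DIR = Path(r"\\fileserver\reports")

        for workbook in BUILD_DIR.glob("*.xlsx"):
            target = SHARE_DIR / workbook.name
            shutil.copy2(workbook, target)   # copies the file together with its timestamps
            print(f"deployed {workbook.name} -> {target}")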

    • Proposed as answer by Michael Amadi Sunday, June 21, 2015 5:52 PM
    • Marked as answer by Charlie Liao Monday, June 29, 2015 3:22 AM
    Friday, June 19, 2015 3:59 PM
  • Hi Joanna,

    I don't have practical experience with this approach, but how much of the standard performance optimization have you already done? e.g. http://tinylizard.com/power-pivot-performance-gotchas/

    It looks like turning off cross-slicer filtering could have a huge impact in your case.

    I'd also consider filtering the measure table during import for development purposes. Before you roll out a new version, you can remove the filter again.
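
    As a rough illustration of that development-time filter (the connection details and fact table below are made up): you can sanity-check the filter predicate against PostgreSQL first, then reuse the same WHERE clause in the Power Pivot import query and drop it again before release.

        import psycopg2

        # Hypothetical connection details and fact table -- adjust to your own warehouse.
        conn = psycopg2.connect(host="dw-host", dbname="warehouse",
                                user="bi_dev", password="secret")
        cur = conn.cursor()

        # Development-only filter: load just a recent slice of the fact table.
        dev_filter = "sale_date >= %s"
        cur.execute(f"SELECT count(*) FROM fact_sales WHERE {dev_filter}",
                    ("2015-01-01",))
        print("rows the filtered import would load:", cur.fetchone()[0])

        cur.close()
        conn.close()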


    Imke

    • Proposed as answer by Michael Amadi Sunday, June 21, 2015 5:52 PM
    • Marked as answer by Charlie Liao Monday, June 29, 2015 3:22 AM
    Sunday, June 21, 2015 5:24 AM
    Answerer
