Customer Debts and Collection Analysis in Tabular/DAX - a different approach to data analysis

  • Question

  • Hello friends,

    I FEEL that Tabular together with DAX can revolutionize our approach to debts & collection data analysis... So here is our issue, and I would be grateful for any thoughts/ideas:

    Currently, in order to analyse the changes in our customers' debts, we keep "snapshots" of all "Open Rows" (invoice rows that haven't been paid yet) in a Fact table. The table includes the fields:

    CustomerID, InvoiceID, Invoice_Date, Due_Date, Debt_Amount, Snapshot_Date, etc...

    When we compare the debt amounts for two Snapshot Dates, we see the change in debts...

    The downside of this approach is, of course, that we have to keep many snapshots of the SAME data - even if nothing has changed and a customer hasn't paid his debt...

    I think and "feel" that we could use a different approach:

    Instead of snapshots, we will work with the original table, which will have all the data (including "Closed Rows" - invoices that were paid) plus an Actual Payment Date (Status_Date) field:

    CustomerID, InvoiceID, Invoice_Date, Due_Date, Debt_Amount, Status (Open/Closed), Status_Date

    This way we are able to "Reverse Engineer" the history and see what the debt amount was ON A CERTAIN DATE - without having to store history snapshots... (and without having to calculate anything in the ETL flows)

    Until Tabular/DAX came out, we found it difficult to build our MDX/OLAP on this table (mainly because we couldn't decide how to connect the Time Dimension - there is no relevant FK in the Fact table), so we implemented the "Snapshots" approach (so that Snapshot_Date is connected to the Time Dimension). Now that there's DAX, I feel it can be done directly in Tabular, just using Measures.

    The challenge, I think, is to use a Time Dimension (Dates_Table) that is NOT related to ANY field in the Fact table, and to build a Measure that compares the Time Dimension's "current member" date in the pivot context with the Due_Date and Status_Date of "Closed" (paid) rows - if it falls between the two, it means that on THAT DATE the invoice was not yet paid (there was a debt). Similarly, "Open" rows would be checked against the Due_Date...
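    In DAX terms, I imagine the measure could look something like this (just a sketch - the table names 'Debts' and 'Dates' are placeholders for our actual model, and I haven't tested it):

    ```dax
    // Debt outstanding on the date selected from the (unrelated) Dates table.
    // A row counts as debt when the selected date is on/after its Due_Date and
    // either the row is still Open or it was only paid later (Status_Date).
    Debt On Date :=
    CALCULATE (
        SUM ( Debts[Debt_Amount] ),
        FILTER (
            Debts,
            Debts[Due_Date] <= MAX ( Dates[Date] )
                && ( Debts[Status] = "Open"
                     || Debts[Status_Date] > MAX ( Dates[Date] ) )
        )
    )
    ```

    Because Dates is deliberately unrelated to the Fact table, MAX ( Dates[Date] ) simply picks up the date selected in the pivot, while FILTER walks the invoice rows.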

    We haven't tried the new approach yet, and before we do, I would very much appreciate any thoughts or experience on this issue.

    Thank you in advance,

    Michael Shparber


    Thursday, May 9, 2013 2:15 PM


All replies

  • The approach is actually similar to a Slowly Changing Dimension Type 2, except that you are talking about facts.
    I blogged about how to handle those kinds of dimensions and facts here:

    VALID_FROM/VALID_TO would be your Due_Date/Status_Date


    - www.pmOne.com -

    • Marked as answer by M. Shparber Thursday, May 9, 2013 8:27 PM
    Thursday, May 9, 2013 4:00 PM
    WOW, Gerhard! This is so cool! Debt and Inventory actually have similar behavior ("balance sheet" behavior)... I couldn't wait for the workweek, so I tried your measure right away and it seems to be working!!! DAX is so powerful!

    Now I'll have to find out what the performance is like for a billion rows or so...

    I already have some further ideas for DAX measures for Debt Analytics. For any given date I would like to calculate three different measures: ChangeInDebtFromNewDebtors, ChangeDueToCollection, ChangeFromExistingDebtors - all changes vs. EndOfPreviousMonth, all in DAX. This would save tons of manual analytic work!

    Thanks a lot for your fast help!

    Michael Shparber
    Cellcom Israel
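    For example, ChangeDueToCollection might look something like this (just a sketch with placeholder table names 'Debts' and 'Dates', untested):

    ```dax
    // Reduction in debt since end of previous month caused by invoices
    // that were actually paid (closed) during the current period.
    ChangeDueToCollection :=
    CALCULATE (
        - SUM ( Debts[Debt_Amount] ),
        FILTER (
            Debts,
            Debts[Status] = "Closed"
                && Debts[Status_Date] > EOMONTH ( MAX ( Dates[Date] ), -1 )
                && Debts[Status_Date] <= MAX ( Dates[Date] )
        )
    )
    ```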


    Thursday, May 9, 2013 8:26 PM
  • Hi Gerhard,

    Just tried it on a sample of 35 million rows of data...

    The server was very angry at me...

    Actually, the calculation did work when I chose only a few dates, but if I choose even one month, memory goes to 100% (I have 120 GB).

    And 35 million is just a sample - the whole table is 1 billion rows.

    Could there be a more efficient calculation?
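    One direction we may try (a sketch with placeholder names 'Debts' and 'Dates', and assuming open rows have a blank Status_Date): instead of a row-by-row FILTER over the entire fact table, filter the two date columns individually, so the engine scans the distinct column values rather than every invoice row:

    ```dax
    // Same "debt on date" logic, expressed as column-level filters -
    // far fewer distinct dates to test than fact rows.
    Debt On Date (columns) :=
    CALCULATE (
        SUM ( Debts[Debt_Amount] ),
        FILTER ( VALUES ( Debts[Due_Date] ),
                 Debts[Due_Date] <= MAX ( Dates[Date] ) ),
        FILTER ( VALUES ( Debts[Status_Date] ),
                 ISBLANK ( Debts[Status_Date] )
                     || Debts[Status_Date] > MAX ( Dates[Date] ) )
    )
    ```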


    Sunday, May 12, 2013 1:53 AM
  • Hi,

    I've just opened a new thread about this calculation's efficiency


    I have a FEELING :) it could be done much more efficiently




    Sunday, May 12, 2013 5:00 AM