# Finding the average/min/max of measure of items grouped by date

### Question

• I have an issue with MDX that I really can't get my head around, even after looking at countless examples online.
What I need to do is get the average (and min/max, but I figure once I get one the others will be easy) quantity across all items on a given day.
I have two measures. One, "Quantity", holds the quantity of an item for a particular day and is a semi-additive measure (LastNonEmpty). The other, "LastQuantity", is a calculation that gets the last-ever quantity using the approach described here: http://cwebbbi.wordpress.com/2011/03/24/last-ever-non-empty-a-new-fast-mdx-approach/.

To give an example of the current data, the following MDX produces the result shown below it.

```mdx
SELECT {
  [Measures].[Quantity],
  [Measures].[LastQuantity]
} ON COLUMNS,
{ [Time].[Date].[Date] * [Item].[Item].[Item] } ON ROWS
FROM ( SELECT ( {{ ASSP.KeysStrToSet("[Item].[Item]", "1619970,1693546,1597279") }} ) ON COLUMNS
  FROM ( SELECT ( { [Time].[Date].&[2013-05-06T00:00:00]:[Time].[Date].&[2013-05-07T00:00:00] } ) ON COLUMNS
    FROM [CUBE]
))
```

Result:

| Date | Item | Quantity | LastQuantity |
| --- | --- | --- | --- |
| Monday, May 06 2013 | ITEM1 | (null) | 20 |
| Monday, May 06 2013 | ITEM2 | 20 | 20 |
| Monday, May 06 2013 | ITEM3 | 20 | 20 |
| Tuesday, May 07 2013 | ITEM1 | (null) | 20 |
| Tuesday, May 07 2013 | ITEM2 | (null) | 20 |
| Tuesday, May 07 2013 | ITEM3 | (null) | 20 |

However, when I try to group the items up for particular days, the values are added together (I assume because the measure is semi-additive):

```mdx
SELECT {
  [Measures].[Quantity],
  [Measures].[LastQuantity]
} ON COLUMNS,
{ [Time].[Date].[Date] } ON ROWS
FROM ( SELECT ( {{ ASSP.KeysStrToSet("[Item].[Item]", "1619970,1693546,1597279") }} ) ON COLUMNS
  FROM ( SELECT ( { [Time].[Date].&[2013-05-06T00:00:00]:[Time].[Date].&[2013-05-07T00:00:00] } ) ON COLUMNS
    FROM [CUBE]
))
```

Result:

| Date | Quantity | LastQuantity |
| --- | --- | --- |
| May 06 2013 | 40 | 40 |
| May 07 2013 | (null) | 40 |

What I want is the average LastQuantity across all items (excluding those where LastQuantity is null), so in this case it should be 20 for each day.
I have tried a wide variety of queries found online; the following shows some of my attempts:

```mdx
WITH
MEMBER [AverageQuantityTest1] AS ([Measures].[LastQuantity] / [Measures].[TradingFactCount])
MEMBER [CountTest] AS { [Item].[Item] }.COUNT
MEMBER [AverageQuantityTest2] AS [Measures].[LastQuantity] / [CountTest]
SELECT {
  [Measures].[Quantity],
  [Measures].[LastQuantity],
  -- AverageQuantityTest1 doesn't work: TradingFactCount doesn't count rows where
  -- Quantity is null but LastQuantity is not null.
  [AverageQuantityTest1],
  -- CountTest doesn't give the count of children (so AverageQuantityTest2 is wrong).
  [CountTest],
  [AverageQuantityTest2]
} ON COLUMNS,
{ [Time].[Date].[Date] } ON ROWS
FROM ( SELECT ( {{ ASSP.KeysStrToSet("[Item].[Item]", "1619970,1693546,1597279") }} ) ON COLUMNS
  FROM ( SELECT ( { [Time].[Date].&[2013-05-06T00:00:00]:[Time].[Date].&[2013-05-07T00:00:00] } ) ON COLUMNS
    FROM [CUBE]
))
```

Result:

| Date | Quantity | LastQuantity | TradingFactCount | AverageQuantityTest1 | CountTest | AverageQuantityTest2 |
| --- | --- | --- | --- | --- | --- | --- |
| May 06 2013 | 40 | 40 | 3 | 13.3333333333333 | 1 | 40 |
| May 07 2013 | (null) | 40 | 2 | 20 | 1 | 40 |

I guess what would help me find a solution is knowing how to get the true count of items with LastQuantity populated for each day, so I can divide the summed LastQuantity by that; but I am not sure. Perhaps there is another, more obvious way?
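One common approach for this kind of problem (a sketch only, untested against this cube; the measure names `ItemsWithLastQuantity` and `AvgLastQuantity` are made up for illustration) is to build the set of items with a non-null LastQuantity explicitly using NONEMPTY, then count it or average over it. AVG over a set expression skips empty cells, so the null items drop out of the average:

```mdx
WITH
MEMBER [Measures].[ItemsWithLastQuantity] AS
  -- count only the items whose LastQuantity is non-null in the current context
  COUNT(NONEMPTY([Item].[Item].[Item].MEMBERS, [Measures].[LastQuantity]))
MEMBER [Measures].[AvgLastQuantity] AS
  -- average LastQuantity over those same items
  AVG(NONEMPTY([Item].[Item].[Item].MEMBERS, [Measures].[LastQuantity]),
      [Measures].[LastQuantity])
SELECT {
  [Measures].[ItemsWithLastQuantity],
  [Measures].[AvgLastQuantity]
} ON COLUMNS,
{ [Time].[Date].[Date] } ON ROWS
FROM [CUBE]
```

Note that whether this returns the expected 20 per day depends on LastQuantity evaluating correctly at the individual-item level within each date, which is exactly the aggregation issue discussed below, so this may need to be combined with a scoped assignment.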

Any help much appreciated.

• Edited Tuesday, August 13, 2013 8:51 AM

### All replies

• Anyone able to help with this? I'm really stuck; even a hint on where to start looking would help. :(
Friday, August 16, 2013 10:34 AM
• I have done further work on this.

Using the LastQuantity above, I got the correct data when looking at individual items, but not when aggregating per date.
This caused me much confusion until I watched Chris Webb's "Fun with Scopes" conference video (should be compulsory viewing), which helped clear things up.
I believe it doesn't work because the calculations above are applied after aggregation (is that correct?).

To solve this I instead populate an actual measure, using another of his examples:

```mdx
SCOPE(Measures.LastQuantity);
    SCOPE([Time].[Date].[Date].MEMBERS);
        SCOPE([Item].[Item].[Item].MEMBERS);
            THIS = IIF(ISEMPTY(Measures.MaxQuantityDate),
                       NULL,
                       ([Measures].[Quantity],
                        [Time].[Date].[Date].MEMBERS.ITEM(Measures.MaxQuantityDate)));
        END SCOPE;
    END SCOPE;
END SCOPE;
```

This aggregates much better now; however, it is incredibly slow when querying all items across a date range of more than a day or so (I guess the number of cell calculations gets rather large).

I then changed the SSIS package to calculate the quantity in stock per day for every item, so that the cube holds the correct quantity for each day.
For one year this is about 80 million data points.
Querying across a large date range is now really quick, a few seconds.

However, is this an efficient way of doing it?

Tuesday, August 20, 2013 12:06 PM