
I have been asked by our IT manager to construct an SSAS environment where the same data is stored at different granularities depending on how old it is.

For example, we have a point system determined by how many and what types of projects an engineer completes. We would want to store and process this data at a granularity of one week for points earned in the last month, one month for points earned in the last quarter, and one quarter for points earned in the last three years. The idea is that this will reduce the workload involved in processing the cube.

Is this even possible inside a single cube? My research suggests that it is not. The head of our IT department says that at his previous company they were able to do something along those lines, but that could be a misinterpretation of what was going on behind the scenes, as he is not overly technical. This is the first cube our company has built for production, and none of us have done it before, but I am interested in the subject for the development of my young career and am taking the lead. My thought is that if it is not possible inside a single cube, then maintaining multiple cubes will actually mean more work for us developers and more work for the server, not to mention additional complexity for the users.

So, is it possible inside a single cube? If not, would there be any advantages to implementing it with multiple cubes, or should we just forget about it (not the answer I would like to come back with, but it is what it is)? We currently run 2008 R2 but are upgrading soon to 2014; is there any difference between the multidimensional and tabular models in this respect?

Josh Gallagher

2 Answers


I think it's possible. Have you considered SSAS Tabular? The DAX Patterns article on handling different granularities may help: http://www.daxpatterns.com/handling-different-granularities/

Regards, MikeB


You could do something along the following lines:

Assuming a year-quarter-month-week hierarchy (and ignoring the slightly unrelated issue of defining the exact rules for how weeks aggregate into months), you could set up the time dimension as follows:

    week        month       quarter   year
    (2012)      (2012)      (2012)    2012
    (Q1/2013)   (Q1/2013)   Q1/2013   2013
    (Q2/2013)   (Q2/2013)   Q2/2013   2013
    (Q3/2013)   (Q3/2013)   Q3/2013   2013
    (Q4/2013)   (Q4/2013)   Q4/2013   2013
    (Jan 2014)  Jan 2014    Q1/2014   2014
    (Feb 2014)  Feb 2014    Q1/2014   2014
    (Mar 2014)  Mar 2014    Q1/2014   2014
    (Apr 2014)  Apr 2014    Q2/2014   2014
    w18/2014    May 2014    Q2/2014   2014
    w19/2014    May 2014    Q2/2014   2014
    w20/2014    May 2014    Q2/2014   2014

I.e., you just build artificial week, month, and quarter members that are not as detailed as their column names suggest (I put the name of the period they actually correspond to in parentheses to denote that).
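
To make that concrete, here is a minimal T-SQL sketch of how such collapsed labels could be computed when loading the date dimension. The table and column names (dbo.DimDate, FullDate, the *Label columns) are assumptions for illustration, and the cut-off rules simply mirror the ones from the question, so treat this as a sketch rather than a finished implementation:

    -- Sketch only: compute collapsed week/month/quarter labels per date,
    -- so that older periods stop being week- or month-grained.
    DECLARE @Today date = CAST(GETDATE() AS date);

    SELECT
        d.FullDate,
        WeekLabel =
            CASE
                WHEN d.FullDate >= DATEADD(MONTH, -1, @Today)    -- last month: real weeks
                    THEN 'w' + CAST(DATEPART(ISO_WEEK, d.FullDate) AS varchar(2))
                         + '/' + DATENAME(YEAR, d.FullDate)
                WHEN d.FullDate >= DATEADD(QUARTER, -1, @Today)  -- last quarter: month stands in
                    THEN '(' + DATENAME(MONTH, d.FullDate) + ' ' + DATENAME(YEAR, d.FullDate) + ')'
                WHEN d.FullDate >= DATEADD(YEAR, -3, @Today)     -- last three years: quarter stands in
                    THEN '(Q' + DATENAME(QUARTER, d.FullDate) + '/' + DATENAME(YEAR, d.FullDate) + ')'
                ELSE '(' + DATENAME(YEAR, d.FullDate) + ')'      -- older: year stands in
            END,
        MonthLabel =
            CASE
                WHEN d.FullDate >= DATEADD(QUARTER, -1, @Today)
                    THEN DATENAME(MONTH, d.FullDate) + ' ' + DATENAME(YEAR, d.FullDate)
                WHEN d.FullDate >= DATEADD(YEAR, -3, @Today)
                    THEN '(Q' + DATENAME(QUARTER, d.FullDate) + '/' + DATENAME(YEAR, d.FullDate) + ')'
                ELSE '(' + DATENAME(YEAR, d.FullDate) + ')'
            END,
        QuarterLabel =
            CASE
                WHEN d.FullDate >= DATEADD(YEAR, -3, @Today)
                    THEN 'Q' + DATENAME(QUARTER, d.FullDate) + '/' + DATENAME(YEAR, d.FullDate)
                ELSE '(' + DATENAME(YEAR, d.FullDate) + ')'
            END,
        YearLabel = DATENAME(YEAR, d.FullDate)
    FROM dbo.DimDate AS d;

The week, month, quarter, and year attributes of the cube's date dimension would then be built from these label columns, so the lower levels genuinely contain fewer distinct members for older dates.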

However, I would not assume that the cube is so large that this is really an issue (I cannot imagine you have hundreds of millions of developers). Hence, I would suggest just keeping one common granularity, e.g. "week", as this improves usability a lot. If you have issues with processing time, I doubt they are due to the size of a cube that contains data about developers.

FrankPl
  • I think this will work. Essentially, if I understand you correctly, as part of updating the data warehouse underlying the cube, ensure that the date dimension only has one day for each week in the last month, one day for each month in the last quarter, and one day for each quarter in the last three years. Meanwhile, update the dates in the tables joined to the date dimension to all match that particular date (like the first day of that period); a rough sketch of that appears after these comments. – user3266693 May 07 '14 at 17:28
  • Or maybe just update the other dates and leave the date dimension itself alone. Combined with Mike B.'s tip, that should result in a nice presentation. I kind of misspoke earlier, because I would like to store the actual date and just aggregate it at a higher level, but this will serve, as the actual date will still be stored elsewhere. – user3266693 May 07 '14 at 17:30
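
Following up on the date-snapping idea in the first comment, here is a rough T-SQL sketch. dbo.FactPoints and PointsDate are hypothetical names, the first-day-of-week calculation depends on the server's DATEFIRST setting, and the age thresholds simply mirror the ones from the question:

    -- Sketch only: snap each fact row's date to the first day of the period
    -- it should be aggregated at, based on the row's age.
    DECLARE @Today date = CAST(GETDATE() AS date);

    UPDATE f
    SET f.PointsDate =
        CASE
            WHEN f.PointsDate >= DATEADD(MONTH, -1, @Today)
                -- last month: first day of the week (per the session's DATEFIRST)
                THEN DATEADD(DAY, 1 - DATEPART(WEEKDAY, f.PointsDate), f.PointsDate)
            WHEN f.PointsDate >= DATEADD(QUARTER, -1, @Today)
                -- last quarter: first day of the month
                THEN DATEADD(MONTH, DATEDIFF(MONTH, 0, f.PointsDate), 0)
            WHEN f.PointsDate >= DATEADD(YEAR, -3, @Today)
                -- last three years: first day of the quarter
                THEN DATEADD(QUARTER, DATEDIFF(QUARTER, 0, f.PointsDate), 0)
            ELSE
                -- older: first day of the year
                DATEADD(YEAR, DATEDIFF(YEAR, 0, f.PointsDate), 0)
        END
    FROM dbo.FactPoints AS f;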