AS 2005 cube not accessible while processing dimensions (ProcessAdd) and partitions (ProcessFull)

  • Question

  • Hello All,

    Background of the issue:
    We have an AS 2005 and SQL 2000 data warehouse application that needs to be refreshed during business hours. When a user wants to refresh the cube data (the AS database contains 6 cubes), he is allowed to mark certain dimensions (ProcessAdd) or certain partitions (ProcessFull) for processing.

    Issue:
    Once the processing job is kicked off to process dimensions, partitions, or both with the above-mentioned process types, we are able to access 4 of those 6 cubes but are unable to access the other 2. We have checked the properties of these cubes and nothing is unusual.
    Note: the calculations in these 2 cubes are relatively heavy.
    What we don't understand here is that, while a dimension used by all 6 cubes of the AS database is being processed, to our surprise 4 of the cubes remain accessible but the other 2 do not. The same scenario holds during partition processing.

    Am I missing anything here? How should I troubleshoot this issue?

    Many thanks for your help in advance...
    Thursday, March 11, 2010 11:49 AM

All replies

  • ".. but unable to access the other 2 cubes .." - what exactly happens? Do you get an error message, a timeout, or does the client tool just hang? In the latter case, what is the client tool - and what are the details of the cube server and of the processing job?
    - Deepak
    Friday, March 12, 2010 3:05 AM
  • I agree with the other poster that more information is required. Based on the current information anything suggested here would be pure conjecture but, with that in mind, one addition to the list of reasons why the cube might not be available is that the cube structure may no longer be processed. Certainly in 2008 it is possible to have a cube where all partitions within it are processed but where the cube itself isn't, and as such the cube is not available for querying. So one thing to check is that the cube structures are processed. If the cube has become unprocessed we would at least know the issue, but not what caused it.

    Beyond that, can you create a reduced test case? By that I mean: does the same thing happen if the database has almost no data? If not, at what point does it start to happen? If it does, what happens if you remove the 4 working cubes and try to process (assuming no linked measure groups here)? Does it still occur? What happens if you switch to using ProcessUpdate (clearly you will need to alter the data set presented to AS here, else you will remove members)? Does it still occur? (I.e. is this a function of ProcessAdd?)

    Sorry that I can't offer more than that at the moment, but hopefully I've given you some way to try to determine what is causing it. Once you know that, you can attempt to fix it.
    • Proposed as answer by Raymond-Lee Monday, March 15, 2010 2:50 AM
    Friday, March 12, 2010 8:09 AM
  • Morning All,

    Hope you all had a great weekend!

    @z1rcon: We have checked the scenario for various processing types except ProcessFull, which we understand makes the cubes unavailable. It behaves the same for the other processing types (ProcessUpdate, ProcessData, ...). We haven't yet tested the case with almost no data in the database. We have observed that once processing is over, the cubes take some time to show up, and we believe this is because the cube structure gets refreshed after processing completes (observed from the 'date modified' of the cubes' XML files on the file system).

    @Deepak: We currently use a SQL Server Agent job to fire an Analysis Services XMLA command to process the dimensions and partitions of the SSAS 2005 database.
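
    For reference, the command is roughly of this shape (a simplified sketch; the database, dimension, cube, measure group, and partition IDs below are placeholders, not our real object names):

        <Batch xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
          <Parallel>
            <!-- incrementally add new members to a dimension the user marked -->
            <Process>
              <Object>
                <DatabaseID>OurASDatabase</DatabaseID>
                <DimensionID>Some Dimension</DimensionID>
              </Object>
              <Type>ProcessAdd</Type>
            </Process>
            <!-- fully reprocess a partition the user marked -->
            <Process>
              <Object>
                <DatabaseID>OurASDatabase</DatabaseID>
                <CubeID>Some Cube</CubeID>
                <MeasureGroupID>Some Measure Group</MeasureGroupID>
                <PartitionID>Some Partition</PartitionID>
              </Object>
              <Type>ProcessFull</Type>
            </Process>
          </Parallel>
        </Batch>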

    Upon investigation of the Flight Recorder logs it appears that the processing of the database, the calculation of the aggregates, and the notifications have all completed. In one instance we waited more than half an hour after the last notification event in the Flight Recorder log before having to restart Analysis Services; during this time it was also impossible to open a new connection to the SSAS engine. When the service restarted, it appeared that the database had been processed correctly, with a Last Processed time approximately the same as the last Progress End event in the Flight Recorder log.

    At the moment we think it might be related to users accessing the cube via ProClarity while it is being processed, and this causing the switch-over between the old and new versions of the SSAS database to be blocked. This behaviour appears only for 2 specific cubes out of the 6 in our SSAS database.

    Hope this gives you some insight into the issue.

    Monday, March 15, 2010 9:46 AM
  • Some Additional Info on the issue:

    Even after the processing job has completed (which we confirmed from the SQL Server Agent job status and from the log details), the cubes are still not available for a few minutes (say 2-3 minutes), and we can confirm that the XML files of the cubes are not modified after job completion.
    Monday, March 15, 2010 9:59 AM
    Note that Analysis Services doesn't have fine-grained locking and as such requires a database-level lock before it commits a processing transaction. This means that it is possible for a long-running query to prevent processing from committing. (See the "ForceCommitTimeout" setting for Analysis Services.) However, I would not have expected the cube to be unavailable during this phase of a transaction. That is odd...
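
    For reference, that is a server-level setting in msmdsrv.ini; a minimal sketch of where it lives (the value is in milliseconds, and 30000 is the default if I remember correctly - check your own instance before changing anything):

        <ConfigurationSettings>
          ...
          <!-- how long a pending processing commit waits before the server starts
               cancelling the queries/transactions that are blocking it -->
          <ForceCommitTimeout>30000</ForceCommitTimeout>
          ...
        </ConfigurationSettings>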

    Based on what you are saying it would appear that Analysis Services is taking a long time to evaluate updated metadata (or something) after processing, causing the behaviour you are seeing. What version (read: service pack / CU) are you running, and have you checked for fixes in any later SPs/CUs if applicable?

    Have you checked whether any mini-dumps are being generated? When Analysis Services generates dump files, depending on the amount of physical memory on the box, this can cause it to hang until the memory has been dumped, which prevents new connections.
    • Marked as answer by loki_thass Tuesday, March 16, 2010 5:53 PM
    Monday, March 15, 2010 11:09 PM
  • @z1rcon: Thanks for the reply!

    We have set the ForceCommitTimeout property to a larger value; with this change the cubes remain available, but there are still delays in accessing the metadata and the cube data.

    We will have a look at the SPs/CUs of the SSAS 2005 pack. We will also have a look at the mini-dumps generated by AS.

    We were running different test cases to narrow down the issue, and to our surprise we found that if we remove most of the cube calculations (say 90% of them), the cube is much faster to access; of course, that does not solve the issue or serve the purpose.

    Could you please let me know how we can track down these calculations to identify the root cause, or share your thoughts on this?
    Tuesday, March 16, 2010 12:30 PM
  • No problem.

    Ah okay, nothing like a reduced test case to make things a little clearer lol. :-) If memory serves, processing a cube (or even a dimension) causes the calculation script to be re-evaluated so yes, that could be the problem right there.

    Tuning the calculation script is probably going to be your best option then, whatever that might mean. As I have no knowledge of the script in question I cannot even surmise what that might entail other than recommend that you follow the information in the AS2005 performance guide, the MDX bottlenecks document etc. (Essentially follow best practice such as using SCOPE where applicable, performing calculations in the correct order etc.)
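
    Just to illustrate the SCOPE point (a toy sketch with made-up measure and level names, not your cube): a scoped assignment restricts the calculation to a subcube up front, which is generally far cheaper than testing coordinates cell-by-cell inside a calculation:

        // apply a 10% uplift only to this measure and this subcube;
        // outside the scope the assignment is never evaluated at all
        SCOPE ([Measures].[Sales Amount], [Reporting Period].[Reporting Period].[Month].Members);
            THIS = [Measures].[Sales Amount] * 1.1;
        END SCOPE;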

    What I am not 100% sure about is what the initial iteration over the calculation script after processing does. (I.e. does it actually perform calculations, or does it just build a calculation tree, or... You get the idea.) That might be your first port of call: try to understand that so that you know what to tune. Got an AS internals book handy? Lol, if you do find out, let me know too please. :-)

    Tuesday, March 16, 2010 1:16 PM
  • Thanks a lot. I haven't got an AS internals book handy.. if you have one, can you please pass it to me - the soft copy :)

    We are using SSAS 2005 SP3 CU6 (9.00.4266.00).

    Meanwhile, I will have a look at what the initial iteration over the cube script does after the processing job completes; as you mentioned, this should give us a clearer idea of what to tune.
    Tuesday, March 16, 2010 1:37 PM
  • Hi,

    Maybe Mosha Pasumansky's MDX Studio can help you track down the calc script issues? You can download it from http://cid-74f04d1ea28ece4e.skydrive.live.com/browse.aspx/MDXStudio/v0.4.14. A forum can be found here: http://www.ssas-info.com/forum/3-mdx-studio/.

    Frank
    Tuesday, March 16, 2010 7:00 PM
  • Thanks Frank!

    I have MDX Studio with me and I will try to troubleshoot the issue using this tool, with the help of the forums.

    Once again thanks a lot for your response.
    Wednesday, March 17, 2010 3:25 PM
  • If you have a test server handy you might also want to try testing this on a server patched to at least SP2 CU10, which was a lot more stable than CU6; otherwise go the whole way, patch to the latest, and see what happens...

    I must say I have only ever used MDX Studio for diagnosing performance issues with calculations etc., by which I mean query-time calculations. I'm not sure how you would use it to diagnose calculation script evaluation after processing though. Frank?
    Wednesday, March 17, 2010 10:39 PM
  • Thanks z1rcon!

    I will try to arrange for a test server to test this scenario with SP2 CU10 rather than SP3 CU6.

    Meanwhile, while debugging the cube script of the problematic cube, I think I found the exact problem: it lies with calculated sets in our cube that use string functions, which make it slower. I had read through the forums and Mosha's blog posts, and they mention that these string functions cause extreme delays. I am attaching sample calculations used in our cube; I am in the process of optimizing them. Please find the calculations below:

    /****************************************************************
      Object  : Calculated Set
      Name    : [Portfolios]
      Caption : Returns a set of members that are at the leaf level of the OU portion of the
                ACO hierarchy. Since the OU hierarchy is ragged, members on the lowest OU
                level may be hidden and therefore cannot be selected with ease in front-end
                software.
    *****************************************************************/

    CREATE SET CURRENTCUBE.[Portfolios] AS
        Generate([ACO].[Current].[Group].Members,
                 Ancestors([ACO].[Current].CurrentMember, 1)) ;

    /****************************************************************
      Object  : Calculated Set
      Name    : [Previous Quarter]
      Caption : Returns a set containing one member: the member on the reporting period
                dimension representing the previous quarter.
    *****************************************************************/

    CREATE SET CURRENTCUBE.[Previous Quarter] AS
        { StrToMember("[Reporting Period].[Reporting Period].[" + Format(Now(), "MMMM yyyy") + "]").Parent.PrevMember } ;

    We have 8 calculated sets that use the StrToMember function and one set that uses the Generate function.

    Any thoughts on this are much appreciated!

    Thanks,

    Lokesh
    Thursday, March 18, 2010 10:03 AM
  • Ooops, sorry, I misread your earlier post - I thought you had said SP2 CU6, where CU10 is more stable. If you are on SP3 you should be okay.

    Ah yes, I remember some of those sets etc. Not pretty. There were some nuances around why we had to do it the way it is done there, which I can't remember now. If you can, see if you can instead capture the current month as an attribute against month and then use Exists. The set is more tricky: as the last fixed level of the hierarchy is Group, and the specification of a portfolio is the parent of Group, the code derives the portfolios as the parent of Group, whatever that might be and at whatever level. I know it appears odd that the dimension is ragged when the table has fixed levels, but the "hide if same name as parent" setting at all levels above Group comes into play.

    So with that said, yes, the StrTo... functions are a necessary evil sometimes, but they are "evil", especially for performance. Block computation goes bye-bye, and I've seen queries take orders of magnitude longer when using those functions than when not. (Caveat: if you use the CONSTRAINED parameter to the StrToMember function etc., block computation is available, but the member being converted needs to be "fully qualified" - I think I remember that correctly.) Even so, performance generally isn't good in my experience.

    Bottom line: if you can refine those, or even rip them out and place them in the front-end tool if applicable (and they are not often used), that would help. I know in the AS2000 days named sets were evaluated on connection; if memory serves, in AS2005 named sets are evaluated after processing and are thus one of the tasks performed when re-evaluating the calc script.

    Hope that helps and is formatted okay. Typing this on my phone and my hand now hurts, lol. Good luck.
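
    To illustrate the CONSTRAINED variant against your [Previous Quarter] set (an untested sketch; note that, as above, with CONSTRAINED the built string must resolve to an exact, fully qualified member name, and the statement errors out if that member doesn't exist):

        CREATE SET CURRENTCUBE.[Previous Quarter] AS
            { StrToMember("[Reporting Period].[Reporting Period].[" + Format(Now(), "MMMM yyyy") + "]",
                          CONSTRAINED).Parent.PrevMember } ;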
    Friday, March 19, 2010 7:44 AM
  • Yesterday I was trying to see if there is an attribute against month so I could use Exists, but unfortunately I could not find anything of that sort. I am not sure why it was designed that way, so I can't really comment on it! I have also tried CONSTRAINED, but as you say the member needs to be fully qualified. As of now it's fine even with StrTo...; it only delays things by 10-15 seconds, so I thought I could revisit it after solving the main devil in our scenario: the Generate function, which takes close to 4 minutes to resolve, and that's the killing part.

    As you mentioned, we either need to refine this calculation or rip it out and place it in the front-end application. But this set is used quite often in most of the users' reports, and they are used to pulling it from the cube rather than having it defined per report; hence I am left with the option of refining it!! Hope that makes sense...

    I tried the option below, because in SSAS 2005 it is an attribute hierarchy (even though ragged in nature), and it was much quicker. It returned the same results as the existing set calculation:

    CREATE SET CURRENTCUBE.[Portfolios] AS { [ACO].[Current].[portfolio].Members } ;
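
    As a sanity check, I compared the two definitions with a query along these lines (the cube name here is a placeholder for our real one); if it returns no rows, the two sets contain exactly the same members:

        WITH
            // the old definition, inlined for a side-by-side comparison
            SET [Portfolios Old] AS
                Generate([ACO].[Current].[Group].Members,
                         Ancestors([ACO].[Current].CurrentMember, 1))
        SELECT {} ON COLUMNS,
               // symmetric difference: members in one set but not the other
               Except([Portfolios Old], [Portfolios]) +
               Except([Portfolios], [Portfolios Old]) ON ROWS
        FROM [Our Cube] ;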

    Let me know your thoughts, and if you come across any alternate approach to refining the Generate function, please shed some light on it!

    Thanks a lot for your prompt responses and the help offered!

    Friday, March 19, 2010 4:19 PM
  • I can't really remember all of this now, but I think that, depending on the client tool being used, this should be okay. If I remember correctly, the "hide member if" property is a hint that a client application may honour but doesn't have to. As such, using the code you have provided may produce slightly different results in the dimension browser than the old code in certain client tools but, like I say, I honestly cannot remember.

    If the new way works and is quick, then use it. Who am I to argue with that? :-)
    Monday, March 22, 2010 1:52 PM
  • lol.. :) Many thanks for all your guidance. I'm doing some basic tests to check that this calculated set is working properly; once I can ensure that it works, I can communicate it to my team.
    Monday, March 22, 2010 3:07 PM