What causes the OlapQueryLog to stop populating?

Answers

  • Hello Elizabeth,

    If your QueryLogConnectionString is populated but no usage data is being collected, I would suggest changing the server where the log table is located; if it is remote, make it local. Additionally, check the SQL Server error log for any errors that might be related to the QueryLog database, and look in the server event log for events and errors associated with the Analysis Services query log.

    How long have you been running query sampling? You usually need about 24 hours of collected data for the Usage Based Optimization Wizard. Note that if your QueryLogSampling is set to 1, this can create a huge number of records in the log table.

    SSAS maintains the QueryLog table and deletes old history records so it does not grow too large. You might want to check the time the last query was added (the StartTime column).
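    A quick way to check when logging stopped is to query the log table directly. A minimal sketch, assuming the default table name OlapQueryLog (substitute whatever your QueryLogTableName property is set to):

    ```sql
    -- Show the most recent logged query and the current row count.
    -- dbo.OlapQueryLog is the default table name; yours may differ.
    SELECT MAX(StartTime) AS LastLoggedQuery,
           COUNT(*)       AS TotalRows
    FROM dbo.OlapQueryLog;
    ```

    If LastLoggedQuery is well in the past while the cube is actively being queried, logging has stopped and the connection string, permissions, or table size are the usual suspects.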

    More information can be found here: http://technet.microsoft.com/en-us/library/cc917676.aspx

     

    Best regards, Vlad.

     

    • Marked as answer by Vlad Ts - MSFT Wednesday, February 22, 2012 9:24 PM
    Tuesday, January 3, 2012 11:20 PM
    It appears that the issue with the OLAPQueryLog stopping (the log was local to the server) may have been due to the size of the log. I haven't had the issue since I began a process that copies the query log records to another database and then deletes any records older than 6 months, shrinking the OLAPQueryLog from over 47.7 million records to just over 1.9 million.
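    An archive-and-trim job of the kind described above could be sketched roughly as follows. The archive database name is an assumption, and the column list matches the standard OlapQueryLog schema; the actual job may differ:

    ```sql
    -- Hypothetical archive job: copy rows older than 6 months into an
    -- archive database (QueryLogArchive is an assumed name), then delete
    -- them from the live log so the table stays small.
    DECLARE @cutoff datetime = DATEADD(month, -6, GETDATE());

    BEGIN TRANSACTION;

    INSERT INTO QueryLogArchive.dbo.OlapQueryLog
        (MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration)
    SELECT MSOLAP_Database, MSOLAP_ObjectPath, MSOLAP_User, Dataset, StartTime, Duration
    FROM dbo.OlapQueryLog
    WHERE StartTime < @cutoff;

    DELETE FROM dbo.OlapQueryLog
    WHERE StartTime < @cutoff;

    COMMIT TRANSACTION;
    ```

    Wrapping the copy and delete in one transaction keeps a row from being removed before it has been archived.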

    Unfortunately, we're still "old-school" and are using the Analysis Services 2005 OLAPQueryLog with sampling set to 1 (plus a daily job to copy records) as a way to track usage counts until we can move to the newer versions of SQL Server and their data management tools. We realize that a log entry does not map one-to-one to a customer's query (a single query can execute multiple sub-queries), but it's a "close-enough" metric for our purposes and for analyzing which teams are using the cubes. We also export the results daily because we realize the log will auto-delete records when the cube structure changes and the usage metrics (dataset) are no longer relevant for usage-based optimization.

    In the meantime, we have the daily export process and have built a cube off our usage log data to do some high-level analysis... that's taken with a grain of salt.

    Thanks so much for the additional resources and suggestions.  I appreciate your time!


    Elizabeth Priddy

    • Marked as answer by Vlad Ts - MSFT Wednesday, February 22, 2012 10:00 PM
    Wednesday, February 22, 2012 9:51 PM
