Change in compatibility level from 90 to 110 leading to excessive blocking

    Question

  • Hi,

    We have a machine running Windows Server 2008 R2 64-bit Standard Edition SP1. On it we have a SQL Server 2008 R2 Standard Edition SP3 instance. This instance hosts a database running at compatibility level 90. To upgrade the database, we installed another instance, SQL Server 2012 Standard Edition SP3. We then restored the database backup from the 2008 R2 instance to the 2012 instance and changed the compatibility level to 110. This led to excessive blocking even in the testing phase. When the compatibility level was changed back to 90, the issue was resolved. Please advise on the possible reason and a workaround, as we need to change it to 110.

    Thanks in advance.

    Thursday, December 15, 2016 7:55 AM

All replies

  • As stated on MSDN for ALTER DATABASE SET COMPATIBILITY_LEVEL:

    Compatibility Levels and Stored Procedures

    When a stored procedure executes, it uses the current compatibility level of the database in which it is defined. When the compatibility setting of a database is changed, all of its stored procedures are automatically recompiled accordingly.

    Instead of going back and forth between compatibility levels, you should try optimizing the procedure so that the generated query plan is better suited to the range of parameter values being passed.
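
    For example, a parameter-sensitive procedure can be made to compile a fresh plan on every execution with a RECOMPILE hint. A minimal sketch; the procedure, table, and parameter names here are hypothetical:

    ```sql
    -- Hypothetical procedure: OPTION (RECOMPILE) compiles a new plan
    -- for each execution, so a plan tuned for one atypical parameter
    -- value is never cached and reused for all later calls.
    ALTER PROCEDURE dbo.usp_GetOrders
        @CustomerId INT
    AS
    BEGIN
        SELECT OrderId, OrderDate, TotalDue
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId
        OPTION (RECOMPILE);
    END;
    ```

    The trade-off is extra compilation cost per call, so this suits procedures whose plans genuinely vary by parameter value.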


    Best Regards, Uri Dimant SQL Server MVP, http://sqlblog.com/blogs/uri_dimant/

    MS SQL optimization: MS SQL Development and Optimization
    MS SQL Consulting: Large scale of database and data cleansing
    Remote DBA Services: Improves MS SQL Database Performance
    SQL Server Integration Services: Business Intelligence

    Thursday, December 15, 2016 8:16 AM
  • Hi

    The best practice would be the following:

    Changing the compatibility level while users are connected to the database can produce incorrect result sets for active queries. For example, if the compatibility level changes while a query plan is being compiled, the compiled plan might be based on both the old and new compatibility levels, resulting in an incorrect plan and potentially inaccurate results. Furthermore, the problem may be compounded if the plan is placed in the plan cache and reused for subsequent queries. To avoid inaccurate query results, we recommend the following procedure to change the compatibility level of a database:
    1) Set the database to single-user access mode by using ALTER DATABASE SET SINGLE_USER.
    2) Change the compatibility level of the database.
    3) Put the database in multiuser access mode by using ALTER DATABASE SET MULTI_USER.

    For more information about setting the access mode of a database, see ALTER DATABASE (Transact-SQL).

    * Extracted from BOL
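
    The steps above can be sketched as follows (the database name is hypothetical; note that WITH ROLLBACK IMMEDIATE disconnects active sessions, so run this in a maintenance window):

    ```sql
    -- 1) Single-user mode; open transactions are rolled back immediately
    ALTER DATABASE MyDb SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

    -- 2) Change the compatibility level to SQL Server 2012 (110)
    ALTER DATABASE MyDb SET COMPATIBILITY_LEVEL = 110;

    -- 3) Return the database to multiuser access
    ALTER DATABASE MyDb SET MULTI_USER;
    ```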


    Ramesh. M

    Thursday, December 15, 2016 8:32 AM
  • Apparently you have a query or two for which the query plan differs between compatibility levels 90 and 110. Or at least so it seems; there could be other issues involved. But until the opposite is proven, you can assume that the compatibility level is the issue.

    You need to track down which query or queries are involved. Once you know the queries, there is every reason to investigate whether better indexes can be added to support them, or whether the queries need some other tweak.
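
    One way to identify the blocked and blocking statements while the problem is occurring is to query the dynamic management views. A sketch; the column selection is illustrative:

    ```sql
    -- Sessions currently blocked, the session blocking them,
    -- and the statement text of the blocked request.
    SELECT r.session_id,
           r.blocking_session_id,
           r.wait_type,
           r.wait_time,
           t.text AS blocked_sql
    FROM sys.dm_exec_requests AS r
    CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
    WHERE r.blocking_session_id <> 0;
    ```

    From the blocking_session_id you can then pull the head blocker's statement the same way and compare its plan under both compatibility levels.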

    Thursday, December 15, 2016 10:22 PM
  • Compatibility levels should not be an issue. See here:

    http://www.sqlmvp.org/sql-server-database-compatibility-level-performance/

    http://blogs.lessthandot.com/index.php/datamgmt/dbadmin/upgrading-comp-level-sql-server/

    Friday, December 16, 2016 6:00 AM