SQL Compatibility Issue

  • Question

  • We have changed the SQL Server compatibility level from 100 to 130, and the code below in our stored procedure now behaves differently:

    DECLARE @i decimal(4, 0) = 1,
            @j decimal(4, 1) = 0.5
    
    SELECT SUM(@i) - SUM(@j)

    Result of the select:

    • Compatibility Level 100: 0
    • Compatibility Level 130: 1

    We are not sure how many calculations like this we have in our code. We ran the Microsoft Data Migration Assistant and it gave no recommendations around data types.

    Please let us know how to uncover this kind of issue. Is there any setting in the SQL Server database we can change so it behaves the same as it does under compatibility level 100?

    Wednesday, June 10, 2020 4:58 PM

All replies

  • I confirmed what you are seeing.

    Compatibility level 100-120 = 0

    Compatibility level 130-150 = 1

    This is a complicated problem because you are subtracting decimals with different scales: decimal(4,0) - decimal(4,1).  This should result in decimal(6,1) according to the rules:

    https://docs.microsoft.com/en-us/sql/t-sql/data-types/precision-scale-and-length-transact-sql?view=sql-server-ver15

    According to the documented rules, it looks like they are both wrong.  The result should be 0.5. 

    Any decimal(4,0) - decimal(4,1) = decimal(6,1).  This works correctly.

    Any SUM(decimal(4,0)) - SUM(decimal(4,1)) = decimal(38,0) - decimal(38,1) = decimal(38,1).  According to https://docs.microsoft.com/en-us/sql/t-sql/functions/sum-transact-sql?view=sql-server-ver15#return-types
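
    A quick way to confirm the type the engine actually computes is to wrap the expression in SQL_VARIANT_PROPERTY, which reports an expression's base type, precision, and scale. Just a sketch to check the above:

    DECLARE @i decimal(4, 0) = 1,
            @j decimal(4, 1) = 0.5

    -- Under compatibility level 130 this should report decimal with precision 38 and scale 0,
    -- which is why the 0.5 gets rounded away.
    SELECT SQL_VARIANT_PROPERTY(SUM(@i) - SUM(@j), 'BaseType')  AS result_base_type,
           SQL_VARIANT_PROPERTY(SUM(@i) - SUM(@j), 'Precision') AS result_precision,
           SQL_VARIANT_PROPERTY(SUM(@i) - SUM(@j), 'Scale')     AS result_scale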


    Wednesday, June 10, 2020 5:37 PM
  • Additional interesting information:

    DECLARE @i decimal(4, 0) = 1,
            @j decimal(4, 1) = 0.5
    
    SELECT CAST(@i AS DECIMAL(38,0)) - CAST(@j AS DECIMAL(38,1))   -- = 0 in 100-120, 1 in compatibility 130-150
    SELECT CAST(@i AS DECIMAL(37,0)) - CAST(@j AS DECIMAL(38,1))   -- = 0.5 in all compatibility levels
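
    Working through the documented precision/scale rules explains why that single digit of headroom matters. A sketch of the arithmetic (reusing the @i and @j declared above; SQL_VARIANT_PROPERTY is only used here to check the computed scale):

    -- decimal(38,0) - decimal(38,1): desired precision = max(0,1) + max(38,37) + 1 = 40, capped at 38.
    -- 38 integral digits are needed but only 37 fit beside scale 1, so the scale drops to 0
    -- and the fractional part cannot survive.
    -- decimal(37,0) - decimal(38,1): desired precision = 1 + 37 + 1 = 39, capped at 38.
    -- Only 37 integral digits are needed, so scale 1 survives and the exact 0.5 is kept.
    SELECT SQL_VARIANT_PROPERTY(CAST(@i AS DECIMAL(38,0)) - CAST(@j AS DECIMAL(38,1)), 'Scale') AS scale_when_38,
           SQL_VARIANT_PROPERTY(CAST(@i AS DECIMAL(37,0)) - CAST(@j AS DECIMAL(38,1)), 'Scale') AS scale_when_37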
    


    Wednesday, June 10, 2020 6:03 PM
  • Thanks Tom. I need recommendations for finding the code that behaves differently between compatibility levels 100 and 130, and the steps needed to fix these issues.

    Wednesday, June 10, 2020 6:46 PM
  • You are not going to be able to fix this. This is going to require MS to fix the code and patch it.

    Wednesday, June 10, 2020 6:56 PM
  • Do you know why the link doesn't apply to SQL Azure? BTW, I'm also getting 1, even if I try

    DECLARE @i decimal(3, 0) = 1,
            @j decimal(4, 1) = 0.5
    
    SELECT cast(SUM(@i) - SUM(@j) as decimal(5,1));
    
    
    SELECT CAST(@i AS DECIMAL(38,0)) - CAST(@j AS DECIMAL(38,1)) --= 0 in 100-120, 1 in compatibility 130-150 
    
    SELECT CAST(@i AS DECIMAL(37,0)) - CAST(@j AS DECIMAL(38,1)) --= 0.5 in all compatibility levels

    Running against Azure DB in latest compatibility level.


    For every expert, there is an equal and opposite expert. - Becker's Law



    Wednesday, June 10, 2020 7:25 PM
  • I've been looking at this for a while, and I guess some people think that the old result was a bug, and the new is more correct. But I am not sure.  Here is a simplified version of the problem:

    DECLARE @i decimal(38, 0) = 1,
            @j decimal(38, 1) = 0.5
    SELECT  @i - @j, convert(decimal(38,0), @j)

    The convert always results in 1, no matter the compat level. And as I would expect @j to be converted in the subtraction before anything else, I would expect the subtraction to return 0.

    But since there is different behaviour depending on compat level, it seems that this was an intentional change.

    Is there a way to get the old behaviour in compat level 130? I don't know. If this is a blocking issue for you, you could consider opening a support case. If you can make your business case loud and clear for why you need the old behaviour, it is possible that they will add a trace flag. But I cannot give any guarantees.


    Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se

    Wednesday, June 10, 2020 7:30 PM
  • IMHO, both results are incorrect. The correct result would be 0.5 :)

    For every expert, there is an equal and opposite expert. - Becker's Law



    Thursday, June 11, 2020 3:04 AM
  • This is a documented breaking change for SQL Server 2016.

    The result of 0.5 is rounded to 1 to fit in the computed result type of decimal(38,0).

    There is no data type precedence involved because both inputs are decimal. The precision and scale of the result are computed as documented. Subtracting decimal(38,1) from decimal(38,0) results in a desired type of decimal(40, 1) following the computations detailed on that page. Precision is capped to the maximum of 38. Scale is reduced to zero, as also documented there:

    In addition and subtraction operations, we need max(p1 - s1, p2 - s2) places to store integral part of the decimal number. If there isn't enough space to store them that is, max(p1 - s1, p2 - s2) < min(38, precision) - scale, the scale is reduced to provide enough space for integral part. Resulting scale is MIN(precision, 38) - max(p1 - s1, p2 - s2), so the fractional part might be rounded to fit into the resulting scale.

    Performing the rounding after the subtraction ensures the result is the same as CONVERT(decimal(38,0), 0.5).

    As noted, earlier versions would sometimes round too early. Rounding the 0.5 result of the second sum to 1 gives a calculation of 1 - 1 = 0.

    You cannot preserve the (incorrect) pre-2016 behaviour without using a database compatibility level of less than 130.

    I don't know what the real code does (SUM on a variable does not make much sense), but you should be able to address the problem using explicit data types instead of relying on the implicit ones. There is no reason to suspect that your SUM of small decimals will really require a precision of 38.

    e.g. 

    SELECT CONVERT(decimal(9, 0), SUM(@i)) - CONVERT(decimal(9,1), SUM(@j))
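
    With decimal(9,0) and decimal(9,1) the documented rules give a result type of decimal(11,1), which is nowhere near the 38-digit cap, so no scale reduction happens and the fraction is preserved in every compatibility level. One way to double-check the column type such an expression produces (a sketch; the temp table name is arbitrary):

    DECLARE @i decimal(4, 0) = 1,
            @j decimal(4, 1) = 0.5;

    -- Materialize the expression and read back the precision/scale SQL Server assigned to it
    SELECT CONVERT(decimal(9, 0), SUM(@i)) - CONVERT(decimal(9,1), SUM(@j)) AS fixed_result
    INTO #type_check;

    SELECT c.[precision], c.[scale]
    FROM tempdb.sys.columns AS c
    WHERE c.[object_id] = OBJECT_ID('tempdb..#type_check');

    DROP TABLE #type_check;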
    • Proposed as answer by Naomi N Thursday, June 11, 2020 2:02 PM
    Thursday, June 11, 2020 3:37 AM
  • Hi Rajesh,

    This issue could be reproduced after changing the compatibility level.

    But as mentioned by other experts, the correct result of this query should be 0.5 instead of 1 or 0.

    Below is one example which is correct:

    DECLARE @i decimal(3, 0) = 1,
            @j decimal(4, 1) = 0.5

    SELECT CONVERT(decimal(9, 0), SUM(@i)) - CONVERT(decimal(9,1), SUM(@j))


    To my knowledge, there is no proper method to find this kind of issue. Perhaps a trace flag or trace file may be helpful.

    If you insist on getting the same results as compatibility level 100, you could use one of the methods below to change the compatibility level of the specific database manually.

    1. Right-click the database -> click Properties -> click Options -> select the compatibility level from the dropdown.

    2. Execute the code below as an example:

    ALTER DATABASE AdventureWorks2017
    SET COMPATIBILITY_LEVEL = 100;

    3. You could try to add functions like CAST, CONVERT, or ROUND.

    I tried to use the ROUND function as shown below. The result could be the same as the one in compatibility level 100 after changing the values of @i and @j.

    DECLARE @i decimal(4, 0) = 1,
            @j decimal(4, 1) = 0.5

    SELECT SUM(@i) - SUM(@j)                      -- 0 in compatibility level 100
    SELECT SUM(round(@i,0)) - SUM(round(@j,0))    -- 0 in compatibility level 130


    You could give this function a try, but I cannot guarantee that it fits all situations.

    Best regards,

    Melissa

    -------------------------------------------

    MSDN Community Support



    Thursday, June 11, 2020 6:49 AM
  • Good find.  Thank you

    Thursday, June 11, 2020 12:52 PM
  • As Paul pointed out, your only options are to either 1) use a compatibility level less than 130 or 2) change the code and run in a higher compatibility level.

    The DMA is a good start but it will not identify all issues. You still need to review the breaking change documentation and test your code.
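
    If it helps with that review, a rough way to shortlist modules for inspection is a simple text search over sys.sql_modules (a sketch only; the search patterns are just an assumption about what to look for, and a crude LIKE filter is no substitute for reading and testing the code):

    -- List programmable objects whose definitions contain aggregate calls,
    -- as candidates for manual review of implicit decimal arithmetic.
    SELECT OBJECT_SCHEMA_NAME(m.object_id) AS schema_name,
           OBJECT_NAME(m.object_id)        AS object_name
    FROM sys.sql_modules AS m
    WHERE m.definition LIKE '%SUM(%'
       OR m.definition LIKE '%AVG(%'
    ORDER BY schema_name, object_name;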

    Dan Guzman, Data Platform MVP, http://www.dbdelta.com

    Saturday, June 13, 2020 7:33 PM