Merge Statement Error Msg 8672, Level 16, State 1 in SQL Server 2014

  • Question

  • I am trying to do an update/insert operation using a MERGE statement, but I am getting the error below:

    "

    The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row. A MERGE statement cannot UPDATE/DELETE the same row of the target table multiple times. Refine the ON clause to ensure a target row matches at most one source row, or use the GROUP BY clause to group the source rows.

    "

    I am using SQL Server 2014. I know there was a bug in older versions of SQL Server (e.g. SQL Server 2008). Does this issue still persist in the latest version of SQL Server?
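    For reference, a minimal sketch (with hypothetical table and column names) that raises error 8672, because two source rows match the same target row:

    ```sql
    -- Hypothetical minimal repro: two source rows share the same key.
    CREATE TABLE #target (id INT, val INT);
    CREATE TABLE #source (id INT, val INT);

    INSERT INTO #target VALUES (1, 10);
    INSERT INTO #source VALUES (1, 20), (1, 30);   -- duplicate key in source

    MERGE #target AS t
    USING #source AS s
       ON t.id = s.id
    WHEN MATCHED THEN
        UPDATE SET t.val = s.val
    WHEN NOT MATCHED THEN
        INSERT (id, val) VALUES (s.id, s.val);
    -- Raises Msg 8672: the target row id = 1 matches both source rows.
    ```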

    Wednesday, August 5, 2015 12:00 PM

Answers

  • You need to fix the source of the problem, not apply patches. If your source has duplicate rows for the same key used to update, how would you know which row will be used for the update? The simple solution is to eliminate the duplicates, either by joining on the correct combination of keys or by creating an extra key with the ROW_NUMBER() approach.
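    A sketch of the ROW_NUMBER() approach, assuming hypothetical tables #source/#target with columns id, val, and a ModifiedDate column used to decide which duplicate wins:

    ```sql
    -- Keep exactly one source row per key before the MERGE;
    -- here the row with the latest ModifiedDate wins (adjust as needed).
    MERGE #target AS t
    USING (
        SELECT id, val
        FROM (
            SELECT id, val,
                   ROW_NUMBER() OVER (PARTITION BY id
                                      ORDER BY ModifiedDate DESC) AS rn
            FROM #source
        ) AS d
        WHERE rn = 1            -- one row per id: no double matches
    ) AS s
       ON t.id = s.id
    WHEN MATCHED THEN
        UPDATE SET t.val = s.val
    WHEN NOT MATCHED THEN
        INSERT (id, val) VALUES (s.id, s.val);
    ```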

    For every expert, there is an equal and opposite expert. - Becker's Law


    My blog


    My TechNet articles

    • Marked as answer by kumarvik Thursday, August 6, 2015 12:56 PM
    Thursday, August 6, 2015 12:33 PM
    Moderator

All replies

  • I cannot say there are no bugs in 2014, but almost certainly you are getting this error because your merge statement is doing exactly what this error message says, "The MERGE statement attempted to UPDATE or DELETE the same row more than once. This happens when a target row matches more than one source row."

    If you post your table structure (including any constraints: primary key, foreign key, etc.) and the MERGE statement you are running, we may be able to help you find the row(s) that are causing the duplicate matches.

    Tom

    Wednesday, August 5, 2015 4:40 PM
  • Hi Kumarvik,

    This error will occur when you join on incomplete keys, i.e. when the ON clause does not uniquely identify a target row. So please check your query.


    Please don't forget to mark as answer; it helps others find relevant posts on the same question. Milan Das

    Wednesday, August 5, 2015 4:46 PM
  • The MERGE statement attempted to UPDATE or DELETE the same row more than once. ..

    I am using SQL Server 2014. I knew there was a bug in older version of SQL Server (like version 2008). Does this issue still persist in latest version of SQL Server?

    I am curious; what did you expect to happen when you simultaneously UPDATE and DELETE the same row?

    --CELKO-- <SNIP>


    Wednesday, August 5, 2015 10:39 PM
  • Check whether any duplicate values exist in your data set.

    Amit Kr Mishra, IT Consultant

    Wednesday, August 5, 2015 10:42 PM
  • @Kumarvik, there might be 2 reasons:

    1. Your source/target tables might have DUPLICATES.

    2. There might be NO DUPLICATES, but you might be using the NOLOCK hint. If so, and your source table is very large and under constant change (INSERT/UPDATE/DELETE), page splits can cause the scan to read a single source row TWICE. Try removing NOLOCK and rerun your MERGE statement.
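    A sketch of that change, assuming a hypothetical dbo.BigSource table; only the hint is removed, the rest of the statement stays the same:

    ```sql
    -- Before (can read a source row twice during a page split):
    --   USING (SELECT id, val FROM dbo.BigSource WITH (NOLOCK)) AS s
    -- After (reads at the session's normal isolation level):
    MERGE #target AS t
    USING (SELECT id, val FROM dbo.BigSource) AS s
       ON t.id = s.id
    WHEN MATCHED THEN
        UPDATE SET t.val = s.val;
    ```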


    ~manoj | email: http://scr.im/m22g
    http://SQLwithManoj.com
    MCCA 2011 | My FB Page

    Thursday, August 6, 2015 6:09 AM
  • Hello, I know the problem: my source table has duplicate rows, and MERGE throws this error. To avoid it, I use separate UPDATE and INSERT statements. Recently I saw an answer to the same question on this link, which mentions the version of SQL Server being used. I tested a MERGE statement in SQL Server 2014 by adding duplicate rows to the source table and got the same (expected) error. Also, this Fix from Microsoft prompted me to ask this question here to get an expert view.
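    The separate-statement workaround mentioned above can be sketched as follows (hypothetical names; note the UPDATE silently applies one of the duplicate source rows, so dedupe first if the choice matters):

    ```sql
    -- Update existing target rows from the source.
    UPDATE t
    SET    t.val = s.val
    FROM   #target AS t
    JOIN   #source AS s ON t.id = s.id;

    -- Insert source rows whose key is not yet in the target.
    INSERT INTO #target (id, val)
    SELECT DISTINCT s.id, s.val
    FROM   #source AS s
    WHERE  NOT EXISTS (SELECT 1 FROM #target AS t WHERE t.id = s.id);
    ```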

    Thursday, August 6, 2015 10:09 AM