Deleting Records from Production Causes Blocking

  • Question

  • Hi Gurus,

    I have a problem. I am trying to delete records from a few tables in the production database, but when I execute the DELETE command I see a lot of blocking, and the production application gets errors. Basically, my DELETE command causes many blocks.

    Is there any way I can delete the data from these tables in the production database without causing this blocking?

    Please help me out.

    Thanks

    Tuesday, June 29, 2010 4:06 PM

Answers

  • Divide your DELETE operation into small batches, something like this:

    WHILE 1 = 1
    BEGIN

       -- Delete at most 500,000 rows per iteration so each transaction,
       -- and the locks it holds, stays short.
       DELETE TOP (500000)
       FROM tbl
       WHERE ...;   -- your filter predicate here

       -- The last batch affects fewer rows than the cap; stop there.
       IF @@ROWCOUNT < 500000 BREAK;

    END


    Best Regards, Uri Dimant SQL Server MVP http://dimantdatabasesolutions.blogspot.com/ http://sqlblog.com/blogs/uri_dimant/
    Wednesday, June 30, 2010 6:51 AM
  • Thanks for your help, guys.

    It's working fine now. I am deleting the records in batches: a stored procedure loops through a temp table to get the IDs to delete and deletes 500 rows at a time from production, with a 2-second delay after every batch. It takes time, but it helps a lot in preventing blocking on the production tables. In outline, the job looks something like the sketch below.
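
    A minimal sketch of that pattern; dbo.BigTable, the Id key column, and the date filter are illustrative stand-ins, since the actual procedure is not shown in this thread:

    -- Hypothetical schema: dbo.BigTable(Id INT PRIMARY KEY, CreatedDate DATETIME).
    CREATE TABLE #ids (Id INT PRIMARY KEY);

    -- Collect the IDs of the rows to purge.
    INSERT INTO #ids (Id)
    SELECT Id
    FROM dbo.BigTable
    WHERE CreatedDate < '20100101';

    WHILE EXISTS (SELECT 1 FROM #ids)
    BEGIN
       -- Delete 500 rows per batch so each transaction stays short.
       DELETE t
       FROM dbo.BigTable AS t
       WHERE t.Id IN (SELECT TOP (500) Id FROM #ids ORDER BY Id);

       -- Remove the same 500 IDs from the work list.
       DELETE FROM #ids
       WHERE Id IN (SELECT TOP (500) Id FROM #ids ORDER BY Id);

       -- Pause 2 seconds so other sessions can acquire their locks.
       WAITFOR DELAY '00:00:02';
    END

    DROP TABLE #ids;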

     

    Thanks 

    Wednesday, July 21, 2010 3:55 PM

All replies

  • What transaction isolation level are you using?

    A couple of suggestions:

    1. Make sure your DELETE statements use indexes to locate the rows.

    2. Make your transactions shorter if you can. You can use one transaction per DELETE statement (deleting from one table at a time). If you are deleting a massive number of records, you can split the deletion into smaller chunks; for example, if you need to delete a month's worth of data, it can make sense to split it into four or five separate deletions of one week each, with every DELETE running in its own transaction (see the sketch below).
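
    A hedged sketch of that week-at-a-time split; dbo.BigTable, CreatedDate, and the date range are hypothetical names used for illustration:

    DECLARE @start DATE = '20100501';
    DECLARE @end   DATE = '20100601';
    DECLARE @next  DATE;

    WHILE @start < @end
    BEGIN
       SET @next = DATEADD(WEEK, 1, @start);

       -- Each week's worth of rows is removed in its own short transaction.
       BEGIN TRANSACTION;
          DELETE FROM dbo.BigTable
          WHERE CreatedDate >= @start
            AND CreatedDate <  @next;
       COMMIT TRANSACTION;

       SET @start = @next;
    END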

    Tuesday, June 29, 2010 5:27 PM
  • Thanks for replying. We are using the READ COMMITTED isolation level. I tried to switch the database to READ COMMITTED SNAPSHOT to reduce the blocking, but the change itself was blocked by everything else, so I was not able to change the isolation level. Deleting in small transactions would be a good idea. I am deleting rows by date, but the date column is not indexed. I tried deleting table by table; most tables do not take long, but for one of the tables the transaction keeps running. Any suggestions?
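
    For reference, switching a database to read committed snapshot requires the ALTER to be the only active session in the database, which is presumably why it blocked; the statement looks like this (database name hypothetical):

    -- WITH ROLLBACK IMMEDIATE force-rolls back open transactions so the
    -- ALTER can proceed; use with care on a busy production system.
    ALTER DATABASE YourProductionDb
    SET READ_COMMITTED_SNAPSHOT ON
    WITH ROLLBACK IMMEDIATE;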

    Thanks 

    Wednesday, June 30, 2010 1:38 PM
  • Gurbir, add an index on the Date column. That particular DELETE statement is doing a table scan: it tries to obtain locks on the entire set of data during the scan and gets blocked by other processes.

    If you are going to implement deletion in smaller chunks, again make sure that indexes are used; otherwise you will end up in an even worse situation.

    Also, if you have foreign keys referencing the tables you are deleting from, make sure the child tables have indexes on the referenced columns (see the examples below).
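
    Hedged examples of the suggested indexes, with placeholder table and column names:

    -- Support the date-based DELETE with an index seek instead of a scan.
    CREATE NONCLUSTERED INDEX IX_BigTable_CreatedDate
        ON dbo.BigTable (CreatedDate);

    -- Index the child table's foreign-key column so lookups during the
    -- delete don't have to scan the child table.
    CREATE NONCLUSTERED INDEX IX_ChildTable_ParentId
        ON dbo.ChildTable (ParentId);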

     

    Wednesday, June 30, 2010 2:06 PM