Best practice to delete old data

  • Question

  • I want to delete data older than 30 days from our database. Currently the size of the database is 30 MB, but there are a few tables in this database that will grow to a very large size (hundreds of GBs) within a few days. Even though I have scheduled a full backup job that backs up to disk once every night, I want to move these specific tables to a different filegroup and assign them a separate drive for better usage and performance.
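
    For reference, a minimal sketch of the filegroup move I have in mind, assuming the table already has a clustered index; the database, table, index, column, file, and path names are all placeholders:

    --Hypothetical sketch: add a filegroup on its own drive, then rebuild the table's
    --clustered index onto it, which relocates the table's data. All names are placeholders.
    ALTER DATABASE YourDb ADD FILEGROUP FG_BigTables;
    ALTER DATABASE YourDb
        ADD FILE (NAME = N'YourDb_BigTables', FILENAME = N'E:\Data\YourDb_BigTables.ndf')
        TO FILEGROUP FG_BigTables;
    GO
    --The index name, key column(s), and uniqueness must match the existing clustered index.
    CREATE CLUSTERED INDEX CIX_BigTable
        ON dbo.BigTable (Id)
        WITH (DROP_EXISTING = ON)
        ON FG_BigTables;
    GO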

    Currently, for some of my databases, I have created a job that purges older data on a daily basis. But this also causes fragmentation, and we are not in a position to shut down our apps every night to defragment the indexes; hence we defragment the indexes once every two weeks (per our assigned downtime). Is there a different approach to purge/delete data older than n days from the specific tables mentioned above? I can still stick to my existing purge jobs, but I am actually looking for any different approaches.

    Thanks for your help in advance.

    Tuesday, September 4, 2012 11:53 AM

All replies

  • Check out partitioning. In the meantime, you can perform online rebuilds of your indexes on a regular basis.
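
    A minimal sketch of such an online rebuild, assuming Enterprise edition (ONLINE = ON requires it); the table name is a placeholder:

    --Hypothetical: rebuild every index on the big table without taking it offline,
    --so purge-related fragmentation can be addressed outside the downtime window.
    ALTER INDEX ALL ON dbo.BigTable
        REBUILD WITH (ONLINE = ON);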

    Geert Vanhove

    • Marked as answer by Shulei Chen Wednesday, September 12, 2012 8:07 AM
    Tuesday, September 4, 2012 11:55 AM
  • Something like the example below:


    --SQL2008: archive and delete in a single statement (composable DML)
    insert into db_archive..tbl
    select getdate(), d.*
    from (delete
            from db..tbl
            output deleted.*
            where ...   --purge condition, e.g. rows older than 30 days
         ) d
    go


    Best Regards, Uri Dimant SQL Server MVP, http://sqlblog.com/blogs/uri_dimant/

    MS SQL optimization: MS SQL Development and Optimization
    MS SQL Blog: Large scale of database and data cleansing
    MS SQL Consultants: Improves MS SQL Database Performance

    Tuesday, September 4, 2012 11:57 AM
    Answerer
  • I would suggest that you plan the purging activity around your own criteria, after understanding the impact of the data loss. Schedule it as a SQL Agent job, weekly or monthly as you decide, but make sure you also follow a solid backup strategy in case you ever need the purged data again.
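
    As a rough sketch of the "back up before you purge" point, assuming a copy-only full backup is acceptable; the database name and path are placeholders:

    --Hypothetical: take a copy-only backup just before the purge job step runs,
    --so purged rows can still be restored if they turn out to be needed.
    BACKUP DATABASE YourDb
    TO DISK = N'D:\Backups\YourDb_prepurge.bak'
    WITH COPY_ONLY, INIT;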

    Rama Udaya.K ramaudaya.blogspot.com ---------------------------------------- Please remember to mark the replies as answers if they help and un-mark them if they provide no help.

    Tuesday, September 4, 2012 2:41 PM
  • As mentioned earlier, see if you can use partitioning; it is available only in Enterprise edition. If you don't have Enterprise edition, then you may have to look at deleting the data in small batches (a rough sketch follows the links below). Two good articles that cover these options are:

    http://blogs.msdn.com/b/bartd/archive/2010/06/01/purging-data.aspx

    http://blogs.msdn.com/b/menzos/archive/2008/06/30/table-partitioning-sliding-window-case.aspx
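
    A minimal sketch of the small-batch approach, assuming a datetime column marks row age; the table name, column name, and batch size are placeholders:

    --Hypothetical: delete old rows in small batches so locks stay short and the
    --transaction log can be managed between batches. Names and batch size are placeholders.
    WHILE 1 = 1
    BEGIN
        DELETE TOP (5000)
        FROM dbo.BigTable
        WHERE CreatedDate < DATEADD(DAY, -30, GETDATE());

        IF @@ROWCOUNT = 0
            BREAK;                  --no rows older than 30 days remain

        WAITFOR DELAY '00:00:01';   --brief pause to let other workload through
    END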


    Sankar Reddy|http://SankarReddy.com/|http://twitter.com/SankarReddy13/


    • Edited by Sankar Reddy Tuesday, September 4, 2012 4:30 PM
    • Proposed as answer by Shulei Chen Monday, September 10, 2012 10:27 AM
    • Marked as answer by Shulei Chen Wednesday, September 12, 2012 8:07 AM
    Tuesday, September 4, 2012 4:27 PM
  • You can also write dynamic SQL to generate a delete T-SQL script for all your purging (a rough sketch follows the list below). Once the T-SQL is prepared, you can execute it in small batches and you'll get these three benefits out of it:

    1. No long-term blocking.

    2. Total control - you can stop it at any time and restart it, since you have separate batches with single deletes.

    3. Good performance.
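
    A rough sketch of that generation step, assuming rows are purged one day at a time; all object names are placeholders and the generated script should be reviewed before it is run:

    --Hypothetical: build a script with one small DELETE per old day, so each statement
    --is an independent batch that can be run, stopped, and resumed. Names are placeholders.
    DECLARE @sql nvarchar(max) = N'';

    SELECT @sql = @sql + N'
    DELETE FROM dbo.BigTable WHERE CreatedDate >= ''' + CONVERT(char(8), d.purge_day, 112)
         + N''' AND CreatedDate < ''' + CONVERT(char(8), DATEADD(DAY, 1, d.purge_day), 112) + N''';'
    FROM (SELECT DISTINCT CAST(CreatedDate AS date) AS purge_day
          FROM dbo.BigTable
          WHERE CreatedDate < DATEADD(DAY, -30, GETDATE())) AS d;

    PRINT @sql;                  --review the generated DELETE statements
    --EXEC sp_executesql @sql;   --or copy them into SSMS and run them batch by batch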


    Sarabpreet Singh Anand http://www.sqlservergeeks.com/blogs/sarab http://www.sarabpreet.com This posting is provided "AS IS" with no warranties, and confers no rights. Please remember to click "Mark as Answer" and "Vote as Helpful" on posts that help you. This can be beneficial to other community members reading the thread.

    • Marked as answer by Shulei Chen Wednesday, September 12, 2012 8:07 AM
    Monday, September 10, 2012 10:37 AM
  • Thanks for the replies.
    Monday, September 10, 2012 2:51 PM