Strategy for a database deployment

  • Question

  • Hello,

    I would like to hear your thoughts on a strategy for deployments. Here is the scenario.

    Management and the dev team want me to do the deployment on a prod database without stopping the web services (per the dev team, these are web services, not console or Windows services, hence they cannot be stopped). I take a pre-deployment database backup, then run the database scripts that were already tested in the staging/QA/test/development environments against prod, and if the scripts run without error I take a post-deployment database backup. The web services stay up and running while I am doing the database deployment.

    Now let's say the application is not working the way it's supposed to after the deployment, or not working the way it did in the dev or QA environments, and we decide to roll back. In this scenario we don't want customers to lose the data they submitted during our deployment. I would like to hear your thoughts so we can come up with the best strategy.
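    A minimal T-SQL sketch of the backup steps described above (the database name, file paths, and backup names are placeholders):

        -- Pre-deployment backup; COPY_ONLY keeps it out of the regular backup chain
        BACKUP DATABASE [ProdDb]
            TO DISK = N'D:\Backups\ProdDb_PreDeploy.bak'
            WITH COPY_ONLY, CHECKSUM, INIT,
                 NAME = N'ProdDb pre-deployment backup';

        -- ... run the deployment scripts that were validated in staging/QA/test ...

        -- Post-deployment backup, taken only if the scripts completed without error
        BACKUP DATABASE [ProdDb]
            TO DISK = N'D:\Backups\ProdDb_PostDeploy.bak'
            WITH COPY_ONLY, CHECKSUM, INIT,
                 NAME = N'ProdDb post-deployment backup';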

    Thanks in advance



    Wednesday, October 30, 2013 3:01 PM


All replies

    • Now let's say the application is not working the way it's supposed to after the deployment, or not working the way it did in the dev or QA environments, and we decide to roll back.

    You can do a schema compare between the new and old design to create a set of rollback scripts.  But if the application is not working correctly after the upgrade, you may have logically incorrect data in the database.  Also, you will lose any data in new tables or new columns.

    SSDT helps streamline and automate this process: http://msdn.microsoft.com/en-us/data/tools.aspx
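    As an illustration of the data-loss point above: if the deployment added a table, a rollback script generated from a schema compare will typically just drop it, so one option is to copy those rows aside first. The table names below are purely hypothetical:

        -- Illustrative only: preserve rows from a table the deployment added
        -- before the generated rollback script drops it (names are placeholders)
        SELECT *
        INTO   dbo.OrderNotes_Preserved        -- holding copy that the rollback will not touch
        FROM   dbo.OrderNotes;                 -- hypothetical table added by the deployment

        -- ... then run the rollback script produced by the schema compare ...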

    David


    David http://blogs.msdn.com/b/dbrowne/



    Wednesday, October 30, 2013 3:22 PM
    • Now let's say the application is not working the way it's supposed to after the deployment, or not working the way it did in the dev or QA environments, and we decide to roll back.

    You can do a schema compare between the new and old design to create a set of rollback scripts.

    SSDT helps streamline and automate this process: http://msdn.microsoft.com/en-us/data/tools.aspx

    David


    David http://blogs.msdn.com/b/dbrowne/


    Thanks for the reply. But the rollback includes DDL and DML. So is there a way to do the rollback without downtime, given that customers may be submitting their data during the rollback too?
    Wednesday, October 30, 2013 3:31 PM
  • Whatever combination of DDL and DML the upgrade/downgrade scripts use, SQL Server will use locking to prevent inconsistent database access during the upgrade. 

    If you run the scripts in a transaction with an elevated deadlock priority, then they should succeed, and the users will mostly see just short delays while they wait for the scripts to complete.  If a user is in the middle of a multi-object transaction when the scripts run, there's a chance they will be a deadlock victim.
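    A minimal sketch of running the scripts that way (the statements inside the transaction are placeholders for the tested deployment script):

        -- Run the deployment in one transaction with elevated deadlock priority,
        -- so concurrent user sessions (at NORMAL priority) become the deadlock victims instead.
        SET XACT_ABORT ON;            -- abort and roll back everything if any statement fails
        SET DEADLOCK_PRIORITY HIGH;

        BEGIN TRANSACTION;

            -- ... DDL/DML from the tested upgrade/downgrade script goes here, e.g.
            -- ALTER TABLE dbo.Orders ADD Notes nvarchar(400) NULL;   (placeholder)

        COMMIT TRANSACTION;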

    David


    David http://blogs.msdn.com/b/dbrowne/

    • Proposed as answer by Sofiya Li Thursday, October 31, 2013 8:41 AM
    • Marked as answer by Sofiya Li Monday, November 4, 2013 12:39 PM
    Wednesday, October 30, 2013 3:35 PM