Best way to insert millions of records into SQL Azure on a daily basis?

  • Question

  • I am maintaining millions of records in SQL Server 2008 R2 and now intend to migrate them to SQL Azure.
    In the existing SQL Server 2008 R2 system, a few SSIS packages and stored procedures first truncate the existing records and then insert into the table, which holds approximately 26 million records, within about 30 minutes on a daily basis (as the system demands).
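
    The daily load is essentially the pattern sketched below (the table and column names are placeholders, not the real schema):

        -- Simplified sketch of the existing on-premises daily load (object names are hypothetical)
        TRUNCATE TABLE dbo.DailySnapshot;

        INSERT INTO dbo.DailySnapshot (Col1, Col2, Col3)
        SELECT Col1, Col2, Col3
        FROM dbo.SourceTable;  -- approx. 26 million rows, completes in roughly 30 minutes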

    Since migrating these to SQL Azure, I have been unable to perform these operations as quickly as I did in SQL Server 2008, and I sometimes get a request timeout error.

    While searching for a faster way, I found many suggestions for batch processing or BCP, but batch processing is NOT suitable in my case because it takes too long to insert those records. I need a faster, more efficient approach on SQL Azure.

    Hoping for some good suggestions.

    Thanks in advance :)
    Ashish Narnoli
    Thursday, January 8, 2015 11:15 AM

All replies

  • Hi,

    Please take a look at the link below and see if it helps.

    https://alexandrebrisebois.wordpress.com/2012/10/22/inserting-large-amounts-of-data-into-windows-azure-sql-database/

    Regards,

    Mekh.

    Thursday, January 8, 2015 12:48 PM
  • What service tier are you using currently? If you are not using the highest tier (P3), give it a try and see if it speeds up processing. If your system design can accommodate it, you can run the insert using the highest tier and then scale back down to a less pricey tier.
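
    If it fits your workflow, the tier change can also be scripted in T-SQL rather than done through the portal. A rough sketch (the database name and target tiers below are only examples, and the change is asynchronous, so let it complete before starting the load):

        -- Run against the master database of the logical server; the database name is hypothetical.
        -- Scale up just before the daily load:
        ALTER DATABASE [MyDatabase] MODIFY (EDITION = 'Premium', SERVICE_OBJECTIVE = 'P3');

        -- ...run the load, then scale back down to limit cost:
        ALTER DATABASE [MyDatabase] MODIFY (EDITION = 'Standard', SERVICE_OBJECTIVE = 'S2');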

    Frank

    Thursday, January 8, 2015 3:06 PM
  • Hi Frank,

    Thanks for your reply. I upgraded to the Standard S2 pricing tier, but no luck.

    I am still stuck with the same problem. I tried a simple query that just returns a MAX value, and even that takes a long time. These operations run very fast on MS SQL Server but crawl when running on Azure SQL.

    Thanks!

    Ashish Narnoli

    Wednesday, February 11, 2015 9:59 AM
  • No matter how you do the insert, you still have to send 26 million records over the network, so you are not going to get the kind of performance you are used to. If possible, send the records in a file (a .csv, for example) to Azure storage first, before doing the inserts. It may not be practical, and it may not improve performance, but you can also look into scaling SQL up to do the inserts and then scaling it back down so you don't pay as much.
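
    As a rough illustration of the insert side (all object names and the batch size below are placeholders, and it assumes the file has already been loaded into a staging table in the Azure database, for example with bcp or SSIS running close to it), moving the rows in chunks keeps each transaction small, so a single statement is less likely to hit the timeout:

        -- Move rows from the staging table into the target in chunks.
        -- Note: OUTPUT ... INTO requires the target to have no enabled triggers
        -- and no foreign key relationships.
        DECLARE @BatchSize int = 100000;
        DECLARE @Rows int = 1;

        WHILE @Rows > 0
        BEGIN
            DELETE TOP (@BatchSize)
            FROM dbo.StagingTable
            OUTPUT deleted.Col1, deleted.Col2, deleted.Col3
            INTO dbo.TargetTable (Col1, Col2, Col3);

            SET @Rows = @@ROWCOUNT;
        END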

    Frank

    Wednesday, February 11, 2015 1:20 PM
  • +1 to Frank's advice.

    Also, please upgrade your Azure SQL Database server to V12, as you will get higher performance on the premium tiers. As you scale up your database for the bulk insert, remember that SQL Database charges by the hour. To minimize costs, scale back down when the inserts have completed.
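
    One way to confirm what the database is currently running on (and to verify that a scale-up or scale-down has taken effect) is a quick check from the database itself:

        -- Current edition and performance level of the connected database.
        SELECT DATABASEPROPERTYEX(DB_NAME(), 'Edition')          AS Edition,
               DATABASEPROPERTYEX(DB_NAME(), 'ServiceObjective') AS ServiceObjective;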

    Wednesday, February 11, 2015 2:08 PM