SQL Azure throwing TCP or timeout exceptions for large numbers of insert/update queries

  • Question

  • Hello friends,

    Hopefully I can get a satisfactory and correct answer about Azure SQL insert/update queries. Let me explain the situation first so that you have a better idea of how to respond:

    We are migrating from SQL Server 2008 to Azure SQL and are running the required patches and processes, but recently hit an odd problem. Our main table has approximately 26 million rows that are inserted or updated on a regular basis, so we tried to do the same against Azure SQL. But each time, Azure SQL throws a TCP or timeout exception; it seems unable to process that volume of data.

    I tried some googling but didn't find a satisfactory answer. There are a few options such as batch processing, SSIS, BCP, or the API, but these don't fulfil my requirement. I need to run a couple of queries over a large data set (for example, 1 million rows / 10 lakh) in one sequence, either insert or update (see the batching sketch after this post).

    Can anybody suggest the best way to achieve this in Azure SQL, or is there a limitation?

    Regards

    Rajendra

    Thursday, January 8, 2015 10:57 AM
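
    One common workaround for this kind of load is to send the rows to Azure SQL in fixed-size batches over a single connection, committing per batch, rather than issuing one huge statement or one round trip per row. Below is a minimal sketch of that idea using Python and pyodbc; the connection string, table name, column names, and batch size are assumptions for illustration, not values from this thread.

    ```python
    # Minimal sketch: batched inserts into Azure SQL with pyodbc.
    # Server, database, credentials, table, and column names are placeholders.
    import pyodbc

    conn_str = (
        "Driver={ODBC Driver 17 for SQL Server};"
        "Server=tcp:myserver.database.windows.net,1433;"
        "Database=mydb;Uid=myuser;Pwd=mypassword;"
        "Encrypt=yes;TrustServerCertificate=no;Connection Timeout=30;"
    )

    def insert_in_batches(rows, batch_size=10_000):
        """Insert rows given as (id, value) tuples in fixed-size batches."""
        with pyodbc.connect(conn_str) as conn:
            cursor = conn.cursor()
            cursor.fast_executemany = True  # send parameter arrays instead of row-by-row round trips
            for start in range(0, len(rows), batch_size):
                batch = rows[start:start + batch_size]
                cursor.executemany(
                    "INSERT INTO dbo.MainTable (Id, Value) VALUES (?, ?)", batch
                )
                conn.commit()  # commit per batch to keep each transaction small
    ```

    Committing per batch keeps individual transactions small, which tends to reduce the chance of hitting the resource limits that surface as connection resets or command timeouts on Azure SQL.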

All replies

  • No one's been able to answer this question in nearly a year? I'm running into connection timeouts with inserts of more than 10,000 records on my S1 database. I've tried chunking this up into 500 records at a time, with a 60-second sleep between each call, and it still eventually times out. I tried scaling the database up to a P11 (the largest tier I can scale to) and it still times out at the same point. My on-premises SQL Server can do this in 2-3 minutes. This is crazy. (See the retry sketch after this reply.)

    Russ - every question has an answer, but not every asker has the persistence

    Wednesday, November 11, 2015 3:04 AM
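
    Timeouts like the ones described above are often transient throttling or network errors, and the usual guidance for Azure SQL is to retry the failed batch with a back-off rather than relying on a fixed sleep between calls. Below is a rough sketch of that pattern in Python with pyodbc; the set of "transient" error codes and the delay values are illustrative assumptions, not an official list.

    ```python
    # Rough sketch: retry one batch with exponential back-off on transient errors.
    # The transient error markers and delays below are illustrative, not exhaustive.
    import time
    import pyodbc

    TRANSIENT_MARKERS = {"40501", "40613", "08S01", "HYT00"}  # throttling / timeout style errors

    def execute_with_retry(cursor, sql, params, max_attempts=5):
        """Run one batched statement, retrying if the error looks transient."""
        for attempt in range(1, max_attempts + 1):
            try:
                cursor.executemany(sql, params)
                cursor.connection.commit()
                return
            except pyodbc.Error as exc:
                message = str(exc)
                transient = any(marker in message for marker in TRANSIENT_MARKERS)
                if attempt == max_attempts or not transient:
                    raise  # give up on non-transient errors or after the final attempt
                time.sleep(2 ** attempt)  # exponential back-off before retrying the batch
    ```

    Pairing this retry logic with the per-batch commits shown earlier means a single dropped connection or throttling event only costs one batch, not the whole load.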