Is there any best way to import more than 20 lakh (2 million) records from a table on one SQL Server to another?
Tuesday, January 24, 2012 9:39 AM
Hello team. I want to know whether there is any best way to import more than 20 lakh (2 million) records in a table between two SQL Servers. I have tried BULK INSERT, Import/Export, OPENDATASOURCE, OPENROWSET, OPENQUERY, and SSIS, and I am still facing a problem: it takes more than 3 hours. I now have to minimize this operational task. Please share your views on this. Thanks in advance to all.
Tuesday, January 24, 2012 6:50 PM
There is some guidance on this in the SQL Server documentation and in a related whitepaper. See:
"Importing and Exporting Bulk Data"
And maybe this is more what you are looking for, since the topic is Performance.
"The Data Loading Performance Guide"
- Proposed As Answer by amber zhang Thursday, January 26, 2012 1:50 AM
Monday, January 30, 2012 9:27 AM
Hello team, can anyone simplify the task mentioned above?
Monday, January 30, 2012 1:54 PM
You are importing 2,000,000 rows from one server to another. At your current rate of three hours, you are only getting about 185 rows per second, which is very slow for a raw bulk import.
But there are many details you would have to supply before anybody could make a direct recommendation, which is why I pointed you to "The Data Loading Performance Guide":
If you can recognize your problem from that whitepaper then you will likely have an idea of what to do. Questions that could be considered include:
- Are you handling this as a single transaction?
- Have you used features to limit the batch sizes that are processed? Are you importing into a staging table or a live table?
- Is the data sorted to match the clustered index, so that you can use the ORDER hint?
- Is locking and blocking a problem?
- Have you checked the wait states to see where the time is going?
- Can you drop indexes for the insert and recreate them afterward?
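To illustrate a few of the points above, here is a minimal T-SQL sketch of a batched, minimally logged load into a staging table. All table, file, and column names here are hypothetical placeholders, not from the original question:

```sql
-- Hypothetical names throughout; adjust to your environment.
-- Optionally drop nonclustered indexes first and recreate them after the load:
-- DROP INDEX IX_Staging_Col1 ON dbo.StagingTable;

BULK INSERT dbo.StagingTable
FROM '\\server\share\path\table.dat'
WITH (
    DATAFILETYPE = 'native',   -- file was exported with bcp -n
    BATCHSIZE    = 10000,      -- commit every 10,000 rows instead of one huge transaction
    TABLOCK,                   -- table-level lock allows minimally logged inserts
    ORDER (Id ASC)             -- only valid if the file is sorted by the clustered key
);

-- CREATE NONCLUSTERED INDEX IX_Staging_Col1 ON dbo.StagingTable (Col1);
```

Note that minimal logging also depends on the database recovery model (SIMPLE or BULK_LOGGED) and the state of the target table, as the whitepaper explains.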
Just knowing which tool you use (BULK INSERT, bcp, OPENROWSET, SSIS, and so on) does not answer the question of how you are inserting and what you have done to optimize the inserts.
Monday, January 30, 2012 4:36 PM
Sometimes I might just use the Data Import Wizard, which uses SSIS to create a package that will transfer data. You can save that package, of course.
I also sometimes use BCP as follows:
BCP sourcedatabase.dbo.sourcetable OUT \\server\share\path\table.dat -S sourceserver -n -T
BCP localdatabase.dbo.localtable IN \\server\share\path\table.dat -S localserver -T -b 10000
You will notice that in this case the -b 10000 option means every 10,000 rows are committed as a separate transaction. This greatly reduces the transaction overhead. The other problems asked about above may still cause you delays.
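If a load like this is still slow, the wait-state question raised earlier in the thread can be checked with a quick query against the standard DMV; a minimal sketch:

```sql
-- Top waits on the instance since the last restart (or statistics clear).
SELECT TOP (10)
    wait_type,
    waiting_tasks_count,
    wait_time_ms
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;
```

Snapshot it before and during the load and compare the deltas to see where the time is actually going.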
- Marked As Answer by Stephanie LvModerator Wednesday, February 01, 2012 7:44 AM