Insert/Update a large amount of data (3 million+ rows) in SQL Server

  • Question

  • Hello

    I have to import a file into our database every week. The columns in the file do not match the database schema, so I have to parse the file and build the queries myself. From that point I can insert the data.

    But the next problem is: if a record with the same id already exists, I have to update it; otherwise I have to insert it.

    What's the best practice for a fast import? It currently takes about 1-2 minutes per 10,000 records, which is far too slow.

    Thanks for any suggestions, ideas, etc.

    Friday, March 6, 2009 2:42 PM
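
    One common way to speed up an import like this, assuming the parsed rows can first be written out as a flat file (or streamed in with a bulk-copy API), is to bulk-load everything into a staging table and make the insert/update decision set-based afterwards. The sketch below uses BULK INSERT; the staging table dbo.StagingImport, its columns, and the file path are placeholder assumptions, not the poster's actual schema.

        -- Staging table shaped to match the parsed file (placeholder columns).
        CREATE TABLE dbo.StagingImport
        (
            Id     INT           NOT NULL,
            Name   NVARCHAR(100) NULL,
            Amount DECIMAL(18,2) NULL
        );

        -- Bulk-load the parsed weekly file; TABLOCK allows minimal logging
        -- when the recovery model permits. Terminators must match the real file.
        BULK INSERT dbo.StagingImport
        FROM 'C:\import\weekly_file.csv'
        WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n', TABLOCK);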

All replies

  • That sort of response time is quite ambitious for a bulk insert that also requires an integrity check and an insert-or-update decision for each record, but probably the most efficient method would be to put all of your logic in a stored procedure.
    Paul ~~~~ Microsoft MVP (Visual Basic)
    Sunday, March 8, 2009 1:53 PM
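
    As a sketch of the stored-procedure approach suggested above, the insert-or-update decision can be made in a single set-based statement against a staging table rather than row by row. SQL Server 2008's MERGE is used here; the target table dbo.Records, the staging table dbo.StagingImport, and their columns are assumed names, not the poster's actual schema.

        -- Set-based upsert from the staging table into the target table.
        -- Table and column names are illustrative only.
        CREATE PROCEDURE dbo.usp_ImportWeeklyFile
        AS
        BEGIN
            SET NOCOUNT ON;

            MERGE dbo.Records AS target
            USING dbo.StagingImport AS source
                ON target.Id = source.Id
            WHEN MATCHED THEN
                UPDATE SET target.Name   = source.Name,
                           target.Amount = source.Amount
            WHEN NOT MATCHED BY TARGET THEN
                INSERT (Id, Name, Amount)
                VALUES (source.Id, source.Name, source.Amount);

            -- Clear the staging table for next week's load.
            TRUNCATE TABLE dbo.StagingImport;
        END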
  • You could try using SQL Server Integration Services (SSIS) in this case. SSIS is designed to provide good performance for large imports/exports of data between different data sources.
    Val Mazur (MVP) http://www.xporttools.net
    Monday, March 9, 2009 10:22 AM
    Moderator