Insert multiple rows into database

  • Question

  • I have to perform an insert operation into a database.
    The insert query has to insert multiple records, and the number of records is dynamic.

    I wrote a function that accepts an input array of structures;
    each structure holds the information for one row to be added to the database.
    The array length is dynamic and is not known until runtime.

    How should I perform this operation?
    A single INSERT statement would not work.
    Should I build a DataSet and then push it to the database?
    Please suggest a good approach to this problem.

    Friday, August 21, 2009 4:32 AM

Answers

  • You are probably best off inserting the data into a DataSet and then using a data adapter to push the updates to the database (we don't work this way at my office, so I don't have any sample code; perhaps someone else can help, or you could search for sample code).

    If you are using SQL Server 2005 or later, you could also convert the data to XML, pass it into a stored procedure, and use XQuery inside the stored procedure to do bulk inserts from the XML parameter. However, since there are no statistics on the XML parameter, this can cause problems for the optimiser.

    Depending on the amount of data, you might want to write it to a CSV file (or some other compatible format) and investigate the SqlBulkCopy utility, which inserts large amounts of data as fast as possible.

    If you're not inserting truly huge amounts of data, the DataSet is probably best.

    The other option is to insert one row at a time inside a database transaction, but whether that works for you depends on your exact problem, the quantity of data, and so on.
    • Proposed as answer by Figo Fei Monday, August 24, 2009 9:25 AM
    • Marked as answer by Figo Fei Wednesday, August 26, 2009 10:25 AM
    Friday, August 21, 2009 4:41 AM
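A minimal sketch of the DataSet/adapter approach suggested above. The connection string, the Customers table, and its Name/Age columns are all assumptions for illustration, not details from the thread:

```csharp
using System.Data;
using System.Data.SqlClient;

class DataSetInsertExample
{
    static void Main()
    {
        // Hypothetical connection string -- replace with your own.
        var connectionString = "Server=.;Database=MyDb;Integrated Security=true";

        using (var connection = new SqlConnection(connectionString))
        using (var adapter = new SqlDataAdapter("SELECT Name, Age FROM Customers", connection))
        using (var builder = new SqlCommandBuilder(adapter)) // auto-generates the INSERT command
        {
            var table = new DataTable("Customers");
            adapter.FillSchema(table, SchemaType.Source); // load column definitions only

            // Add one DataRow per element of your input array.
            table.Rows.Add("Alice", 30);
            table.Rows.Add("Bob", 25);

            adapter.Update(table); // issues an INSERT for each added row
        }
    }
}
```

For very large volumes, the same DataTable could instead be handed to SqlBulkCopy.WriteToServer, as the reply suggests.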

All replies

  • Use a foreach loop over your array, executing an INSERT for each element.

    Or you can do it with a DataSet by calling DataAdapter.Update.
    Please mark the post as answer if it is helpful to you, because that encourages members to answer more.
    Friday, August 21, 2009 4:43 AM
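The foreach suggestion above can be sketched as one parameterised INSERT per array element, wrapped in a single transaction so the whole batch commits or rolls back together. The CustomerRow structure and the table/column names are assumptions for illustration:

```csharp
using System.Data.SqlClient;

struct CustomerRow { public string Name; public int Age; }

class RowByRowInsert
{
    static void InsertAll(CustomerRow[] rows, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                foreach (var row in rows)
                {
                    using (var cmd = new SqlCommand(
                        "INSERT INTO Customers (Name, Age) VALUES (@name, @age)",
                        connection, transaction))
                    {
                        // Parameters avoid SQL injection and repeated string building.
                        cmd.Parameters.AddWithValue("@name", row.Name);
                        cmd.Parameters.AddWithValue("@age", row.Age);
                        cmd.ExecuteNonQuery();
                    }
                }
                transaction.Commit(); // all rows succeed, or none are kept
            }
        }
    }
}
```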