Multitable Insert in one batch

  • Question

  • Hi There,

    My code generates a string containing multiple INSERT statements separated by semicolons. Using a single command object, with the generated string as its command text, all of the INSERT statements are executed in one round trip to SQL Server using ADO.NET 2.0. However, constructing a string this way and executing it as command text is not the right approach, and we want to build queries with parameters. Taking that approach would give us about 30 individual SQL commands, and executing all of them in a transaction means many round trips to SQL Server, which would be a big performance hit on our server since this exe is invoked every 5 minutes. Is there a way in ADO.NET to avoid both string concatenation when constructing queries and multiple trips to the database, while still using parameterized queries?

    I see the SqlBulkCopy object, which could be used if I wanted to insert a batch of rows into one table. However, my 30 INSERT statements each insert one row into 30 different tables, so it cannot be used in this case. A DataAdapter can batch updates, inserts, and deletes, but not 30 inserts into different tables in one batch. Please shed some light on the approach I should take to insert into multiple unrelated tables in one trip using parameterized queries.

    Thanks.
    Thursday, March 18, 2010 5:44 AM

Answers

  • > but still can use parameterized queries

    Sure... use a lot of parameters.  For example, append numbers to your parameter names to identify the parameters associated with each INSERT statement:

    INSERT INTO Test (ColA, ColB, ColC) VALUES (@A1, @B1, @C1);
    INSERT INTO Test (ColA, ColB, ColC) VALUES (@A2, @B2, @C2);
    INSERT INTO Test (ColA, ColB, ColC) VALUES (@A3, @B3, @C3);
    

    This avoids concatenating the actual values into the statement, eliminating that significant security concern (SQL injection).  You'll still need to concatenate the parameter placeholders into the string as shown above, but that is not going to be a performance problem.

    There is likely a "break even" point beyond which adding further statements and parameters to the same batch no longer performs better than submitting the statements separately.  I don't know whether that point is in the hundreds or the thousands; divide your rows into reasonably sized batches.
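    The numbered-parameter idea above can be automated rather than hand-written. Here is a minimal sketch of the string-building step, shown in Python purely for illustration (in ADO.NET the same loop would populate SqlCommand.CommandText and SqlCommand.Parameters); the table and column names are made up:

    ```python
    # Sketch: build one batched, parameterized command text covering several
    # tables. Each row gets its own numbered parameters, so values are never
    # concatenated into the SQL text itself.

    def build_batch(inserts):
        """inserts: list of (table, {column: value}) pairs.
        Returns (command_text, params) where params maps @name -> value."""
        statements = []
        params = {}
        for i, (table, row) in enumerate(inserts, start=1):
            cols = list(row)
            names = ["@%s%d" % (c, i) for c in cols]
            statements.append(
                "INSERT INTO %s (%s) VALUES (%s);"
                % (table, ", ".join(cols), ", ".join(names))
            )
            for c, n in zip(cols, names):
                params[n] = row[c]
        return "\n".join(statements), params

    text, params = build_batch([
        ("TableA", {"ColA": 1, "ColB": "x"}),
        ("TableB", {"ColA": 2, "ColB": "y"}),
    ])
    ```

    The whole batch then goes to the server as one command text with one parameter collection, i.e. a single round trip.
    
    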


    Thursday, March 18, 2010 11:46 PM
  • I do not know how many parameters you have, but one potential solution would be to create a stored procedure that accepts all the values as parameters and contains the multiple INSERT statements (or a single combined one) inside. To improve performance you could also use it in combination with the SqlBulkCopy class. Here is a sample of how to call a stored procedure from code using SqlBulkCopy:

    http://www.sqlteam.com/article/use-sqlbulkcopy-to-quickly-load-data-from-your-client-to-sql-server
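    The stored-procedure route reduces the client side to one parameterized call. A minimal sketch of what the call looks like, in Python for illustration (the procedure name dbo.InsertAllReadings and its parameter count are hypothetical; the actual execution, shown commented out, needs a live SQL Server connection):

    ```python
    # Sketch: one stored procedure performs all the INSERTs server-side,
    # so the client makes a single parameterized call. The procedure name
    # and parameter layout are hypothetical; adapt them to your schema.

    def build_proc_call(proc_name, param_count):
        """Build an ODBC CALL statement with one '?' placeholder per parameter."""
        placeholders = ", ".join("?" for _ in range(param_count))
        return "{CALL %s (%s)}" % (proc_name, placeholders)

    sql = build_proc_call("dbo.InsertAllReadings", 3)
    # With a live connection (e.g. via pyodbc) this would be executed as:
    #   cursor.execute(sql, (val1, val2, val3))
    #   connection.commit()
    # In ADO.NET the equivalent is a SqlCommand with
    # CommandType = CommandType.StoredProcedure and one SqlParameter per value.
    ```
    
    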


    Val Mazur (MVP) http://www.xporttools.net
    Friday, March 19, 2010 10:10 AM
    Moderator

All replies

  • Thank you both for your replies. I will analyze the new approaches for my situation. 
    Sunday, March 21, 2010 8:09 AM