SqlBulkCopy - Recommended BatchSizes?

    Question

  • Hello,

    Is there any rule of thumb for deciding what to set the BatchSize when using SqlBulkCopy?

    When inserting around 20,000 rows?

    When inserting around 100,000? What about 500,000?

    Is there any guide that could point me towards a rule of thumb? Or is the only way to test with my specific data and with different BatchSizes? I have googled but could not come across any guide like this.

    Thanks,

    Drew

    Monday, November 20, 2006 10:24 PM

Answers

  • A lot of factors can affect your scenario. Basically, increasing the batch size can improve performance, but constructing the batch on the client and handling the results on the server also have some overhead. So you don't want to set it to 0, which sends all the rows as a single batch.

    Usually a batch size between 100 and 1,000 rows gives good performance, but you really have to do your own testing if you want the optimal value for your scenario.
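    For reference, BatchSize is just a property on SqlBulkCopy. A minimal sketch of the kind of setup being discussed (the connection string parameter and the destination table name here are placeholders, not anything from this thread):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    static void BulkInsert(DataTable table, string connectionString)
    {
        using (var bulk = new SqlBulkCopy(connectionString))
        {
            bulk.DestinationTableName = "dbo.TargetTable"; // placeholder name
            bulk.BatchSize = 500;        // send rows to the server every 500 rows
            bulk.BulkCopyTimeout = 60;   // seconds; the default is 30
            bulk.WriteToServer(table);
        }
    }
    ```

    Leaving BatchSize at its default of 0 means the entire WriteToServer call is treated as one batch, which is exactly the case the answer above recommends against for large row counts.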

    Tuesday, November 21, 2006 4:02 AM

All replies

  • I guess the batch size also depends on the amount of data you are transferring in each batch. If your rows contain images or, say, big BLOBs or XML files, then how do you decide what the optimal batch size will be?

    What is the optimal amount of data (in terms of size: 10 KB, 10 MB, 1 GB) a batch should process?
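    One rough way to handle the wide-row case is to pick the row count from a target payload size per batch instead of a fixed number of rows. A hedged sketch of that heuristic (the 10 MB budget and the clamp bounds are illustrative assumptions, not documented recommendations; only your own testing can confirm them):

    ```csharp
    using System;

    // Heuristic only: derive a row-count batch size from an average row size,
    // aiming for roughly targetBatchBytes of data per batch.
    static int EstimateBatchSize(long avgRowBytes,
                                 long targetBatchBytes = 10L * 1024 * 1024)
    {
        if (avgRowBytes <= 0)
            return 1000;                                // fall back to a plain row count

        long rows = targetBatchBytes / avgRowBytes;     // rows that fit the byte budget
        return (int)Math.Max(1, Math.Min(rows, 10000)); // clamp to a sane range
    }

    // For example, rows averaging 1 MB each (one large blob per row)
    // would give batches of about 10 rows under a 10 MB budget.
    ```

    The point of the clamp is that the same byte budget that suggests 1,000+ rows for narrow rows would suggest only a handful of rows when each row carries a multi-megabyte BLOB, which matches the intuition in the question above.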
    Tuesday, September 29, 2009 9:44 AM