How to SqlBulkCopy a large CSV file to SQL Server?

  • Question

  • Hi,

    I have a large CSV file; if I read all of the data into one DataTable, it exceeds the maximum number of rows allowed.

    If I instead load the data in batches into multiple DataTables and SqlBulkCopy each DataTable to the same destination table, will all of the rows end up in the destination table?

    Thanks,

    Nic

    Wednesday, February 17, 2016 12:08 AM

Answers

All replies

  • I think I found it. If you have multiple DataTables and call SqlBulkCopy on each one in turn, the data will be inserted into the destination table.
    Wednesday, February 17, 2016 1:09 AM
  • Hi Niel1,

    >>If I save the data in batch in multiple DataTables, and SQLBulkCopy each data table to same dest DB table, will it insert value into the dest table?

    You could create a bulk-insert method that takes a DataTable parameter, and then call that method in a loop, once per DataTable, to insert your data.

    using System;
    using System.Collections.Generic;
    using System.Data;
    using System.Data.SqlClient;

    static void Main(string[] args)
    {
        List<DataTable> listTables = new List<DataTable>();

        DataTable dt = RetrieveYourDataTableMethod();
        listTables.Add(dt);

        DataTable dt2 = RetrieveYourDataTableMethod2();
        listTables.Add(dt2);

        foreach (DataTable fdt in listTables)
        {
            InsertDataIntoSQLServerUsingSQLBulkCopy(fdt);
        }
    }

    static void InsertDataIntoSQLServerUsingSQLBulkCopy(DataTable csvFileData)
    {
        string cs = @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=Test;Integrated Security=True";
        using (SqlConnection dbConnection = new SqlConnection(cs))
        {
            dbConnection.Open();
            using (SqlBulkCopy s = new SqlBulkCopy(dbConnection))
            {
                s.DestinationTableName = "BulkInsertTable";
                // Map each source column to the destination column of the same name.
                foreach (DataColumn column in csvFileData.Columns)
                    s.ColumnMappings.Add(column.ColumnName, column.ColumnName);
                s.WriteToServer(csvFileData);
            }
        }
    }
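    Since the original problem is a CSV too large for a single DataTable, a common variation is to stream the file and flush a batch of rows every time the table reaches a fixed size, so memory stays bounded. Below is a minimal sketch of that approach; the file path, destination table name, batch size of 50,000, and all-string columns are assumptions for illustration, and the naive `Split(',')` does not handle quoted fields:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    class BatchedCsvLoader
    {
        const int BatchSize = 50000; // assumed; tune for your data and memory

        static void Main()
        {
            string cs = @"Data Source=(localdb)\MSSQLLocalDB;Initial Catalog=Test;Integrated Security=True";
            using (StreamReader reader = new StreamReader(@"C:\data\large.csv")) // assumed path
            {
                // First line holds the column names.
                string[] headers = reader.ReadLine().Split(',');
                DataTable batch = NewBatchTable(headers);

                string line;
                while ((line = reader.ReadLine()) != null)
                {
                    batch.Rows.Add(line.Split(',')); // naive split; no quoted-field handling
                    if (batch.Rows.Count >= BatchSize)
                    {
                        BulkCopy(cs, batch);
                        batch.Clear(); // reuse the table so it never exceeds BatchSize rows
                    }
                }
                if (batch.Rows.Count > 0) // flush the final partial batch
                    BulkCopy(cs, batch);
            }
        }

        static DataTable NewBatchTable(string[] headers)
        {
            DataTable dt = new DataTable();
            foreach (string h in headers)
                dt.Columns.Add(h, typeof(string)); // all strings for simplicity
            return dt;
        }

        static void BulkCopy(string cs, DataTable batch)
        {
            using (SqlConnection conn = new SqlConnection(cs))
            using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
            {
                conn.Open();
                bulk.DestinationTableName = "BulkInsertTable"; // assumed table name
                foreach (DataColumn c in batch.Columns)
                    bulk.ColumnMappings.Add(c.ColumnName, c.ColumnName);
                bulk.WriteToServer(batch);
            }
        }
    }
    ```

    Clearing and reusing one DataTable avoids allocating a new table per batch; each SqlBulkCopy call commits its batch independently, which is exactly the "call it one by one" behavior confirmed above.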

    Best regards,

    Cole Wu
    Wednesday, February 17, 2016 6:18 AM
    Moderator