Data table back to SQL

  • Question

  • Next problem. Now that I can read data from an SQL table into a DataTable for use with a DataGridView, I have a similar, but essentially reverse, problem.

    Scenario: I have an incoming delimited text file that has already been read into a DataTable, call it dt. In effect, I have an entire SQL table in DataTable format. I need to insert the contents of this DataTable, dt, into a target SQL table. Assume that the target SQL table exists; call it SQLdt. Is there a way to do a kind of bulk insert into SQLdt from dt? Alternatively, do I have to read each row of dt separately and do an insert into SQLdt for each one?

    Thanks.
    Monday, November 30, 2009 6:18 PM

Answers

  • To do a real bulk operation in this case you need the SqlBulkCopy class, which allows you to copy rows from a DataTable into SQL Server. SqlDataAdapter does inserts on a one-by-one basis, while SqlBulkCopy uses the bulk API, which works much faster. The only limitation of this class is that it works only with a SQL Server database as the destination.
    Val Mazur (MVP) http://www.xporttools.net
    Tuesday, December 1, 2009 10:56 AM
    Moderator
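    A minimal sketch of the SqlBulkCopy approach described above, assuming the DataTable dt and target table SQLdt from the question (the connection string and method name are placeholders, not code from this thread):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    class BulkCopyExample
    {
        // Copy every row of an in-memory DataTable into an existing SQL Server table.
        static void BulkInsert(DataTable dt, string connStr)
        {
            using (SqlConnection conn = new SqlConnection(connStr))
            {
                conn.Open();
                using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
                {
                    // The destination table must already exist and match dt's column layout.
                    bulk.DestinationTableName = "dbo.SQLdt";
                    bulk.WriteToServer(dt);  // one bulk operation, not row-by-row inserts
                }
            }
        }
    }
    ```

    This requires a reachable SQL Server instance, so treat it as a shape to follow rather than something to paste verbatim.
    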
  • Your columns have to line up logically in the source and destination.

    In this bit of code, which runs before the code above, I query the structure from the destination via the WHERE 1=2 trick and load that schema into my DataTable tbl. I then query my source and load that data into the DataTable tbl that already has the destination's schema. I then ran the code from the previous post. I cut out the exception and logging code.

    dest is my destination connection object and source is my source connection object. Make sure you close them.

     using (SqlCommand cmd = new SqlCommand("select * from " + TableName + " where 1=2", dest))
     {
         using (SqlDataReader sRdr = cmd.ExecuteReader())
         {
             tbl.Load(sRdr);
             // Load from source into existing schema
             source.Open();
             using (OleDbCommand sourceCommand = new OleDbCommand("select * from " + TableName, source))
             {
                 using (OleDbDataReader dr = sourceCommand.ExecuteReader())
                 {
                     log.WriteLine("Retrieving table '" + TableName + "' from version 7.");
                     try
                     {
                         tbl.Load(dr);
                     }
                     catch (Exception ex)
                     {
                         log.WriteLine("Load exception: " + ex.Message);
                     }
                 }
             }  // sourceCommand
         }  // sRdr
     }  // cmd



    Tuesday, December 1, 2009 10:38 PM
  • You are over-complicating this.

    It does not matter where your source is from once you load it into a data table. The data table is disconnected from your source. The only issue here is that your data table column layout (name, type, and size) must be the same as your destination table in SQL Server. For example, if your SQL destination table has 4 columns named jack, bob, sam, and tom, then the data table must have the same columns in the same order and with the same data types. In other words, build your data table structure to match the destination table and then load it. If you just try to load the data as you build it, your structure will match that of your import source, not your destination table.

    Once you have achieved this, all you do is call the SqlBulkCopy.WriteToServer() method and pass it the data table.

    It does not matter where your source comes from once it is loaded into memory.

    You have the right code structure: one method to load your data table and another to write to the server.
    Thursday, December 3, 2009 3:24 PM

All replies

  • Hello composer,

    Glad to see you again!

    No, we don't need to read each data row and execute an insert T-SQL statement one by one.  The SqlDataAdapter can also help us execute the insert operations as a batch.

    For example, suppose we have used the SqlDataAdapter to load the DataTable from the database, as in the following code:

    ===========================================================================================
    string connStr = @"Data Source=(local);Initial Catalog=TestDB;Integrated Security=True";

    string cmd = "SELECT PersonID, FirstName, LastName FROM [Person]";

    SqlDataAdapter da = new SqlDataAdapter(cmd, connStr);

    DataTable table = new DataTable();
    da.Fill(table);

    ===========================================================================================

    Then we can add some new data rows to the data table and then use the SqlDataAdapter again to perform the insert operations.

    ===========================================================================================

    table.Rows.Add(1, "Michael", "Sun");

    table.Rows.Add(2, "Lingzhi", "Sun");

    SqlCommandBuilder builder = new SqlCommandBuilder(da);

    da.Update(table);
    ===========================================================================================

    Here, the SqlCommandBuilder dynamically creates the insert/delete/update commands based on the select command “SELECT PersonID, FirstName, LastName FROM [Person]”, so we just need to call the Update method of the SqlDataAdapter.

    Additionally, we can also set the insert/delete/update commands of the SqlDataAdapter manually, via the InsertCommand, DeleteCommand, and UpdateCommand properties.

     

     

    If you have any questions, please feel free to tell me.

    Have a great day!
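    A sketch of that manual wiring, reusing the [Person] table and the connStr and table variables from the snippets above (the column sizes and parameter names here are assumptions, not values from this thread):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    // Configure the adapter's InsertCommand by hand instead of using SqlCommandBuilder.
    SqlDataAdapter da = new SqlDataAdapter(
        "SELECT PersonID, FirstName, LastName FROM [Person]", connStr);

    SqlCommand insert = new SqlCommand(
        "INSERT INTO [Person] (PersonID, FirstName, LastName) " +
        "VALUES (@PersonID, @FirstName, @LastName)",
        new SqlConnection(connStr));

    // Each parameter is bound to the DataTable column it should read from.
    insert.Parameters.Add("@PersonID", SqlDbType.Int, 0, "PersonID");
    insert.Parameters.Add("@FirstName", SqlDbType.NVarChar, 50, "FirstName");
    insert.Parameters.Add("@LastName", SqlDbType.NVarChar, 50, "LastName");
    da.InsertCommand = insert;

    da.Update(table);  // new rows in 'table' are sent using the command above
    ```
    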

    Best Regards,
    Lingzhi Sun

    MSDN Subscriber Support in Forum

    If you have any feedback on our support, please contact msdnmg@microsoft.com


    Please remember to mark the replies as answers if they help and unmark them if they provide no help.
    Welcome to the All-In-One Code Framework! If you have any feedback, please tell us.
    Tuesday, December 1, 2009 3:39 AM
    Moderator
  • To do a real bulk operation in this case you need the SqlBulkCopy class, which allows you to copy rows from a DataTable into SQL Server. SqlDataAdapter does inserts on a one-by-one basis, while SqlBulkCopy uses the bulk API, which works much faster. The only limitation of this class is that it works only with a SQL Server database as the destination.
    Val Mazur (MVP) http://www.xporttools.net
    Tuesday, December 1, 2009 10:56 AM
    Moderator
  • Thanks to you both. I think the SqlBulkCopy construct sounds like the more appropriate approach. Reading in large ASCII files that need to be parsed and then written to SQL tables ought to use as little code as possible, or so it seems to me, since it's just a mass transfer of data.

    Of course, one could do this in SQL as well, but the object here is to do it through the C#/Visual Studio interface.

    Can you direct me to some code examples using SqlBulkCopy?

    Tuesday, December 1, 2009 3:51 PM
  • Here is a snippet that loads the data in DataTable tbl into a SQL table via SqlBulkCopy. The log object is a writer declared elsewhere that logs errors. tbl was loaded via a DataReader object prior to this code.

    The target table must exist and the tbl object must have the same structure. The variable TableName holds the target table name.

     try
     {
         using (SqlBulkCopy s = new SqlBulkCopy(ConnStr + ";Initial Catalog=pcs", SqlBulkCopyOptions.KeepIdentity))
         {
             s.BatchSize = 500;
             s.BulkCopyTimeout = 300;
             s.DestinationTableName = "mydb.dbo." + TableName;
             s.NotifyAfter = 100;
             s.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);

             s.WriteToServer(tbl);
         }
     } // try
     catch (Exception e)
     {
         log.WriteLine("BulkCopy Exception encountered: " + e.Message);
     }


    Tuesday, December 1, 2009 4:23 PM
  • Ok - now I've constructed a method, below, that takes three parameters: the connection string, the specific db..tablename target (i.e. myDB..myTable), and a DataTable object.

    The datatable loads the external ascii data correctly - I know this since, for test purposes, I've interposed a datagridview before the sqlbulkcopy functionality, and the data together with its columns is present as I would expect it. Further, my target SQL table has columns whose names match, exactly, the column headers in the datatable, and whose definitions match the expected data content.

    HOWEVER, when this particular piece of code executes, I get the following error almost immediately:
    "The given ColumnMapping does not match up with any column in the source or destination."

    Clearly (or maybe not), I've missed something.

    Any thoughts?

    Thanks! :) :(



    public bool SQL_SUR_FileLoader(string myCONN, string myTABLE, DataTable myDT)
    {
        bool good2go = true;

        using (SqlConnection sqlCon = new SqlConnection(myCONN))
        {
            try
            {
                sqlCon.Open();
                using (SqlBulkCopy bulkcopy = new SqlBulkCopy(sqlCon))
                {
                    bulkcopy.DestinationTableName = myTABLE;
                    try
                    {
                        bulkcopy.WriteToServer(myDT);
                    }
                    catch (Exception ex)
                    {
                        MessageBox.Show(ex.Message, "SUR Export to SQL for " + SelectedSTATE + " error");
                        good2go = false;
                    }
                }
            }
            catch (Exception ey)
            {
                MessageBox.Show(ey.Message);
                good2go = false;
            }

            return good2go;
        }
    }
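    One thing worth checking when that ColumnMapping error appears: by default SqlBulkCopy maps columns by ordinal position, not by name. A small sketch of mapping each column explicitly by name, placed just before the WriteToServer call in the method above:

    ```csharp
    // Map each source column to the destination column with the same name,
    // so the column order in the DataTable no longer has to match the table.
    foreach (DataColumn col in myDT.Columns)
    {
        bulkcopy.ColumnMappings.Add(col.ColumnName, col.ColumnName);
    }
    ```
    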

    Tuesday, December 1, 2009 10:18 PM
  • Your columns have to line up logically in the source and destination.

    In this bit of code, which runs before the code above, I query the structure from the destination via the WHERE 1=2 trick and load that schema into my DataTable tbl. I then query my source and load that data into the DataTable tbl that already has the destination's schema. I then ran the code from the previous post. I cut out the exception and logging code.

    dest is my destination connection object and source is my source connection object. Make sure you close them.

     using (SqlCommand cmd = new SqlCommand("select * from " + TableName + " where 1=2", dest))
     {
         using (SqlDataReader sRdr = cmd.ExecuteReader())
         {
             tbl.Load(sRdr);
             // Load from source into existing schema
             source.Open();
             using (OleDbCommand sourceCommand = new OleDbCommand("select * from " + TableName, source))
             {
                 using (OleDbDataReader dr = sourceCommand.ExecuteReader())
                 {
                     log.WriteLine("Retrieving table '" + TableName + "' from version 7.");
                     try
                     {
                         tbl.Load(dr);
                     }
                     catch (Exception ex)
                     {
                         log.WriteLine("Load exception: " + ex.Message);
                     }
                 }
             }  // sourceCommand
         }  // sRdr
     }  // cmd



    Tuesday, December 1, 2009 10:38 PM
  • Little by little... Your example above uses another SQL table as the source for the DataTable. In my case, the source is an external ASCII, tab-delimited file whose first row contains the column names - which, in any case, are the same as the target columns in the target SQL table.

    The code below is what I am using to load the DataTable, and you may recognize it. It works, and my DataTable is properly loaded.

      // TO CREATE A DATATABLE OBJECT FROM AN INCOMING DELIMITED ASCII FILE

        public DataTable BuildDataTable(string file, string delimiter)
        {
            DataTable dta = new DataTable();

            try
            {
                // first make sure the file exists
                if (File.Exists(file))
                {
                    // create a StreamReader and open our text file
                    using (StreamReader reader = new StreamReader(file))
                    {
                        // read the first line in and split it into columns
                        string[] columns = reader.ReadLine().Split(delimiter.ToCharArray());
                        // now add our columns, skipping any name that already exists
                        foreach (string col in columns)
                        {
                            string columnName = col.Trim();
                            if (!dta.Columns.Contains(columnName))
                            {
                                dta.Columns.Add(columnName, typeof(string));
                            }
                        }

                        // now read the rest of the text file and split it into rows;
                        // splitting on every newline form keeps a stray \n from being
                        // glued onto the first field of each row
                        string data = reader.ReadToEnd();
                        string[] rows = data.Split(new[] { "\r\n", "\r", "\n" },
                                                   StringSplitOptions.RemoveEmptyEntries);
                        // now add the rows to our DataTable
                        foreach (string r in rows)
                        {
                            // split the row at the delimiter
                            string[] items = r.Split(delimiter.ToCharArray());
                            dta.Rows.Add(items);
                        }
                    }
                }
                else
                {
                    throw new FileNotFoundException("The file " + file + " could not be found");
                }
            }
            catch (FileNotFoundException ex)
            {
                string _message = ex.Message;
                return null;
            }

            // now return the DataTable
            return dta;
        }


    What am I missing here?
    Wednesday, December 2, 2009 4:15 PM
  • Once you load your source data into a data table (in memory) with the proper schema, the bulk loader doesn't care where the source came from. Load your source into a data table minus the header row and call the bulk loader with the data table.
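    A sketch of that suggestion, combining the WHERE 1=2 schema trick from earlier in the thread with a plain text-file read (the method name and parameters are illustrative; dest is assumed to be an open SqlConnection to the destination database):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;
    using System.IO;

    // Build the DataTable with the destination's schema, then pour the file into it.
    static DataTable LoadFileIntoDestinationSchema(
        string file, string delimiter, string tableName, SqlConnection dest)
    {
        DataTable tbl = new DataTable();

        // 1. Borrow the destination's column names and types (returns no rows).
        using (SqlCommand cmd = new SqlCommand(
            "select * from " + tableName + " where 1=2", dest))
        using (SqlDataReader rdr = cmd.ExecuteReader())
        {
            tbl.Load(rdr);
        }

        // 2. Skip the header line and load only the data rows.
        using (StreamReader reader = new StreamReader(file))
        {
            reader.ReadLine();  // header row: the schema already came from the destination
            string line;
            while ((line = reader.ReadLine()) != null)
            {
                if (line.Length > 0)
                    tbl.Rows.Add(line.Split(delimiter.ToCharArray()));
            }
        }
        return tbl;
    }
    ```
    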
    Wednesday, December 2, 2009 8:10 PM
  • Let's start from the top, please. I understand in theory what you've been saying, but I cannot translate your examples to my code - therefore, I'm going in circles accomplishing nothing. Extremely frustrating.

    Please keep in mind that my SOURCE is an external ASCII file. The method/proc public DataTable BuildDataTable(string file, string delimiter) as given in my text is how it's done. Please indicate THERE whatever changes I need to make, using the same syntax as the code.

    Similarly, the method/proc public bool SQL_SUR_FileLoader(string myCONN, string myTABLE, DataTable myDT) as given in my text is how I am trying to move the data from the datatable to the SQL table. If changes need to be made in THIS proc/method, please indicate what needs to be changed using the same syntax as in the code.

    I apologize for being so thick - I am trying to understand how to do this and, so far, I'm having no luck at all getting the datatable data into my SQL table.

    Many thanks.
    Wednesday, December 2, 2009 8:23 PM
  • You are over-complicating this.

    It does not matter where your source is from once you load it into a data table. The data table is disconnected from your source. The only issue here is that your data table column layout (name, type, and size) must be the same as your destination table in SQL Server. For example, if your SQL destination table has 4 columns named jack, bob, sam, and tom, then the data table must have the same columns in the same order and with the same data types. In other words, build your data table structure to match the destination table and then load it. If you just try to load the data as you build it, your structure will match that of your import source, not your destination table.

    Once you have achieved this, all you do is call the SqlBulkCopy.WriteToServer() method and pass it the data table.

    It does not matter where your source comes from once it is loaded into memory.

    You have the right code structure: one method to load your data table and another to write to the server.
    Thursday, December 3, 2009 3:24 PM
  • Please forgive my delay in responding to your last comments. The unfortunate combination of work deadlines and illness pretty much trashed everything on this end. The problem with this thread is that your suggestions, which are all excellent, don't jibe with the code I've put up as the working example. My incoming file is a comma-delimited text file in which every column's data is also enclosed in double quotes. The mixture of methods - yours and mine - led me down a rabbit hole from which I could not emerge in C#. I've had to do this in a stored procedure using the SQL BULK INSERT construct, because the production schedule required me to come up with "the answer". As it stands, this is a compromise: it does not help me learn C#, which, in the long run, is the more important goal.
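    As a side note on the double-quoted fields mentioned above: one naive way to handle them is to trim the quotes after splitting. A sketch (the helper name is made up, and it does not handle delimiters embedded inside quoted fields; a real CSV parser would be safer):

    ```csharp
    using System;

    class QuotedSplit
    {
        // Naive splitter for lines like "John","Smith","42".
        // Limitation: breaks if a quoted field itself contains the delimiter.
        static string[] SplitQuotedLine(string line, char delimiter)
        {
            string[] items = line.Split(delimiter);
            for (int i = 0; i < items.Length; i++)
                items[i] = items[i].Trim().Trim('"');
            return items;
        }

        static void Main()
        {
            string[] r = SplitQuotedLine("\"John\",\"Smith\",\"42\"", ',');
            Console.WriteLine(string.Join("|", r));  // John|Smith|42
        }
    }
    ```
    
    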

    I will need to take a closer look at this one and try to come up with a viable C# solution.

    I do want to thank you, however, for your patience with me in attempting to find a solution to this problem. There are other issues that I need to bring up in the forum which, I hope, will be a lot less problematical for me.

    Thanks.
    Monday, December 14, 2009 3:37 PM
  • Hello composer,

    How is the problem now?  Do you need any further assistance? 


    Have a nice day!

    Best Regards,
    Lingzhi Sun

    MSDN Subscriber Support in Forum

    If you have any feedback on our support, please contact msdnmg@microsoft.com


    Please remember to mark the replies as answers if they help and unmark them if they provide no help.
    Welcome to the All-In-One Code Framework! If you have any feedback, please tell us.
    Monday, December 21, 2009 5:18 AM
    Moderator