How can I use SqlBulkCopy on tables with uniqueidentifier columns

  • Question

  • I don't know if this is the right place to ask this question, sorry if not.

     

    I want to upsize tables at customer sites from VFP tables to SQL Server tables. I create the database, tables etc. and start to upload the data using ADO or ODBC in between, but it is very slow. So instead I wanted to use SqlBulkCopy. It works much better and faster, but nothing is perfect: it has a problem which I think is a bug.

     

    Problem:

    Code Snippet

       // dest is an open SqlConnection to the destination database
       OleDbCommand sourceCommand = new OleDbCommand(@"select * from sourceTable", source);

        using (OleDbDataReader dr = sourceCommand.ExecuteReader())
        {
            using (SqlBulkCopy s = new SqlBulkCopy(dest))
            {
                s.BatchSize = 500;
                s.BulkCopyTimeout = 300;
                s.DestinationTableName = "myTable";
                s.NotifyAfter = 10000;
                s.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);
                s.WriteToServer(dr);
            } // Dispose closes the SqlBulkCopy; no explicit Close needed
        }

     

    With this code, if the source table has a GUID column (in any format a GUID can be expressed in), it fails with an invalid cast exception. The exception is misleading, however. The value is castable to a uniqueidentifier; it's just that the class uses Convert.ChangeType, which can't make that conversion. This is a bug IMHO.
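
    To illustrate what I think happens under the hood (the literal GUID string below is just a sample value standing in for whatever the reader returns):

    Code Snippet

    using System;

    class CastRepro
    {
        static void Main()
        {
            // A GUID coming out of the reader, boxed as object (sample value)
            object value = "0f8fad5b-d9cb-469f-a165-70867728950e";

            // A direct conversion works fine: Guid has a string constructor
            Guid ok = new Guid((string)value);
            Console.WriteLine(ok);

            // ...but Convert.ChangeType throws InvalidCastException,
            // because System.Guid does not implement IConvertible
            object fails = Convert.ChangeType(value, typeof(Guid));
        }
    }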

     

    I thought I was clever and, to overcome it, cheated by using a DataTable:

    Code Snippet

    using System;
    using System.Data;
    using System.Data.SqlClient;
    using System.Data.OleDb;

    class sbc
    {
        static void Main(string[] args)
        {
            OleDbConnection source = new OleDbConnection(@"provider=vfpoledb;Data source=c:\temp\guids\guidtest.dbc");
            SqlConnection dest = new SqlConnection(@"server=.\sqlexpress;trusted_connection=yes;database=test;");

            DataTable tbl = new DataTable();

            dest.Open();

            // Read with a false WHERE to grab the schema from SQL Server - like SET FMTONLY ON/OFF
            SqlCommand cmd = new SqlCommand("select * from GuidTest where 1=2", dest);
            using (SqlDataReader sRdr = cmd.ExecuteReader())
            {
                tbl.Load(sRdr);
            }

            // Load from source into the existing schema
            source.Open();
            OleDbCommand sourceCommand = new OleDbCommand(@"select * from " + args[0], source);
            using (OleDbDataReader dr = sourceCommand.ExecuteReader())
            {
                tbl.Load(dr);
            }
            source.Close();

            Console.WriteLine("Beginning Copy ....");

            try
            {
                using (SqlBulkCopy s = new SqlBulkCopy(dest))
                {
                    s.BatchSize = 500;
                    s.BulkCopyTimeout = 300;
                    s.DestinationTableName = "GuidTest";
                    s.NotifyAfter = 10;
                    s.SqlRowsCopied += new SqlRowsCopiedEventHandler(s_SqlRowsCopied);
                    s.WriteToServer(tbl);
                }
                Console.WriteLine("Copy complete");
            }
            catch (Exception e)
            {
                Console.WriteLine("{0} Exception caught", e);
            }
            finally
            {
                dest.Close();
            }
        }

        static void s_SqlRowsCopied(object sender, SqlRowsCopiedEventArgs e)
        {
            Console.WriteLine("-- Copied {0} rows.", e.RowsCopied);
        }
    }

     

    This works and the data gets copied to SQL Server. However, now the problem is that I need to use a DataTable. It is very slow, and I could live with that if it didn't throw an OutOfMemoryException on large files (and from a SQL Server point of view, very small tables are sufficient to cause that exception - e.g. a 300 MB DBF file can trigger it on a machine with 2 GB of RAM).

     

    Is there a solution using .NET 2.0?

     

    PS: Using a linked server this really works fast and nicely, but linked servers are not enabled by default, and customers think that if MS didn't enable it then I shouldn't either, even temporarily. :(

    Thanks in advance

     

     

    Update:

    I solved the issue this way:

    - Check the field types from the schema table.
    - Does the table have a uniqueidentifier column?
    - No: then go get it using a DataReader.
    - Yes: then go with a DataTable, but read in chunks - upload, clear, repeat (a sketch of the idea follows below).
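
    A minimal sketch of that chunked path (it would sit in the same class as the code above, so the usings are already there; it assumes tbl holds just the SQL Server schema from the where 1=2 trick and starts empty, and the table name and 10,000-row chunk size are placeholders):

    Code Snippet

    static void BulkCopyInChunks(OleDbDataReader dr, SqlConnection dest, DataTable tbl)
    {
        const int chunkSize = 10000; // placeholder; tune to available memory

        using (SqlBulkCopy s = new SqlBulkCopy(dest))
        {
            s.DestinationTableName = "GuidTest"; // placeholder table name
            s.BulkCopyTimeout = 300;

            while (true)
            {
                int count = 0;
                // Copy up to chunkSize rows from the reader into the typed table
                while (count < chunkSize && dr.Read())
                {
                    DataRow row = tbl.NewRow();
                    for (int i = 0; i < dr.FieldCount; i++)
                    {
                        object v = dr.GetValue(i);
                        // Coerce string GUIDs explicitly so the Guid-typed column accepts them
                        if (v is string && tbl.Columns[i].DataType == typeof(Guid))
                            row[i] = new Guid((string)v);
                        else
                            row[i] = v;
                    }
                    tbl.Rows.Add(row);
                    count++;
                }

                if (count == 0) break;   // source exhausted

                s.WriteToServer(tbl);    // upload this chunk
                tbl.Clear();             // drop the rows before reading the next chunk
            }
        }
    }

    Because the DataTable's uniqueidentifier column is already typed as Guid, WriteToServer never has to fall back on Convert.ChangeType, and memory stays bounded by the chunk size.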

     

     

    Friday, June 27, 2008 11:37 AM

All replies

  • Any reason not to use SSIS (formerly DTS) to copy your tables directly into SQL Server?

     

    Your DataTable solution should work if you change Load (which pulls all of the table's rows into memory) to an iterative loop in which you upload rows 100 by 100, 1000 by 1000, or the like.

    Just don't load the whole table: using your reader, add rows to your DataTable, bulk load them, then clear the DataTable and repeat.

     

    Bruno

    Sunday, July 13, 2008 3:56 AM
Yes, there is a reason. :) Being lazy and frustrated correcting the code generated by the wizard. Using DTS was easy; with SSIS it is a little obscure. It generates a "create" script for tables that already exist. I was tired of manually finding and fixing the generated code and gave up after fixing half. The second reason is that I need to run it, changing parts, on remote clients who already have the application running in DBF format.

     

    My solution is working. I am doing it as you say, little by little, in case I need to use a DataTable (and honestly I find it awkward that I need to be on my toes for a data size of around 300 MB - that is not much at all).

     

    Do you have a suggestion for creating a simple SSIS task in code, runnable from non-.NET clients (namely VFP)? If you can show me just for a single table, I could handle the rest.
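
    For example, is something along these lines workable (an untested sketch using the Microsoft.SqlServer.Dts.Runtime API against a package designed beforehand; the path is a placeholder)? VFP could then simply shell out to the compiled exe.

    Code Snippet

    using System;
    using Microsoft.SqlServer.Dts.Runtime;

    class RunPackage
    {
        static void Main(string[] args)
        {
            Application app = new Application();
            // args[0]: path to a .dtsx file, e.g. a per-table copy package
            Package pkg = app.LoadPackage(args[0], null);
            DTSExecResult result = pkg.Execute();
            Console.WriteLine("Package result: {0}", result);
        }
    }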

    Tuesday, July 15, 2008 8:50 AM