PowerShell CSV Import to MS SQL Server - Exception calling "WriteToServer": "The given value of type decimal ... cannot be converted ... decimal of the specified target column."

  • Question

  • Hello everybody,

    I am trying to copy a flat CSV file into a table in MS SQL Server (2014, 64-bit). The problem is that I get an exception calling "WriteToServer" with "1" argument(s), together with a rather curious error message: the given value of type decimal cannot be converted to decimal.

    I tried the following, which works:

    => Create the table and define all columns as VARCHAR.

    Unfortunately, I cannot SUM over strings...

    Initially I got a similar error message indicating that strings cannot be converted to decimal, because the numbers in the CSV file used a comma as the decimal separator. So I converted the commas to dots (via the sed command). But now I get the message that decimal cannot be converted to decimal - which makes no sense in my opinion.
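    The normalization described above has to leave the ";" field delimiter alone and only touch the decimal commas inside each field. A minimal sketch of that step (in Python rather than sed, with made-up sample values, not the poster's actual pipeline):

```python
def normalize_line(line: str, delimiter: str = ";") -> str:
    """Replace the decimal comma with a dot in each field,
    leaving the field delimiter itself untouched."""
    return delimiter.join(field.replace(",", ".")
                          for field in line.split(delimiter))

print(normalize_line("1,5;2,75;text"))  # -> 1.5;2.75;text
```

    Because the delimiter here is ";", a global comma-to-dot replacement (as sed does) happens to give the same result; the per-field version is only needed if the delimiter itself could contain a comma.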

    I also tried modifying the table definition to increase the number of decimal places, but that had no impact.

    Do you know what the problem might be? The PowerShell code is below.
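    For what it's worth, a common cause of this particular SqlBulkCopy error (an assumption here, since the table definition is not shown in the thread) is a value that overflows the target column's DECIMAL(precision, scale): the source value really is a decimal, it just has more integer digits than the target type can hold. A rough, hypothetical fit-check illustrating that rule:

```python
from decimal import Decimal

def fits_decimal(value: str, precision: int, scale: int) -> bool:
    """Return True if value fits SQL Server DECIMAL(precision, scale),
    i.e. after rounding to `scale` fractional digits it has at most
    `precision` significant digits in total."""
    quantized = Decimal(value).quantize(Decimal(1).scaleb(-scale))
    return len(quantized.as_tuple().digits) <= precision

print(fits_decimal("123456.789", 9, 3))  # -> True  (fits DECIMAL(9,3))
print(fits_decimal("1234567.89", 9, 3))  # -> False (one integer digit too many)
```

    Note that at a fixed precision, increasing the number of decimal places (the scale) actually *reduces* the room for integer digits, which may be why that attempt had no impact.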

    Many thanks!

    Kind regards,

    Tester

    #################################################### 
    # 
    # PowerShell CSV to SQL Import Script 
    # 
    #################################################### 
     
    # Database and CSV variables
    $csvfile = "c:\testfile.csv"
    $csvdelimiter = ";"
    $sqlserver = "SERVER\MASTER"
    $database = "TEST_MASTER"
    $table = "TEST_MASTER.TEST.Testtable"
    $firstrowcolumnnames = $true
    
      
    #################################################### 
    # 
    # No additional changes are required below unless 
    # you want to modify your sqlbulkcopy options or 
    # your SQL authentication details 
    # 
    #################################################### 
    $i = 0
     
    Write-Output "Script started..." 
    $elapsed = [System.Diagnostics.Stopwatch]::StartNew() 
     
    # 100k worked fastest and kept memory usage to a minimum 
    $batchsize = 100000 
     
    # Build the SqlBulkCopy connection, and set the timeout to infinite 
    $connectionstring = "Data Source=$sqlserver;Integrated Security=true;Initial Catalog=$database;" 
    $bulkcopy = New-Object System.Data.SqlClient.SqlBulkCopy($connectionstring) 
    $bulkcopy.DestinationTableName = $table 
    $bulkcopy.BulkCopyTimeout = 0 
    $bulkcopy.BatchSize = $batchsize 
    $bulkcopy.EnableStreaming = $true 
      
    # Create the datatable, and autogenerate the columns. 
    $datatable = New-Object "System.Data.DataTable" 
     
    # Open the text file from disk 
    # $reader = gc -TotalCount 1000 ($csvfile) 
    $reader = new-object System.IO.StreamReader($csvfile) 
    $line = $reader.ReadLine() 
    $columns =  $line.Split($csvdelimiter) 
     
        if ($firstrowcolumnnames -eq $false) { 
            foreach ($column in $columns) { 
                $null = $datatable.Columns.Add() 
                } 
            # start reader over 
            $reader.DiscardBufferedData();  
            $reader.BaseStream.Position = 0; 
            } 
        else { 
            foreach ($column in $columns) { 
                $null = $datatable.Columns.Add($column) 
            } 
        } 
     
      # Read in the data, line by line. The header line was already consumed
      # above (or the stream was reset to the start for a headerless file),
      # so do not call ReadLine() again here - that would skip a data row.
        while (($line = $reader.ReadLine()) -ne $null)  { 
            $row = $datatable.NewRow() 
            $row.ItemArray = $line.Split($csvdelimiter)
            $datatable.Rows.Add($row)
            # Once you reach your batch size, write to the db,  
            # then clear the datatable from memory 
            $i++; if (($i % $batchsize) -eq 0) { 
            $bulkcopy.WriteToServer($datatable) 
            Write-Output "$i rows have been inserted in $($elapsed.Elapsed.ToString())."; 
            $datatable.Clear() 
            } 
        }  
     
    # Close the CSV file 
    $reader.Close() 
     
        # Add in all the remaining rows since the last clear 
        if($datatable.Rows.Count -gt 0) { 
            $bulkcopy.WriteToServer($datatable) 
            $datatable.Clear() 
        } 
     
    # Sometimes the Garbage Collector takes too long. 
    [System.GC]::Collect()     
     
    Write-Output "Script complete. $i rows have been inserted into the database." 
    Write-Output "Total Elapsed Time: $($elapsed.Elapsed.ToString())"

    Tuesday, August 28, 2018 5:13 PM

All replies

  • Hi SQL_Tester,

    The following thread shares two methods for handling decimals in CSV.

    1. Using SQL statements:

     
    CREATE TABLE ImportTable (Value NVARCHAR(1000))   --<-- Import the data as it is
    INSERT INTO ImportTable VALUES
    ('1234,34112'),('12651635,68466'),('1234574,5874')
    
      /* Alter the table to add a column of NUMERIC type,
         then update that column like this: 
       */
    
    ALTER TABLE ImportTable ADD NewColumn NUMERIC(28,8)
    
    UPDATE ImportTable 
    SET NewColumn = CAST(REPLACE(Value, ',', '.') AS NUMERIC(28,8)) 

    2. Change the values in your Excel sheet before you import it.

    https://stackoverflow.com/questions/21245376/import-datatype-decimal-from-csv-to-sql-server

    Best regards,

    Zhanglong


    MSDN Community Support
    Please remember to click "Mark as Answer" the responses that resolved your issue, and to click "Unmark as Answer" if not. This can be beneficial to other community members reading this thread. If you have any compliments or complaints to MSDN Support, feel free to contact MSDNFSF@microsoft.com.

    Wednesday, August 29, 2018 2:23 AM
    Moderator
  • Hi Zhanglong,

    Thanks a lot for your prompt answer.

    Neither is a realistic option:

    1) I already converted the data via sed s^,^.^g, so there is no further need to convert it to dot-separated decimals. A conversion in Excel is not realistic either, as I have a flat table with > 20.000.000 rows.

    2) The "alter table, add column, copy converted content" method costs too much time; I need a faster approach.

    The question remains why I cannot load correctly formatted data into MS SQL Server. Do you perhaps know what the problem is in this context?

    Many thanks!

    Regards,

    Tester

    Wednesday, August 29, 2018 7:12 AM
  • Hi SQL_Tester,

    Could you please share the table structure on your SQL Server and a few records from the CSV that reproduce the issue?

    Best regards,

    Zhanglong



    Thursday, August 30, 2018 7:54 AM
    Moderator