Copy Activity on multiple files in folder fails

  • Question

  • Hi,

    I'm experiencing an issue where I receive the error below when running a Copy Activity from a Data Lake folder containing multiple files to an Azure SQL DB. In past experience I have been able to resolve this error by running the files individually to isolate which file was causing it and adjusting my Azure SQL DB column accordingly. In this case, however, I can run each individual file without the error; I only receive it when I run the Copy Activity at the folder level.

    Has anyone experienced anything similar before? I'm at a loss on how to proceed at the moment, as I need to run the Copy Activity at the folder level.

    "errorCode": "2200",

    "message": "'Type=System.InvalidOperationException,Message=The given value of type String from the data source cannot be converted to type nvarchar of the specified target column.,Source=System.Data,''Type=System.InvalidOperationException,Message=String or binary data would be truncated.,Source=System.Data,'",

    Thank you,

    Mats

    Thursday, February 21, 2019 1:24 PM

Answers

  • I was able to resolve this, although not in a very intuitive way.

    During testing I changed the Fault Tolerance setting on my Copy Activity to "Skip incompatible rows". I still ran the Copy Activity at the folder level, not looping through each file.
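
    In the pipeline JSON, that setting corresponds to the enableSkipIncompatibleRow property on the Copy Activity; the redirect settings, which log the skipped rows to storage, are optional (the linked service name and path below are just placeholders):

        "typeProperties": {
            "source": { "type": "AzureDataLakeStoreSource", "recursive": true },
            "sink": { "type": "SqlSink" },
            "enableSkipIncompatibleRow": true,
            "redirectIncompatibleRowSettings": {
                "linkedServiceName": {
                    "referenceName": "MyStagingBlobStorage",
                    "type": "LinkedServiceReference"
                },
                "path": "copyactivitylogs/skippedrows"
            }
        }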

    This resulted in an error and failure (see below) on the second file, after data from the first two files had been copied to my Azure SQL database. This error told me that something was going on with my BIGINT (int64) columns, not in Azure SQL but rather in the source files. I found blank values in these columns, and Data Factory must have been unable to handle them when it inferred the int64 data type.

    "errorCode": "2200", "message": "'Type=System.FormatException,Message=Failed to convert parameter value from a String to a Int64.,Source=System.Data,''Type=System.FormatException,Message=Input string was not in a correct format.,Source=mscorlib,'"

    My solution, unfortunately, was to set a custom schema for the source and change the int64 columns to String to account for the blank values. 
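
    In dataset JSON terms, that means declaring the source dataset's structure explicitly instead of letting the types be inferred. A sketch of what I mean, with placeholder column names (the columns that were int64 are declared as String so the blank values pass through):

        "structure": [
            { "name": "OrderId", "type": "String" },
            { "name": "Quantity", "type": "String" },
            { "name": "Description", "type": "String" }
        ]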

    I'm still not sure how the initial error relates to the second error and the solution, but it's now working!

    • Marked as answer by Mats Elmsater Friday, February 22, 2019 12:08 AM
    Thursday, February 21, 2019 9:02 PM

All replies

  • Hi Mats,

    Thank you for sharing your solution and experience here. It will be greatly beneficial for other community members who have similar questions.

    Friday, February 22, 2019 4:28 AM