Data Factory V2 Copy Activity erroring: "SQL Bulk Copy failed due to received an invalid column length from the bcp client"

  • Question

  • I'm certain that there are no data type issues in the data, but the copy fails with:

    "errorCode": "2200", "message": "ErrorCode=UserErrorSqlBulkCopyInvalidColumnLength,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=SQL Bulk Copy failed due to received an invalid column length from the bcp client.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=The service has encountered an error processing your request. Please try again. Error code 4815.\r\nA severe error occurred on the current command. The results, if any, should be discarded.,Source=.Net SqlClient Data Provider,SqlErrorNumber=40197,Class=20,ErrorCode=-2146232060,State=1,Errors=[{Class=20,Number=40197,State=1,Message=The service has encountered an error processing your request. Please try again. Error code 4815.,},{Class=20,Number=0,State=0,Message=A severe error occurred on the current command. The results, if any, should be discarded.,},],'", "failureType": "UserError", "target": "Copy Data1"

    I can't see any issues with invalid column lengths, but the error isn't really giving me enough information to work with.

    I have made sure my columns are in the same order and that both source and destination have 14 columns.

    Is there anything else I can do to figure out why this is erroring?


    Debbie

    Monday, March 4, 2019 2:05 PM
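    One way to get more information (not mentioned in the thread, just a common diagnostic approach) is to first land the data in a staging table whose columns are all nvarchar(max), so the copy succeeds regardless of length, and then measure the longest value per column against the real table's definitions. Table and column names below are hypothetical:

    ```sql
    -- Hypothetical staging table: every column wide enough to accept anything,
    -- so the Copy Activity can load it without length errors.
    CREATE TABLE dbo.StagingTable (
        Column1 nvarchar(max) NULL,
        Column2 nvarchar(max) NULL
        -- ...repeat for all 14 columns
    );

    -- After loading, find the longest value in each column and compare it
    -- with the destination table's declared column sizes.
    SELECT
        MAX(LEN(Column1)) AS MaxLen_Column1,
        MAX(LEN(Column2)) AS MaxLen_Column2
        -- ...repeat for all 14 columns
    FROM dbo.StagingTable;
    ```

    Any column whose maximum length exceeds the destination column's declared size is the one triggering the bulk copy error.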

Answers

  • You are getting the error because the length of the data coming into a destination column is longer than the column size defined. I would suggest the following; you can read more here on the data types.

    1. Define the data type as varchar(a), where a is a ballpark figure for the length. I think your pipeline is failing for varchar(255) as of now, but it may work for varchar(500); you will have to take a call on that. In the worst case you can use varchar(max).

    2. If you plan to use nvarchar, define the column as nvarchar(a), where a is a ballpark figure for the length. If you wish, you can also use nvarchar(max).


    Thanks Himanshu

    Thursday, March 21, 2019 9:02 PM
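    A minimal sketch of the suggestion above, with hypothetical table and column names (pick a size based on your own ballpark figure):

    ```sql
    -- Widen the destination column so longer incoming values fit.
    ALTER TABLE dbo.DestinationTable
        ALTER COLUMN Description varchar(500) NULL;

    -- Worst case, remove the practical limit entirely:
    ALTER TABLE dbo.DestinationTable
        ALTER COLUMN Description varchar(max) NULL;

    -- Or, if Unicode data is expected, use nvarchar instead:
    ALTER TABLE dbo.DestinationTable
        ALTER COLUMN Description nvarchar(500) NULL;
    ```

    Note that nvarchar stores Unicode (2 bytes per character), so nvarchar(500) holds up to 500 characters but uses roughly twice the storage of varchar(500).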

All replies

  • Found out some more about this. 

    It's because I'm using VARCHAR(255). I think I need to change to NVARCHAR. However, if I change the Azure SQL Database column to NVARCHAR I don't get the above error, but Data Factory hangs. Has anyone else seen this issue? It won't let me import to VARCHAR, but NVARCHAR just hangs and doesn't run.


    Debbie


    Monday, March 4, 2019 3:13 PM