Copy activity: Error converting data type VARCHAR to DECIMAL

    Question

  • Hi,

    I am getting the below error in the copy activity while moving data from an on-premises SQL Server to Azure SQL DW.

    FYI, I have enabled PolyBase for the copy.

    We only have one column with the NUMERIC(22,0) data type in the table; all other columns are nvarchar(4).

    Has anyone experienced a similar error? Thanks.

    {
        "errorCode": "2200",
        "message": "ErrorCode=FailedDbOperation,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Error happened when loading data into SQL Data Warehouse.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Data.SqlClient.SqlException,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: Error converting data type VARCHAR to DECIMAL.,Source=.Net SqlClient Data Provider,SqlErrorNumber=107090,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=107090,State=1,Message=HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: Error converting data type VARCHAR to DECIMAL.,},],'",
        "failureType": "UserError",
        "target": "Copy_xzd"
    }



    Sunday, May 19, 2019 7:48 PM

Answers

  • Hi Clokeshreddy,

    The right way to do this is to unselect the "Use type default" option (set it to false) in the copy activity sink -> PolyBase settings. "USE_TYPE_DEFAULT" is a native PolyBase configuration that specifies how to handle missing values in delimited text files when PolyBase retrieves data from the text file. For more info, please refer to this doc.
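    For reference, a minimal sketch of the sink side of the copy activity JSON with this setting (the reject settings shown are illustrative values, not from this thread):

    ```json
    "sink": {
        "type": "SqlDWSink",
        "allowPolyBase": true,
        "polyBaseSettings": {
            "rejectType": "value",
            "rejectValue": 0,
            "useTypeDefault": false
        }
    }
    ```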

    To handle null values, the copy activity also provides sqlWriterStoredProcedure in the SQL database sink; in the stored procedure you can define how the data in the TSV file is applied to your target table. For example, you could replace NULL values with 0 in the stored procedure, depending on your business logic.
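    As a sketch of that approach, assuming a hypothetical target table dbo.MyTarget with a NUMERIC(22,0) column (the copy activity would reference these objects via the sqlWriterStoredProcedureName and sqlWriterTableType sink properties):

    ```sql
    -- Hypothetical table type matching the copied rows;
    -- the numeric column is still text at this stage.
    CREATE TYPE dbo.MyTargetType AS TABLE
    (
        [Id]     NVARCHAR(4),
        [Amount] NVARCHAR(30)
    );
    GO
    CREATE PROCEDURE dbo.spInsertMyTarget
        @MyTargetRows dbo.MyTargetType READONLY
    AS
    BEGIN
        INSERT INTO dbo.MyTarget ([Id], [Amount])
        SELECT [Id],
               -- Treat empty strings and NULLs as 0 before the cast.
               CAST(ISNULL(NULLIF([Amount], ''), '0') AS NUMERIC(22,0))
        FROM @MyTargetRows;
    END
    ```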

    Also, you might want to go for a staged copy, since SQL Server is not a directly supported source for PolyBase. To read more about it, please refer to this doc.
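    A staged copy is enabled on the copy activity itself; a minimal sketch (the linked service name and path below are placeholders):

    ```json
    "enableStaging": true,
    "stagingSettings": {
        "linkedServiceName": {
            "referenceName": "MyStagingBlobStorage",
            "type": "LinkedServiceReference"
        },
        "path": "stagingcontainer/path"
    }
    ```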

    Hope this helps.

    Friday, May 24, 2019 9:41 AM
    Owner

All replies

  • Hi There,

    Have you tried setting "useTypeDefault" to false in the "polybaseSettings"? Please try this out and let us know if it works.

    Also, please check the schema mapping in the mappings tab in the copy activity to make sure there's no incompatibility.

    Monday, May 20, 2019 9:09 AM
    Owner
  • Hi Chirag,

    I have already set "useTypeDefault" to false in the PolyBase settings, but I still see the failure.

    Also, I am not doing schema mapping for the tables, because I am loading all the tables from the source database to Azure SQL DW in one pipeline.

    I am using the below template for this pipeline:

    https://docs.microsoft.com/en-us/azure/data-factory/solution-template-bulk-copy-with-control-table

    Please let me know if you need any more info from my end, thanks.

    Regards

    CLR

    Tuesday, May 21, 2019 12:48 AM
  • Hi clokeshreddy,

    Can you please try unchecking "Allow Polybase" in the "Sink" tab of the copy activity properties? There are some restrictions in PolyBase which might be causing this issue.

    Wednesday, May 22, 2019 7:13 AM
    Owner
  • Hi Chirag,

    When I try that with "Allow Polybase", this table loads fine with no issues, but the other tables (3-4 tables) are failing.

    It's mainly because of null values in the numeric column: once the data is stored in blob storage and the nullable column is then converted to decimal, the exception is thrown.

    For now I have changed the source query for those columns to replace nulls with zero, but this is just a temporary solution, as I want to create one pipeline to get all the tables from the source without depending on the schema of a database.
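    That workaround might look like the following (table and column names are hypothetical):

    ```sql
    -- Replace NULLs in the numeric column at the source, so the staged
    -- text file never contains an empty field for the NUMERIC(22,0) column.
    SELECT [Col1],
           [Col2],
           ISNULL([AmountCol], 0) AS [AmountCol]
    FROM dbo.SourceTable;
    ```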

    Do you think there is any other way to implement this?

    Thursday, May 23, 2019 10:02 AM
  • Hi Clokeshreddy,

    The right way to do this is to unselect the "Use type default" option (set it to false) in the copy activity sink -> PolyBase settings. "USE_TYPE_DEFAULT" is a native PolyBase configuration that specifies how to handle missing values in delimited text files when PolyBase retrieves data from the text file. For more info, please refer to this doc.

    To handle null values, the copy activity also provides sqlWriterStoredProcedure in the SQL database sink; in the stored procedure you can define how the data in the TSV file is applied to your target table. For example, you could replace NULL values with 0 in the stored procedure, depending on your business logic.

    Also, you might want to go for a staged copy, since SQL Server is not a directly supported source for PolyBase. To read more about it, please refer to this doc.

    Hope this helps.

    Friday, May 24, 2019 9:41 AM
    Owner
  • Thanks Chirag this helps.
    Saturday, June 8, 2019 4:53 PM