Azure Data Factory Logging

  • Question

  • I am loading data from Azure Blob storage (ORC format) into Azure SQL Data Warehouse. The data load is failing with the error below. Is there any way to get the offending column name into the error message? The table has 50 columns with the same datatype.

    Error: Error converting data type NVARCHAR to BIGINT.,Source=.Net SqlClient Data Provider,SqlErrorNumber=107090,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=107090,State=1,Message=Query aborted

    Tuesday, May 23, 2017 6:32 AM

Answers

  • ORC should be type-proof, so I guess the issue is with your column definition in your output dataset / your SQL DW table.

    I'd assume something is not matching here - it could be the column order, the mapping, or of course the actual datatype. A quick way to double-check the DW table definition is sketched below.


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Friday, May 26, 2017 7:04 AM
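    A minimal sketch of that check (the table name dbo.MyTargetTable is just a placeholder): list the target table's columns in ordinal order and compare them column by column with the ORC schema and the output dataset structure.

        -- Placeholder table name; compare this output column by column
        -- with the ORC file schema and the ADF output dataset.
        SELECT c.column_id, c.name, t.name AS data_type, c.max_length, c.is_nullable
        FROM sys.columns AS c
        JOIN sys.types   AS t ON t.user_type_id = c.user_type_id
        WHERE c.object_id = OBJECT_ID('dbo.MyTargetTable')
        ORDER BY c.column_id;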

All replies

  • Yes, that was the case. The Java datatype String couldn't be cast to numeric data in a BIGINT column in the database. Also, timestamp in Hive is not compatible with datetimeoffset and had to be cast to datetime2 instead (illustrated below). After changing it, the data was loaded successfully. Thanks. Is there any way to cast the data into a specific datatype before loading it?
    Tuesday, May 30, 2017 3:13 PM
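    A rough illustration of that column change, with a hypothetical table and column names - Hive timestamps carry no time-zone offset, so datetime2 is the closer fit than datetimeoffset:

        -- Hypothetical target table: Hive TIMESTAMP values carry no UTC offset,
        -- so datetime2 accepts them where datetimeoffset caused conversion trouble.
        CREATE TABLE dbo.StageEvents
        (
            event_id   BIGINT       NOT NULL,
            event_time datetime2(7) NOT NULL   -- was datetimeoffset(7)
        );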
  • Not really - you can either prepare the input files accordingly or load every column as VARCHAR(50) and do the conversion within SQL later on (roughly as sketched below).

    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Wednesday, May 31, 2017 11:36 AM
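    A rough sketch of that approach, with hypothetical object names: land everything in a VARCHAR staging table, then convert into the typed target; TRY_CAST returns NULL instead of failing, so rows that don't convert can be located afterwards.

        -- Hypothetical staging table loaded by the ADF copy: everything as VARCHAR.
        CREATE TABLE dbo.Stage_Accounts
        (
            id                 VARCHAR(50),
            enrollment_term_id VARCHAR(50),
            name               VARCHAR(200)
        );

        -- Convert into the typed target table; TRY_CAST yields NULL instead of
        -- aborting, so bad rows can be found with a WHERE ... IS NULL check.
        INSERT INTO dbo.Accounts (id, enrollment_term_id, name)
        SELECT TRY_CAST(id AS BIGINT),
               TRY_CAST(enrollment_term_id AS BIGINT),
               name
        FROM dbo.Stage_Accounts;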
  • I have this issue with an on-premises SQL Server as the source and Azure SQL Data Warehouse as the destination.

    ERROR:

    I am getting this error while using Azure Data Factory: SQL Error Number '100000'. Error message from database execution : PdwManagedToNativeInteropException ErrorNumber: 46724, MajorCode: 467, MinorCode: 24, Severity: 20, State: 2, Exception of type 'Microsoft.SqlServer.DataWarehouse.Tds.PdwManagedToNativeInteropException' was thrown..

    Monday, January 29, 2018 8:09 AM

  • I am getting the same error when I try to load data from Azure Blob Storage to Azure SQL Data Warehouse.

    I was able to create the EXTERNAL table successfully. The source and column mapping are fine, but when I run SELECT * against the data, it throws the error below.

    Msg 107090, Level 16, State 1, Line 55
    HdfsBridge::recordReaderFillBuffer - Unexpected error encountered filling record reader buffer: HadoopSqlException: Error converting data type NVARCHAR to BIGINT.

    There are 42 columns with the same data type. Can you explain the solution for this? All my files in Blob storage are .txt files; sample data is below (one possible approach is sketched after this reply).

    id source_id root_account_id account_id enrollment_term_id name
    78530000000000077 77 78530000000000001 78530000000000010 78530000000000001 Employee training
    78530000000000089 89 78530000000000001 78530000000000010 78530000000000001 SCORM TESTING

    Friday, April 19, 2019 7:39 PM
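    Not a definitive answer, but applying the earlier suggestion to this case would look roughly like the sketch below (the location, data source, and file format names are placeholders): declare the external table's columns as VARCHAR so PolyBase can read the text, then convert while copying into a typed table. If the .txt files start with a header line, that line alone will fail a BIGINT column, so skipping it with FIRST_ROW = 2 in the DELIMITEDTEXT file format is also worth checking.

        -- All object names are placeholders. External columns are VARCHAR so the
        -- scan itself never has to convert text to BIGINT.
        CREATE EXTERNAL TABLE ext.Accounts
        (
            id                 VARCHAR(50),
            source_id          VARCHAR(50),
            root_account_id    VARCHAR(50),
            account_id         VARCHAR(50),
            enrollment_term_id VARCHAR(50),
            name               VARCHAR(200)
        )
        WITH (LOCATION = '/accounts/', DATA_SOURCE = MyBlobStore, FILE_FORMAT = TabDelimitedText);

        -- Copy into a typed table; TRY_CAST turns unconvertible values into NULLs
        -- instead of aborting the whole query.
        CREATE TABLE dbo.Accounts
        WITH (DISTRIBUTION = ROUND_ROBIN)
        AS
        SELECT TRY_CAST(id AS BIGINT)                 AS id,
               TRY_CAST(source_id AS BIGINT)          AS source_id,
               TRY_CAST(root_account_id AS BIGINT)    AS root_account_id,
               TRY_CAST(account_id AS BIGINT)         AS account_id,
               TRY_CAST(enrollment_term_id AS BIGINT) AS enrollment_term_id,
               name
        FROM ext.Accounts;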