AzureTable + DataFactory: Azure Table "Data Type Byte is not supported"

  • Question

  • When running a Data Factory pipeline, I get errors that prevent the import of MySQL tables into Azure Table from completing successfully for all tables.
    For several tables I get the same error:

    Failed execution
    Data Type Byte is not supported. Detailed Message: ErrorCode=DataTypeNotSupported,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Type 'System.Byte' is not supported by Azure Table. Check the supported data type in Azure Table document.,Source=,'

    A table that triggers this error looks like this:

    Field         Type                  Null  Key  Default                Extra
    ------------  --------------------  ----  ---  ---------------------  -----
    id            int(10) unsigned      NO         NULL
    is_val        tinyint(3) unsigned   YES        NULL
    sig           char(32)              YES        NULL
    contents      int(10) unsigned      YES        NULL
    is_os         tinyint(3) unsigned   YES        NULL
    lines         smallint(5) unsigned  YES        NULL
    is_pds        tinyint(3) unsigned   YES        NULL
    lang          char(2)               YES        NULL
    is_dts        varchar(32)           YES        NULL
    last_updated  timestamp             NO         "0000-00-00 00:00:00"

    I cannot tell which of these columns corresponds to the unsupported Data Type Byte.

    Also, rescheduling the pipeline does not solve the issue: it does not seem to be transient, but caused by the schema/table.

     

    • Moved by neerajkh_MSFT Monday, October 24, 2016 4:23 PM wrong forum
    Friday, October 21, 2016 5:07 PM

All replies

  • Hi Loreto,

    Could you provide me your PipelineId & RunId for further investigation?

    If you are using Azure Table as the sink, this kind of error can occur in several cases, e.g. when a source column type is INT16.

    Tuesday, October 25, 2016 2:13 AM
  • Hello, the pipelineId was DevCatalogueLyricsSummary and the runId was 8e8326d6-42e1-4e4e-ac84-d1ebb3be7972_636128640000000000_636129504000000000_OutputDataset-u0m.

    I have also got another error that seems similar:

    Copy activity met storage operation failure at 'Sink' side. Error message from storage execution : Element 0 in the batch returned an unexpected response code. 0:One of the request inputs is not valid. RequestId:55c2c306-0002-0047-3f53-2e1f07000000 Time:2016-10-25T00:03:22.1518904Z.

    This error is for pipelineID: CopyPipeline-u0m and runID: 8e8326d6-42e1-4e4e-ac84-d1ebb3be7972_636128640000000000_636129504000000000_OutputDataset-u0m

    Tuesday, October 25, 2016 11:07 AM
  • Hi Loreto,

    Firstly, regarding the error 'Data Type Byte not supported': this is currently an expected, unsupported scenario. Azure Table does not support types such as Byte, Decimal, Int16, TimeSpan and Float. In your MySQL table schema, tinyint unsigned is converted to System.Byte, which Azure Table does not support for writing.
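    The mapping above can be used to spot the offending columns before running the pipeline. Below is a hedged sketch (not ADF's actual implementation) that flags the columns in a MySQL schema whose type will surface as a .NET type Azure Table rejects; only the tinyint-unsigned → System.Byte mapping comes from this thread, the other entries are assumptions for illustration.

    ```python
    # .NET types Azure Table rejects, per the reply in this thread.
    UNSUPPORTED_BY_AZURE_TABLE = {
        "System.Byte", "System.Int16", "System.Decimal",
        "System.Single", "System.TimeSpan",
    }

    # Assumed MySQL -> .NET type mapping; only the first entry is confirmed here.
    MYSQL_TO_NET = {
        "tinyint unsigned": "System.Byte",    # rejected on write to Azure Table
        "smallint unsigned": "System.Int32",  # assumption
        "int unsigned": "System.Int64",       # assumption
        "char": "System.String",
        "varchar": "System.String",
        "timestamp": "System.DateTime",
    }

    def unsupported_columns(schema):
        """Return column names whose mapped .NET type Azure Table rejects."""
        bad = []
        for name, mysql_type in schema:
            base = mysql_type.split("(")[0]          # strip the (N) width
            if "unsigned" in mysql_type:
                base += " unsigned"
            if MYSQL_TO_NET.get(base) in UNSUPPORTED_BY_AZURE_TABLE:
                bad.append(name)
        return bad

    # Subset of the schema posted in the question above.
    schema = [
        ("id", "int(10) unsigned"),
        ("is_val", "tinyint(3) unsigned"),
        ("sig", "char(32)"),
        ("lines", "smallint(5) unsigned"),
        ("is_pds", "tinyint(3) unsigned"),
    ]
    print(unsupported_columns(schema))  # prints ['is_val', 'is_pds']
    ```

    Running this over the full table from the question would flag every tinyint unsigned column as the source of the Byte error.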

    In the near future we are going to support type conversion, which will help with this kind of case; we will keep you updated. Until that feature is ready, one potential workaround is to first copy from MySQL to Azure Blob, and then copy from Azure Blob to Azure Table with the source structure specified, using only types that Azure Table supports in the structure information.
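    As a rough illustration of that workaround, an ADF (v1) dataset can carry an explicit "structure" array that maps each column to an Azure-Table-compatible type, e.g. the tinyint columns to Int32. The dataset/linked-service names and the typeProperties below are hypothetical placeholders, not taken from this thread:

    ```json
    {
      "name": "OutputDataset-AzureTable",
      "properties": {
        "type": "AzureTable",
        "linkedServiceName": "StorageLinkedService",
        "structure": [
          { "name": "id", "type": "Int64" },
          { "name": "is_val", "type": "Int32" },
          { "name": "sig", "type": "String" },
          { "name": "last_updated", "type": "Datetime" }
        ],
        "typeProperties": { "tableName": "LyricsSummary" },
        "availability": { "frequency": "Day", "interval": 1 }
      }
    }
    ```

    The key point is the "structure" section: by declaring Int32/Int64/String explicitly, the copy activity is not forced to carry a System.Byte value into the sink.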

    Secondly, the error ‘Element 0 in the batch returned an unexpected response code. 0:One of the request inputs is not valid’ was thrown by Azure Table; one very likely reason is again an unsupported type. Could you double-check whether all source MySQL column types are supported?

    Friday, October 28, 2016 6:05 AM
  • For more details on copying from MySQL, and copying to Azure Table with Azure Data Factory, please refer to

    Move data From MySQL using Azure Data Factory

    Move data to and from Azure Table using Azure Data Factory

    Friday, October 28, 2016 6:09 AM
  • Hello Joyce,

    thank you for the support. Adding support for those types in the near future would be great, since right now it is not easy to manage Data Factory imports from MySQL to Table Storage without them.
    Our temporary solution was simply not to import those columns into Table Storage; we could do this because that data was not relevant in the resulting schema, so of course this workaround applies only to our case.
    I can therefore confirm that the first issue, `'Data Type Byte not supported'`, was due to this: after removing the TinyInt and Char columns, the copy worked fine.

    Regarding the second issue, `Element 0 in the batch returned an unexpected response code. 0:One of the request inputs is not valid`, we are still investigating, since we were not able to fix it in the same way. Specifically, it happens in both of these two scenarios:

    1- MySQL to TableStorage import

    2- TableStorage to TableStorage import

    Since it happens in 2), I am not sure it is related to an unsupported type, at least not in that case, because the source table already exists in Table Storage. For 1), yes, that could be possible.


    If we manage to fix it, I will post an update here.

    Friday, October 28, 2016 9:34 AM
  • Hi Loreto,

    Thanks for reporting your progress. I hope the issues have been resolved by now.

    For 1), yes, please still check whether any unsupported type exists in the MySQL source. If the issue persists, open the Data Management Gateway Configuration Manager and enable verbose logging on the Diagnostics tab; you can then send the logs and the request id to us for further troubleshooting.

    For 2), have you tried specifying the structure (name & type) for both the source and target Azure Table datasets? Since Azure Table is a non-SQL connector, ADF today infers the data types from the first several rows and uses that as the schema, which may not hold for all rows. Please send us the runId for further troubleshooting if needed.
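    For the Table-to-Table case, that suggestion would amount to declaring the same explicit "structure" array in both the source and sink Azure Table dataset definitions, so that ADF does not infer types from the first rows. A minimal sketch (column names taken from the question's schema; the fragment is illustrative, not a complete dataset definition):

    ```json
    "structure": [
      { "name": "PartitionKey", "type": "String" },
      { "name": "RowKey", "type": "String" },
      { "name": "contents", "type": "Int64" },
      { "name": "lang", "type": "String" },
      { "name": "last_updated", "type": "Datetime" }
    ]
    ```

    With identical structures on both datasets, a type inferred wrongly from an early batch of rows can no longer cause a mismatched insert later in the copy.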

    Tuesday, November 1, 2016 6:47 AM