Copy Activity - Blob to Azure SQL - Byte[] column Issue

  • Question

  • Hello Team,

    There is a requirement to create an MD5 hash, store it in a CSV file, and then copy that file to an Azure SQL table.
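    For illustration, a minimal sketch of the hash-generation step (assuming the hash is computed over the OID; the actual hash input is not shown here, so that part is hypothetical):

    import csv
    import hashlib

    # Hypothetical record; hashing the OID is only an assumption for
    # illustration, since the real hash input differs.
    oid = "023VV4ZH7K5L"
    business_date = 20170611
    rsuite = "testprod"

    # hexdigest() yields 32 hex characters, i.e. 16 raw bytes.
    vid = hashlib.md5(oid.encode("utf-8")).hexdigest().upper()

    with open("OutputFile.csv", "w", newline="") as f:
        writer = csv.writer(f)
        # One data row per record, matching the dataset's column order.
        writer.writerow([oid, business_date, vid, rsuite])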

    I am facing an issue: when the ADF Copy activity copies the file to the SQL table, the Byte[] column loses its first 2 characters and gains a "0x" prefix in the SQL table.

    ADF datasets:

    OutputFile:

    {
        "name": "DS_ADLS_OutputFile",
        "properties": {
            "structure": [
                {
                    "name": "OID","type": "String"
                },
                {
                    "name": "BusinessDate","type": "Int32"
                },
                {
                    "name": "VID","type": "Byte[]"
                },
                {
                    "name": "RSuite","type": "String"
                }
            ],
            "published": false,
            "type": "AzureDataLakeStore",
            "linkedServiceName": "LS_ADLS",
            "typeProperties": {
                "fileName": "OutputFile.csv",
                "folderPath": "test/v1/{Year}/{Month}/{Day}/csv/",
                "format": {
                    "type": "TextFormat",
                    "rowDelimiter": "\n",
                    "columnDelimiter": ",",
                    "quoteChar": "\"",
                    "treatEmptyAsNull": true,
                    "skipLineCount": 0,
                    "firstRowAsHeader": false
                },
                "partitionedBy": [
                    {
                        "name": "Year",
                        "value": {
                            "type": "DateTime",
                            "date": "SliceStart",
                            "format": "yyyy"
                        }
                    },
                    {
                        "name": "Month",
                        "value": {
                            "type": "DateTime",
                            "date": "SliceStart",
                            "format": "MM"
                        }
                    },
                    {
                        "name": "Day",
                        "value": {
                            "type": "DateTime",
                            "date": "SliceStart",
                            "format": "dd"
                        }
                    }
                ]
            },
            "availability": {
                "frequency": "Day",
                "interval": 1
            },
            "external": false,
            "policy": {}
        }
    }
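    For reference, the three partitionedBy variables above only format the slice start date into the folder path. A minimal sketch of that substitution, assuming a SliceStart of 2017-07-25 (hypothetical value; ADF supplies it per daily slice):

    from datetime import datetime

    # Hypothetical slice start for one daily slice.
    slice_start = datetime(2017, 7, 25)

    folder_path = "test/v1/{Year}/{Month}/{Day}/csv/".format(
        Year=slice_start.strftime("%Y"),   # "2017"
        Month=slice_start.strftime("%m"),  # "07"
        Day=slice_start.strftime("%d"),    # "25"
    )
    # -> "test/v1/2017/07/25/csv/"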

    SQL Table:

    {
        "name": "DS_AzureSqlOutput",
        "properties": {
            "structure": [
                {
                    "name": "OID","type": "String"
                },
                {
                    "name": "VID","type": "Byte[]"
                },
                {
                    "name": "BusinessDate","type": "Int32"
                },
                {
                    "name": "RSuite","type": "String"
                }
            ],
            "published": false,
            "type": "AzureSqlTable",
            "linkedServiceName": "LS_AzureSQL",
            "typeProperties": {
                "tableName": "dbo.test"
            },
            "availability": {
                "frequency": "Day",
                "interval": 1
            }
        }
    }

    Input File Data:

    OID,BusinessDate,VID,RSuite
    023VV4ZH7K5L,20170611,37EC6F4AF161DB8EE12424202D649E95,testprod

    ------------------------------------------------------------------

    SQL Table Output (VID in the SQL table is a varbinary column):

    OID VID BusinessDate RSuite
    023VV4ZH7K5L 0xEC6F4AF161DB8EE12424202D649E95 20170611 testprod

    ----------------------------------------------------------------------------------
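    Comparing the two values character by character, exactly one byte is missing: the leading "37" of the file value, while "0x" has been added. A small check that reproduces this observation (the workaround in the final comment is only a guess on my side, untested):

    input_hex = "37EC6F4AF161DB8EE12424202D649E95"  # file value: 32 hex chars = 16 bytes
    output_hex = "EC6F4AF161DB8EE12424202D649E95"   # SQL value without "0x": 30 hex chars = 15 bytes

    assert len(bytes.fromhex(input_hex)) == 16
    assert len(bytes.fromhex(output_hex)) == 15
    # Only the first byte (0x37) is lost; the remainder matches exactly.
    assert input_hex[2:] == output_hex

    # Guess (untested): the copy may treat the first two characters as a
    # "0x" marker. If so, writing "0x" + input_hex into the CSV column
    # might preserve all 16 bytes.
    csv_value = "0x" + input_hex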

    I am not sure if the input file needs to be in a specific format, or whether there is an issue in ADF's type conversion.

    Can anyone help?

    Regards,

    Nimesh

    Tuesday, July 25, 2017 4:36 PM