Copy Activity - Dynamic schema mapping

  • Question

  • Hello,

              I am trying to copy data from a CSV file stored on Blob storage to an Azure SQL table. The file has 6 columns and the table has 10 columns. I need to use dynamic schema mapping to copy only 2 columns from the file to the table, so I can do it for multiple files. I am using the following dynamic JSON for the copy activity schema mapping, which comes from a parameter:

    "translator": { "type": "TabularTranslator", "columnMappings": { "NAME": "NAME" } }

    This is the copy activity code in debug:

    {
        "source": {
            "type": "DelimitedTextSource",
            "storeSettings": {
                "type": "AzureBlobStorageReadSettings",
                "recursive": false,
                "enablePartitionDiscovery": false
            },
            "formatSettings": {
                "type": "DelimitedTextReadSettings"
            }
        },
        "sink": {
            "type": "AzureSqlSink"
        },
        "enableStaging": false,
        "translator": {
            "type": "TabularTranslator",
            "columnMappings": {
                "NAME": "NAME",
                "CODE": "CODE"
            }
        }
    }

     Unfortunately I am getting an error:

    "errorCode": "2200", "message": "ErrorCode=UserErrorStructureIsRequired,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The structure/schema setting in source is required for column mapping.,Source=Microsoft.DataTransfer.ClientLibrary,'", "failureType": "UserError"

    What am I doing wrong?


    Monday, September 16, 2019 1:29 PM

All replies

  • Hi Cristian,

    An ideal way to do so would be to import schemas in the mapping tab of the copy activity. Once that's done, you can de-select the columns you don't want to be copied. 

    You can also map the columns in the csv file to corresponding columns in the azure sql table.

    Here's a screenshot for your reference:

    Hope this helps.

    Tuesday, September 17, 2019 7:12 AM
    Owner
  •    Hello Mishra,

           Unfortunately, I cannot do that, because source and sink are dynamic and I don't know the names of the columns. I have many files in a folder, and from each file I need to copy 2 fields into a different table, so source and sink are filled by a Lookup activity that gets the data from a configuration table. So, I want to have only one generic pipeline, not one pipeline per file.
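    To illustrate the setup described above: a Lookup activity against a configuration table might return a row like the one below, where the mapping is stored as an escaped JSON string. This is only a sketch; the table and column names (SourceFileName, SinkTableName, ColumnMapping) are illustrative, not from the original pipeline:

    ```json
    {
        "firstRow": {
            "SourceFileName": "customers.csv",
            "SinkTableName": "dbo.Customers",
            "ColumnMapping": "{ \"type\": \"TabularTranslator\", \"columnMappings\": { \"NAME\": \"NAME\", \"CODE\": \"CODE\" } }"
        }
    }
    ```

    The Copy Activity's translator property could then be set with a dynamic expression such as `@json(activity('LookupConfig').output.firstRow.ColumnMapping)`, assuming the Lookup activity is named LookupConfig.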

    Thank you

    Tuesday, September 17, 2019 3:05 PM
  • Hi Cristian,

    In that case, did you have a look at the Data Flow feature of Azure Data Factory? It might be useful in your use case.

    Let us know if it helps. Else, we can gladly continue the dialogue.

    Wednesday, September 18, 2019 9:01 AM
    Owner
  • Hello,

       Unfortunately, we cannot use the Data Flow feature, as it does not support the self-hosted integration runtime. So, does the Copy Activity support this scenario?

    Thank you

    Wednesday, September 18, 2019 9:26 AM
  • Hi Cristian,

    In that case, you would have to use schema mapping in the Copy Activity to include only the columns that you want.

    Thursday, September 19, 2019 8:46 AM
    Owner
  • Hi Cristian,

    Just wanted to check - was the above suggestion helpful to you? If yes, please consider upvoting and/or marking it as answer. This would help other community members reading this thread.
    Thursday, October 3, 2019 11:15 AM
    Owner
  • Hi Cristian,

    I haven't heard back from you in quite some time. Was the above suggestion helpful to you? If yes, please consider upvoting and/or marking it as answer. This would help other community members reading this thread.

    Wednesday, October 9, 2019 10:25 AM
    Owner