How to do a Dynamic Column mapping in Copy Activity

    Question

  • I am copying from a CSV file to an Azure SQL table.

    For example, the CSV file has 10 columns and the target table has 30 columns, with no matching column names. I have to map these columns dynamically using a JSON string that can be added to the Mapping tab's dynamic content, but I am not sure about the format the mapping string has to be in.

    Could anyone please help me with that?

    Monday, April 30, 2018 8:30 AM

Answers

All replies

  • I think this document should help answer your question. The columnMappings parameter should do what you are looking for. Let me know if you face any issues.
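
    For reference, columnMappings is a property of the translator (type TabularTranslator) on the copy activity, and it maps source columns (keys) to sink columns (values). A minimal sketch, with placeholder column names:

    "translator": {
        "type": "TabularTranslator",
        "columnMappings": {
            "SourceColumn1": "TargetColumnA",
            "SourceColumn2": "TargetColumnB"
        }
    }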

    ____________________________________________________________________________________

    Please "Mark it as answer" if you find this answer helpful. This will help other community members



    Wednesday, May 2, 2018 12:33 AM
    Moderator
  • I am facing the same issue. My column mapping is driven by a configuration table in my database. I am trying to pass the column mapping into my copy activity, but I don't know what format it should be in. I'm using the dynamic content window on the Mapping tab of the Copy activity, so my code looks like this:

    "translator": {
        "value": "@json(item().ColumnMapping)",
         "type": "Expression"
     }

    In my database, I don't know what format "ColumnMapping" should be stored in. I've tried several formats, but it's not working.

    Tried like this:

    "columnMappings": {
        "view.url": "url",
        "view.name": "name"
    }


    and like this:

    {
        "view.url": "url",
        "view.name": "name"
    }


    and like this:

    {
        "type": "TabularTranslator",
        "columnMappings": {
          "view.url": "url",
          "view.name": "name"
        }
    }

    and several other ways....

    Monday, July 2, 2018 10:55 PM
  • Hi Jeffrey and Harish,

    To dynamically map columns in the copy activity, please define a parameter of type 'Object' in the pipeline:

    "parameters": {

    "columnMapping": {

    "type": "Object"

    }

     }

     

    and reference this parameter in the copy activity like this:

    "translator": {

    "value": "@pipeline().parameters.columnMapping",

    "type": "Expression"

    }

    Please note that you need to pass a JSON value to the parameter. For example, I pass

    {
        "type": "TabularTranslator",
        "columnMappings": {
            "Prop_0": "Prop_0",
            "Prop_1": "Prop_1",
            "Prop_2": "Prop_2"
        }
    }

    to the parameter 'columnMapping'.
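
    Putting the pieces together, a minimal sketch of such a pipeline could look like the following (the pipeline, dataset and column names are placeholders):

    {
        "name": "DynamicMappingPipeline",
        "properties": {
            "parameters": {
                "columnMapping": {
                    "type": "Object"
                }
            },
            "activities": [
                {
                    "name": "Copy Data1",
                    "type": "Copy",
                    "typeProperties": {
                        "source": {
                            "type": "BlobSource"
                        },
                        "sink": {
                            "type": "SqlSink"
                        },
                        "translator": {
                            "value": "@pipeline().parameters.columnMapping",
                            "type": "Expression"
                        }
                    },
                    "inputs": [
                        {
                            "referenceName": "SourceCsvDataset",
                            "type": "DatasetReference"
                        }
                    ],
                    "outputs": [
                        {
                            "referenceName": "SinkSqlDataset",
                            "type": "DatasetReference"
                        }
                    ]
                }
            ]
        }
    }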

    Thanks.

    • Edited by Wang Zhang Monday, July 9, 2018 8:12 AM
    Monday, July 9, 2018 8:01 AM
  • Hi Wang Zhang,

    I have a similar requirement where I need to pass the column mapping dynamically from a stored procedure to the copy activity. The copy activity performs an update operation in Dynamics CRM. The source is SQL Server (2014) and the sink is Dynamics CRM.

    I am fetching the column mapping from the stored procedure using a lookup activity and passing it as a parameter to the copy activity.

    When I directly provide the JSON value below as the default value of the parameter, the copy activity updates the mapped fields correctly.

    {"type":"TabularTranslator","columnMappings":{"leadid":"leadid","StateCode":"statecode"}}

    But when the JSON value is fetched from the SP, it does not work. I am getting the error 'ColumnName is read only'.

    Please suggest whether any conversion is required on the output of the lookup activity before passing the parameter to the copy activity. Below is the output of the lookup activity.

    {\"type\":\"TabularTranslator\",\"columnMappings\":{\"leadid\":\"leadid\",\"StateCode\":\"statecode\"}}

    Appreciate a quick turnaround.

    Thanks,

    Vinod


    VINOD P

    Thursday, November 22, 2018 7:08 AM
  • Hi there,

    I tested a scenario where a lookup activity gets the columnMapping value and then passes it to the copy activity. The lookup output is of type string and the copy activity still ran successfully, so I don't think any explicit type conversion is needed. How did you pass the lookup output to the copy activity? Like this:

    "@activity('Lookup1').output.firstRow.someProperty"

    You could debug your pipeline following the workflow in the picture below:

    Here is the JSON definition on my side for the test; you can take it as a reference:

    {
        "name": "CopyPipeline_42v",
        "properties": {
            "activities": [
                {
                    "name": "Lookup1",
                    "type": "Lookup",
                    "typeProperties": {
                        "source": {
                            "type": "BlobSource",
                            "recursive": true
                        },
                        "dataset": {
                            "referenceName": "AzureBlob7",
                            "type": "DatasetReference"
                        }
                    }
                },
                {
                    "name": "Copy Data1",
                    "type": "Copy",
                    "dependsOn": [
                        {
                            "activity": "Lookup1",
                            "dependencyConditions": [
                                "Succeeded"
                            ]
                        }
                    ],
                    "typeProperties": {
                        "source": {
                            "type": "SqlSource"
                        },
                        "sink": {
                            "type": "SqlSink",
                            "writeBatchSize": 10000
                        },
                        "enableStaging": false,
                        "dataIntegrationUnits": 0,
                        "translator": {
                            "value": "@activity('Lookup1').output.firstRow.Prop_1",
                            "type": "Expression"
                        }
                    },
                    "inputs": [
                        {
                            "referenceName": "AzureSqlTable2",
                            "type": "DatasetReference"
                        }
                    ],
                    "outputs": [
                        {
                            "referenceName": "AzureSqlTable2",
                            "type": "DatasetReference"
                        }
                    ]
                }
            ]
        },
        "type": "Microsoft.DataFactory/factories/pipelines"
    }
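
    For clarity, 'Prop_1' above is just the column of the lookup dataset (AzureBlob7) that holds the mapping, so the value of Prop_1 in the first row is the translator JSON stored as a string, along these lines (the mapped column names are placeholders):

    {"type":"TabularTranslator","columnMappings":{"SourceColumn1":"TargetColumnA","SourceColumn2":"TargetColumnB"}}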
    Hope this helps.

    Thursday, November 22, 2018 8:32 AM
  • Thanks a lot, Wang Zhang, for the quick turnaround.

    I will explain my pipeline now.

    1. In the lookup activity, I am executing a stored procedure that provides the column mapping as shown below (see preview).

    2. The debug run of the lookup activity completed successfully, and below is its output.

    3. In the copy activity I am updating a few fields in Dynamics 365. The source is SQL Server and the sink is Dynamics 365. The copy activity failed with the error below.

    4. Pipeline JSON Code

    Note: I am not facing this issue when I create a parameter (Name: ColumnMapping) of type Object and assign the default value below.

    {"type":"TabularTranslator","columnMappings":{"leadid":"leadid","statecode":"statecode","statuscode":"statuscode"}}

    The copy activity works fine when the column mapping is fetched from this parameter using the dynamic content below.

    @pipeline().parameters.ColumnMapping

    But when I fetch the same value from the SP in the lookup activity, I see an escape character (\) before each quote in the output. I guess this is why the copy activity failed, because the same copy activity executed successfully when the JSON was provided without the escape characters (\) as the default value of the parameter ColumnMapping.

    Please help me find a solution.


    VINOD P

    Thursday, November 22, 2018 11:48 AM
  • Hi there, the error message says 'Column Datatype is readonly', so I don't think the escape character is the problem; it is only there for the UI display, and the value passed to the copy activity is still {"type":"TabularTranslator","columnMappings":{"leadid":"leadid","statecode":"statecode","statuscode":"statuscode"}}.

    Let's take this offline; please email me the failed activity runID and pipeline runID: wangzhan@microsoft.com

    Thanks.

    Thursday, November 22, 2018 2:28 PM
  • Hi there,

    After an offline discussion with Vinod, his issue has been resolved by passing the expression "@json(activity('FetchingColumnMapping').output.firstRow.ColumnMapping)" to the "translator" in the copy activity. The root cause was a type mismatch between the lookup activity output (string) and the translator (object), so an explicit type conversion is needed when passing the parameter.
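
    In other words, the translator in his copy activity now looks like this (the activity and property names follow his pipeline):

    "translator": {
        "value": "@json(activity('FetchingColumnMapping').output.firstRow.ColumnMapping)",
        "type": "Expression"
    }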

    Thanks.

    Monday, November 26, 2018 8:13 AM
  • @Wang Zhang

    I tried the exact same steps, but I used schemaMapping (due to a hierarchical source) instead of columnMapping. The pipeline completes successfully, but only a single row is inserted into the destination table. It seems like, for some reason, the schema mapping is not being picked up.


    Sohi

    16 hours 9 minutes ago