I have a pipeline with Append Variable and Set Variable activities, but when I deploy it through CI/CD it shows "Could not load the resource"

  • Question

  • Hi All,

    Could anyone help me with this, please? I have one pipeline whose flow is Get Metadata -> ForEach -> If Condition -> Append Variable -> Set Variable -> Filter -> Set Variable -> Execute Pipeline.

    It works fine in DEV, but when I deployed the same pipeline to QA through CI/CD (PowerShell via VSTS), the deployment went through, yet the portal shows the error "Could not load the resource".

    One thing I noticed: in the deployed JSON, the activity type for "AppendVariable" and "SetVariable" shows as "Activity", which may be causing the issue.

    What could have happened during the CI/CD deployment to cause this? Any ideas, please?

    --Poornima M

    Friday, January 11, 2019 7:05 PM

All replies

  • Hi Poornima,

    This error usually occurs when there's a problem in your JSON. Most likely you have some invalid characters at the start of the JSON.
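One quick way to rule this out is to inspect the raw bytes of the deployed pipeline JSON for an invisible leading character such as a UTF-8 byte-order mark (a common culprit when files pass through PowerShell). A small sketch along these lines can help; the function name and file path are just placeholders:

```python
# Check a pipeline JSON file for common deployment-breaking issues:
# a UTF-8 byte-order mark (BOM) at the start, or JSON that does not parse.
import json

def check_pipeline_json(path):
    """Return a list of problems found; an empty list means the file looks OK."""
    problems = []
    with open(path, "rb") as f:
        raw = f.read()
    if raw.startswith(b"\xef\xbb\xbf"):
        problems.append("file starts with a UTF-8 BOM")
        raw = raw[3:]  # strip the BOM and try to parse the rest anyway
    try:
        json.loads(raw.decode("utf-8"))
    except (UnicodeDecodeError, ValueError) as e:
        problems.append(f"JSON does not parse: {e}")
    return problems
```

Running it against the exported pipeline file (e.g. `check_pipeline_json("pipeline.json")`) will tell you whether the file itself is the problem or whether you need to look elsewhere.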

    I would recommend going through the CI/CD documentation and checking whether you have missed something:

    https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-deployment

    There's also a similar post here:

    https://social.msdn.microsoft.com/Forums/en-US/dba84ab4-beb5-4a2f-a106-9fa01e43f5ec/df-v2-all-pipelines-and-data-sets-now-saying-this-resource-references-resources-which-could-not?forum=AzureDataFactory

    The proposed answer there might be helpful in your case as well.

    Let us know if this helps; otherwise we can continue to probe further.


    MSDN

    Monday, January 14, 2019 6:51 AM
  • Hi,

    Thanks for responding.

    I have already gone through the link, which says an invalid character can cause this issue.

    I also have many expressions/formulas in the Set Variable and Append Variable activities.

    Perhaps a character that has special meaning in PowerShell is causing this, but I could not find which exact character it is. Could someone look at the JSON and throw some light on this, please?

    {
      "name": "PL_MANUAL_Innovations_FindFileToCopy",
      "properties": {
        "activities": [
          {
            "name": "AT_ReadFileNames",
            "type": "GetMetadata",
            "policy": {
              "timeout": "7.00:00:00",
              "retry": 0,
              "retryIntervalInSeconds": 30,
              "secureOutput": false,
              "secureInput": false
            },
            "typeProperties": {
              "dataset": {
                "referenceName": "DS_FILEFMT_Metadata_IN",
                "type": "DatasetReference"
              },
              "fieldList": [
                "childItems"
              ]
            }
          },
          {
            "name": "AT_ForEach_FileName",
            "type": "ForEach",
            "dependsOn": [
              {
                "activity": "AT_ReadFileNames",
                "dependencyConditions": [
                  "Succeeded"
                ]
              }
            ],
            "typeProperties": {
              "items": {
                "value": "@activity('AT_ReadFileNames').output.childitems",
                "type": "Expression"
              },
              "activities": [
                {
                  "name": "AT_IF_Source",
                  "type": "IfCondition",
                  "typeProperties": {
                    "expression": {
                      "value": "@contains(toUpper(item().name),toUpper(pipeline().parameters.Source))",
                      "type": "Expression"
                    },
                    "ifTrueActivities": [
                      {
                        "name": "AT_Append_Integers",
                        "type": "AppendVariable",
                        "typeProperties": {
                          "variableName": "fileNos",
                          "value": {
                            "value": "@int(substring( item().name, add(lastindexof(item().name, '_'), 1), sub(indexof(item().name, '.'),add(lastindexof(item().name, '_'), 1))))",
                            "type": "Expression"
                          }
                        }
                      },
                      {
                        "name": "AT_Append_FileNames",
                        "type": "AppendVariable",
                        "typeProperties": {
                          "variableName": "fileNames",
                          "value": {
                            "value": "@item().name",
                            "type": "Expression"
                          }
                        }
                      }
                    ]
                  }
                }
              ]
            }
          },
          {
            "name": "AT_Set_MaxInteger",
            "type": "SetVariable",
            "dependsOn": [
              {
                "activity": "AT_ForEach_FileName",
                "dependencyConditions": [
                  "Succeeded"
                ]
              }
            ],
            "typeProperties": {
              "variableName": "maxValue",
              "value": {
                "value": "@string(max(variables('fileNos')))",
                "type": "Expression"
              }
            }
          },
          {
            "name": "AT_Filter_FileName_To_Copy",
            "type": "Filter",
            "dependsOn": [
              {
                "activity": "AT_Set_MaxInteger",
                "dependencyConditions": [
                  "Succeeded"
                ]
              }
            ],
            "typeProperties": {
              "items": {
                "value": "@variables('fileNames')",
                "type": "Expression"
              },
              "condition": {
                "value": "@contains(item(),variables('maxValue'))",
                "type": "Expression"
              }
            }
          },
          {
            "name": "AT_Set_FileName_To_Copy",
            "type": "SetVariable",
            "dependsOn": [
              {
                "activity": "AT_Filter_FileName_To_Copy",
                "dependencyConditions": [
                  "Succeeded"
                ]
              }
            ],
            "typeProperties": {
              "variableName": "file",
              "value": {
                "value": "@activity('AT_Filter_FileName_To_Copy').output.value[0]",
                "type": "Expression"
              }
            }
          },
          {
            "name": "AT_EXE_PL_MANUAL_Innovations_LANDED",
            "type": "ExecutePipeline",
            "dependsOn": [
              {
                "activity": "AT_Set_FileName_To_Copy",
                "dependencyConditions": [
                  "Succeeded"
                ]
              }
            ],
            "typeProperties": {
              "pipeline": {
                "referenceName": "PL_SC_NA_D_Innovations_LANDED",
                "type": "PipelineReference"
              },
              "waitOnCompletion": true,
              "parameters": {
                "SliceStartTime": {
                  "value": "@pipeline().parameters.SliceStartTime",
                  "type": "Expression"
                },
                "CustomSliceStart": {
                  "value": "@pipeline().parameters.CustomSliceStart",
                  "type": "Expression"
                },
                "ToBeResumed": {
                  "value": "@pipeline().parameters.ToBeResumed",
                  "type": "Expression"
                },
                "ParentBatchId": {
                  "value": "@pipeline().parameters.ParentBatchId",
                  "type": "Expression"
                },
                "LoadType": {
                  "value": "@pipeline().parameters.LoadType",
                  "type": "Expression"
                },
                "PreviousSliceDateTime": {
                  "value": "@pipeline().parameters.PreviousSliceDateTime",
                  "type": "Expression"
                },
                "File": {
                  "value": "@variables('file')",
                  "type": "Expression"
                },
                "Source": {
                  "value": "@pipeline().parameters.Source",
                  "type": "Expression"
                }
              }
            }
          }
        ],
        "parameters": {
          "SliceStartTime": {
            "type": "String",
            "defaultValue": "2018-01-01"
          },
          "CustomSliceStart": {
            "type": "Bool",
            "defaultValue": false
          },
          "ToBeResumed": {
            "type": "Bool",
            "defaultValue": false
          },
          "ParentBatchId": {
            "type": "Int",
            "defaultValue": 1
          },
          "LoadType": {
            "type": "String",
            "defaultValue": "Incremental"
          },
          "PreviousSliceDateTime": {
            "type": "String",
            "defaultValue": "PreviousSliceDateTime"
          },
          "Source": {
            "type": "String"
          }
        },
        "variables": {
          "fileNos": {
            "type": "Array"
          },
          "maxValue": {
            "type": "String"
          },
          "fileNames": {
            "type": "Array"
          },
          "file": {
            "type": "String"
          }
        },
        "folder": {
          "name": "FIleFMT"
        }
      },
      "type": "Microsoft.DataFactory/factories/pipelines"
    }
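To narrow down which character might be the culprit, a quick scan like the following lists every non-ASCII or control character in a file along with its line and column (a sketch; the file path is just a placeholder for wherever the pipeline JSON is saved):

```python
# List every non-ASCII or control character in a text file, with its
# line/column position, to help locate characters that may break a
# deployment (e.g. smart quotes or invisible Unicode characters).
def find_suspicious_chars(path):
    suspicious = []
    with open(path, encoding="utf-8") as f:
        for line_no, line in enumerate(f, start=1):
            for col, ch in enumerate(line.rstrip("\n"), start=1):
                # flag anything outside printable ASCII (tab is allowed)
                if ord(ch) > 126 or (ord(ch) < 32 and ch != "\t"):
                    suspicious.append((line_no, col, repr(ch)))
    return suspicious
```

Calling `find_suspicious_chars("pipeline.json")` on the file above should return an empty list if the JSON really contains only plain ASCII.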

    Yes, I understand that something is making the JSON invalid and causing this issue, but I need to work out what the culprit is.

    Please help me with this.

    --Poornima M

    Tuesday, January 22, 2019 5:04 PM