Azure Data Factory v2 - Copy from Blob storage fails on empty Containers

    Question

  • Hi all,

    I have a pipeline that copies files from a Storage Account Container to another Storage Account.

    But when the Container is empty the job fails with error:

    The required Blob is missing.

    That's true as there are no files in the container.

    How can I make the task report success when the Container is empty?


    Basic Cloud | Twitter

    Monday, September 3, 2018 11:09 AM

All replies

  • Hi RalJans,

    You could first use a Get Metadata activity together with an If Condition activity to check whether the blob exists, and then copy data to the sink storage only if it does. Please refer to the following example:

    {
        "name": "pipeline1",
        "properties": {
            "activities": [
                {
                    "name": "Get Metadata1",
                    "type": "GetMetadata",
                    "typeProperties": {
                        "dataset": {
                            "referenceName": "AzureBlob1",
                            "type": "DatasetReference"
                        },
                        "fieldList": [
                            "exists"
                        ]
                    }
                },
                {
                    "name": "If Condition1",
                    "type": "IfCondition",
                    "dependsOn": [
                        {
                            "activity": "Get Metadata1",
                            "dependencyConditions": [
                                "Succeeded"
                            ]
                        }
                    ],
                    "typeProperties": {
                        "expression": {
                            "value": "@activity('Get Metadata1').output.exists",
                            "type": "Expression"
                        },
                        "ifTrueActivities": [
                            {
                                "name": "Copy Data1",
                                "type": "Copy",
                                "typeProperties": {
                                    "source": {
                                        "type": "BlobSource",
                                        "recursive": true
                                    },
                                    "sink": {
                                        "type": "BlobSink"
                                    },
                                    "enableStaging": false,
                                    "dataIntegrationUnits": 0
                                },
                                "inputs": [
                                    {
                                        "referenceName": "AzureBlob1",
                                        "type": "DatasetReference"
                                    }
                                ],
                                "outputs": [
                                    {
                                        "referenceName": "AzureBlob2",
                                        "type": "DatasetReference"
                                    }
                                ]
                            }
                        ]
                    }
                }
            ]
        }
    }
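
    Note: if the AzureBlob1 dataset points at the container (a folder path) rather than a single blob, the exists field may report true even when the container holds no files. As a rough sketch, reusing the same placeholder dataset and activity names as above, you could instead ask Get Metadata for the folder's child items and test their count in the If Condition expression:

    {
        "name": "Get Metadata1",
        "type": "GetMetadata",
        "typeProperties": {
            "dataset": {
                "referenceName": "AzureBlob1",
                "type": "DatasetReference"
            },
            "fieldList": [
                "childItems"
            ]
        }
    }

    and in the If Condition:

    "expression": {
        "value": "@greater(length(activity('Get Metadata1').output.childItems), 0)",
        "type": "Expression"
    }

    That way the pipeline only runs the Copy activity when the container actually contains at least one blob, and succeeds otherwise.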

    Monday, September 3, 2018 1:12 PM