Getting "Please make sure container name is specified for staging linked service AzureBlobStorage" error on all Data Factory pipelines

  • Question

  • Environment: Azure Data Factory

    Issue:

    Getting the below error when running all pipelines on Azure Data Factory. Nothing has changed in the last 2-3 months, and suddenly this morning we are getting this error on all pipelines.

    Operation on target*************** failed: {"StatusCode":"DFExecutorUserError","Message":"com.microsoft.dataflow.broker.InvalidOperationException: com.microsoft.dataflow.broker.InvalidOperationException: Please make sure container name is specified for staging linked service AzureBlobStorage"}

    Any clue?


    Avi

    Monday, February 3, 2020 2:39 AM

Answers

All replies

  • Hi Avi,

    Did you try adding a container name in your Linked Service "AzureBlobStorage"? That should fix the problem. I will check internally with the team whether a recent deployment has made it mandatory to enter a container name in a blob storage Linked Service.

    Hope this helps.

    Monday, February 3, 2020 7:02 AM
  • Hi Chirag,

    I did check, and all pipelines were working fine until today. We were using the blob container name, but without any specific sub-folder or virtual folder.

    For testing, I tried creating a virtual folder and adding it to the path along with the blob container (see the sketch below). Surprisingly, it worked this way.
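
    For reference, the path format that worked is simply the container name followed by the virtual folder, entered as the staging path of the data flow activity. With placeholder names (ours are different), it looks like:

        mycontainer/my-virtual-folder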

    However, it is still puzzling: nothing changed on our side, so why did this not happen yesterday or the days before?

    Looking forward to your feedback.

    Thank You.


    Avi

    Monday, February 3, 2020 8:35 AM
    I have the same issue with some data flows in existing pipelines that ran without issue before. I was away for a couple of weeks, and now they fail with this error. My linked service already has a folder defined for PolyBase staging, and it still fails. The pipelines already in production are working, so I am not sure why the ones in Dev are now unable to run. I would like to hear what might be behind this change in behavior and whether there is a workaround or solution. Thanks.

    Brad

    Monday, February 3, 2020 11:35 PM
  • Hi all,

    This is Dan from the ADF product team. We are investigating this issue and will update this thread once we find out what happened. Apologies for the inconvenience.

    A workaround is to make sure a directory is specified in your staging options in the data flow activity.
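
    As a minimal sketch, the staging section of the data flow activity's JSON would then carry both the container and a directory in folderPath. The names below are placeholders (AzureBlobStorage is the linked service from the error message; the container and directory names are illustrative):

        "staging": {
            "linkedService": {
                "referenceName": "AzureBlobStorage",
                "type": "LinkedServiceReference"
            },
            "folderPath": "mycontainer/stagingdir"
        }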

    Tuesday, February 4, 2020 12:06 AM
    I just looked at the JSON of a working pipeline, compared its data flow staging definition with one from a failing pipeline, and found the difference shown below. After editing the JSON to add the "/undefined" subfolder, which must have been added programmatically before, the pipeline now succeeds. It would be good to know whether this is a defect or something else. My new pipelines are not getting the "/undefined".

                        "dataflow": {
                            "referenceName": "df_Load_EntitlementData",
                            "type": "DataFlowReference"
                        },
                        "staging": {
                            "linkedService": {
                                "referenceName": "KMPData_BlobStorage",
                                "type": "LinkedServiceReference"
                            },
                            "folderPath": "polybasestaging/undefined"
                        },

    Tuesday, February 4, 2020 12:18 AM
  • We have rolled back some changes and you should no longer encounter this in production environments.
    Tuesday, February 4, 2020 12:29 AM