Azure Portal: How to pass Docker-related settings from Azure Data Factory to Azure Batch

  • Question

  • I would like to use Azure Data Factory to trigger an Azure Batch job that runs in a Docker container.

    I have:

    1) Created a Pool with a pre-fetched Docker image (see the sketch after this list)

    2) Created a Job & Task on the Pool

    3) Ran my Task with the Docker image, and the result is good.
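
    For reference, creating the container-enabled pool directly looks roughly like this. This is only a minimal sketch assuming a recent azure-batch Python SDK; the account, pool, VM image, and container image names below are placeholders rather than my real values:

    ```python
    import azure.batch.models as batchmodels
    from azure.batch import BatchServiceClient
    from azure.batch.batch_auth import SharedKeyCredentials

    batch_client = BatchServiceClient(
        SharedKeyCredentials("mybatchaccount", "<batch-account-key>"),
        batch_url="https://mybatchaccount.<region>.batch.azure.com",
    )

    # Container-enabled pool that pre-fetches the Docker image on every node,
    # so tasks can start the container without pulling it first.
    pool = batchmodels.PoolAddParameter(
        id="mydockerpool",
        vm_size="STANDARD_D2S_V3",
        target_dedicated_nodes=1,
        virtual_machine_configuration=batchmodels.VirtualMachineConfiguration(
            image_reference=batchmodels.ImageReference(
                publisher="microsoft-azure-batch",
                offer="ubuntu-server-container",
                sku="16-04-lts",
                version="latest",
            ),
            node_agent_sku_id="batch.node.ubuntu 16.04",
            container_configuration=batchmodels.ContainerConfiguration(
                type="dockerCompatible",  # required by recent SDK versions
                container_image_names=["myregistry.azurecr.io/myapp:latest"],
            ),
        ),
    )
    batch_client.pool.add(pool)
    ```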

    But when I trigger my Batch job from Azure Data Factory, I get this error:

    Task failed "Container-enabled compute node requires task container settings"

    I know we need to pass parameters such as the image name in the task's container settings, like the following:
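
    (A minimal sketch of what I mean, again assuming the azure-batch Python SDK; the image name, command line, and job id are placeholders:)

    ```python
    import azure.batch.models as batchmodels
    from azure.batch import BatchServiceClient
    from azure.batch.batch_auth import SharedKeyCredentials

    batch_client = BatchServiceClient(
        SharedKeyCredentials("mybatchaccount", "<batch-account-key>"),
        batch_url="https://mybatchaccount.<region>.batch.azure.com",
    )

    # Without container_settings, a task scheduled to a container-enabled node
    # fails with "Container-enabled compute node requires task container settings".
    task = batchmodels.TaskAddParameter(
        id="mydockertask",
        command_line="python /app/main.py",  # placeholder command run inside the container
        container_settings=batchmodels.TaskContainerSettings(
            image_name="myregistry.azurecr.io/myapp:latest",
            container_run_options="--rm",
        ),
    )
    batch_client.task.add(job_id="mydockerjob", task=task)
    ```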

    But how do I pass such settings from ADF?

    or

    How do I configure the ADF linked Batch service so that I can trigger a Docker application on Batch?

    Wednesday, November 13, 2019 7:36 PM

All replies

  • We are reaching out internally to get help on this. We will update once we hear back from the internal team.

    Thanks Himanshu

    Thursday, November 14, 2019 6:10 PM
  • Hello,

    Just wanted to let you know that passing Docker container settings from Data Factory is not supported at this time.

    If you want this feature, please add the details at https://feedback.azure.com/forums/270578-data-factory


    Thanks Himanshu

    Wednesday, November 20, 2019 12:51 AM