How to use a single parameter file for multiple ADF jobs?

  • Question

  • Is there a way to use a single parameter file for multiple ADF jobs? I could not find any sample structure of a parameter file that contains parameters for more than one pipeline.

    Also, for a single ADF job, if I have more than one Databricks notebook, how do I create parameters that are used in both notebooks with different values?

    Friday, August 31, 2018 6:59 PM

All replies

  • Hi,

    For your second question: if your different notebooks just require different parameter values, I would suggest you use a ForEach activity that contains the Notebook activity. Make your values an array and iterate over it to pass each value to the notebook.
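    A minimal sketch of that pattern as pipeline JSON (the pipeline, notebook, and linked-service names here are illustrative, not from an actual deployment):

    ```json
    {
        "name": "RunNotebooksPipeline",
        "properties": {
            "parameters": {
                "notebookParams": {
                    "type": "array",
                    "defaultValue": [ "valuex", "valuey", "valuez" ]
                }
            },
            "activities": [
                {
                    "name": "ForEachParam",
                    "type": "ForEach",
                    "typeProperties": {
                        "items": {
                            "value": "@pipeline().parameters.notebookParams",
                            "type": "Expression"
                        },
                        "activities": [
                            {
                                "name": "RunNotebook",
                                "type": "DatabricksNotebook",
                                "typeProperties": {
                                    "notebookPath": "/Shared/my-notebook",
                                    "baseParameters": { "paramname1": "@{item()}" }
                                },
                                "linkedServiceName": {
                                    "referenceName": "AzureDatabricksLS",
                                    "type": "LinkedServiceReference"
                                }
                            }
                        ]
                    }
                }
            ]
        }
    }
    ```

    Each iteration runs the same notebook once with the current array element bound to the base parameter.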

    Saturday, September 1, 2018 10:14 AM
  • Thank you for the reply. I am not sure how to achieve this with an array. The reason is that, yes, there are common parameters between notebooks, but there are also parameters that are specific to certain notebooks.

    Here is how I got this working:

    1) Created an on-prem table that contained the ADF jobs and the parameter names with values.

    ADF JOB1 - paramname1 - valuex

    ADF JOB2 - paramname1 - valuey

    ADF JOB2 - paramname2 - valuez

    2) Created a small Python script that converts this table into a dictionary and generates a JSON file from it.

    3) Grouped the notebooks in an ADF job.

    4) Created base parameters in the ADF Notebook activity and read them in the actual notebook via dbutils.widgets.get.

    5) Called the ADF pipeline from on-prem using PowerShell, passing the JSON file created in step 2.
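    Step 2 above can be sketched roughly like this (the table rows are hard-coded here for illustration; in practice they would come from a query against the on-prem table):

    ```python
    import json

    # Rows as exported from the on-prem table: (ADF job, parameter name, value)
    rows = [
        ("ADF JOB1", "paramname1", "valuex"),
        ("ADF JOB2", "paramname1", "valuey"),
        ("ADF JOB2", "paramname2", "valuez"),
    ]

    # Group into one dictionary per ADF job: {job: {param name: value}}
    params = {}
    for job, name, value in rows:
        params.setdefault(job, {})[name] = value

    # Write one JSON parameter file per job, ready to pass to a pipeline run
    for job, job_params in params.items():
        filename = job.lower().replace(" ", "_") + ".json"
        with open(filename, "w") as f:
            json.dump(job_params, f, indent=2)
    ```

    The resulting per-job file can then be handed to the pipeline run from PowerShell, e.g. via the -ParameterFile switch of the data factory cmdlet (Invoke-AzureRmDataFactoryV2Pipeline in the module versions current at the time of this thread).
    
    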


    Monday, September 17, 2018 2:52 PM