Running Spark Jobs in sequence using Azure Data Factory V1

  • Question

  • Suppose there are three Spark jobs: S1, S2, and S3.

    We need to run these jobs in the following way:

    First, job S1 is run; when it completes successfully, Spark job S2 should be invoked.

    On the other hand, if job S1 fails, then job S3 should be invoked.

    How can this be done using Azure Data Factory V1?

    Wednesday, October 24, 2018 1:10 PM

All replies

  • Hello,

    I don't think V1 will support the granular control you are seeking. Are you integrating these activities into an existing V1 pipeline? Is there any other reason you cannot use V2?
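
    For context, V2 pipelines express this kind of branching directly through activity dependency conditions: each activity can declare a dependsOn entry whose dependencyConditions include values such as Succeeded or Failed. Below is a minimal sketch of a V2 pipeline definition, assuming HDInsight-backed Spark activities; the linked service name (MyHDInsightLinkedService), the storage rootPath, and the entry file paths are all illustrative placeholders.

    ```json
    {
      "name": "SparkBranchingPipeline",
      "properties": {
        "activities": [
          {
            "name": "S1",
            "type": "HDInsightSpark",
            "linkedServiceName": {
              "referenceName": "MyHDInsightLinkedService",
              "type": "LinkedServiceReference"
            },
            "typeProperties": {
              "rootPath": "adfspark",
              "entryFilePath": "s1.py"
            }
          },
          {
            "name": "S2",
            "type": "HDInsightSpark",
            "dependsOn": [
              { "activity": "S1", "dependencyConditions": [ "Succeeded" ] }
            ],
            "linkedServiceName": {
              "referenceName": "MyHDInsightLinkedService",
              "type": "LinkedServiceReference"
            },
            "typeProperties": {
              "rootPath": "adfspark",
              "entryFilePath": "s2.py"
            }
          },
          {
            "name": "S3",
            "type": "HDInsightSpark",
            "dependsOn": [
              { "activity": "S1", "dependencyConditions": [ "Failed" ] }
            ],
            "linkedServiceName": {
              "referenceName": "MyHDInsightLinkedService",
              "type": "LinkedServiceReference"
            },
            "typeProperties": {
              "rootPath": "adfspark",
              "entryFilePath": "s3.py"
            }
          }
        ]
      }
    }
    ```

    In this sketch, S2 runs only when S1 finishes with Succeeded, and S3 runs only when S1 finishes with Failed. (V2 also accepts Skipped and Completed as dependency conditions.)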

    Wednesday, October 24, 2018 4:24 PM