Data Factory Transactions and synchronization

  • Question

  • Hello all, 

We have started using Azure Data Factory recently and created pipelines to do a variety of things, such as calling sprocs and moving data between two tables in two different databases. We are at a point where we have set up a pipeline with multiple activities (5), and we want to make sure that if any of them fails, the effects of the others are rolled back, i.e. the pipeline behaves like a transaction. Is this something we can do with this technology?

In addition, we noticed that activities are not executed in order but rather whichever comes first. Is this correct? We have set up an execution policy of NewestFirst. Did we maybe misunderstand what this entails?



    Tuesday, February 7, 2017 7:00 PM


All replies

  • For clarification: are you asking whether it's possible to roll back any completed activities in a pipeline should any of the others currently executing in the pipeline fail?
    Tuesday, February 7, 2017 9:33 PM
  • Hi Martinos,

    There is no transaction-like functionality in ADF. You can chain your activities and use dependencies between them; if one activity fails, all dependent activities fail too. However, you can't roll back the actions of an activity that finished successfully before the failed one. You can read more about chaining activities here: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-create-pipelines#chaining-activities

    The NewestFirst setting means that your newest data slices will be processed first; it has no effect on the execution order of activities within a pipeline.
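    For illustration, in the current JSON authoring model a dependency is expressed with `dependsOn`. A minimal sketch, using two hypothetical Copy activities (`CopyStaging` and `CopyFinal` are made-up names; source/sink details omitted):

    ```json
    {
      "name": "ChainedPipeline",
      "properties": {
        "activities": [
          {
            "name": "CopyStaging",
            "type": "Copy"
          },
          {
            "name": "CopyFinal",
            "type": "Copy",
            "dependsOn": [
              {
                "activity": "CopyStaging",
                "dependencyConditions": [ "Succeeded" ]
              }
            ]
          }
        ]
      }
    }
    ```

    If `CopyStaging` fails, `CopyFinal` is skipped, but note that nothing `CopyStaging` already wrote is undone.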

    Sergiy Lunyakin (Data Platform MVP)

    Tuesday, February 7, 2017 10:47 PM
  • I am facing the same situation: I am loading data using a ForEach and a Copy Data activity, with a JSON parameter to dynamically load 12 files from 12 procedures. I need the implementation to work in such a way that if any of the files fails, all the rest are rolled back. I am not sure why ADF is missing such an important feature. Please suggest.
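    One common workaround (not a built-in feature) is to have the ForEach copy every file into staging tables, then run a final stored procedure to commit only when all iterations succeed, and a cleanup procedure when any fails. A sketch, assuming hypothetical stored procedures `usp_CommitStaging` and `usp_ClearStaging` and a `fileList` pipeline parameter (connection and dataset details omitted):

    ```json
    {
      "name": "LoadWithCommit",
      "properties": {
        "activities": [
          {
            "name": "LoadAllFiles",
            "type": "ForEach",
            "typeProperties": {
              "items": { "value": "@pipeline().parameters.fileList", "type": "Expression" },
              "activities": [
                { "name": "CopyToStaging", "type": "Copy" }
              ]
            }
          },
          {
            "name": "CommitStaging",
            "type": "SqlServerStoredProcedure",
            "dependsOn": [
              { "activity": "LoadAllFiles", "dependencyConditions": [ "Succeeded" ] }
            ],
            "typeProperties": { "storedProcedureName": "usp_CommitStaging" }
          },
          {
            "name": "ClearStaging",
            "type": "SqlServerStoredProcedure",
            "dependsOn": [
              { "activity": "LoadAllFiles", "dependencyConditions": [ "Failed" ] }
            ],
            "typeProperties": { "storedProcedureName": "usp_ClearStaging" }
          }
        ]
      }
    }
    ```

    The "commit" here is whatever your sproc does (e.g. insert from staging into the target tables inside a SQL transaction), so atomicity lives in the database rather than in ADF.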


    Wednesday, June 3, 2020 1:19 PM