Is it possible to re-run a slice as soon as a new file is available on storage

  • Question

  • Hi

I have successfully created and run my pipelines with good results.

    But my problem is that I need to rerun my slices whenever I have new data or a different parsing algorithm. So my question is: is it possible to run the custom .NET activity each time a new file is saved to Azure Blob storage?

    I know I can define slices and so on, but whenever the last slice window is older than today, the pipeline stops processing. I need it to keep checking my blob without a specific end time.

    In other words, is it possible to leave the end time in the pipeline definition empty so that the pipeline never ends:

    "end": "2015-05-22T00:00:00Z",

    Tuesday, June 23, 2015 1:09 PM


  • Hi Gokhan,

    Thanks for using ADF.

    Can the new blob files that you want to process follow a date-based naming convention? If so, you could define your blob file names such that each represents an individual slice (in the future) and have a validation policy that checks the existence of the file. You can choose a frequency that matches the frequency of your file updates.
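
    As a sketch of that idea (dataset, container, and linked-service names here are placeholders, not from your pipeline), an ADF v1 input dataset can use `partitionedBy` to derive a date-based path per slice, be marked `external` because the files are produced outside ADF, and carry an `externalData` policy that retries until the slice's file appears:

    ```json
    {
      "name": "NewFilesInput",
      "properties": {
        "type": "AzureBlob",
        "linkedServiceName": "StorageLinkedService",
        "typeProperties": {
          "folderPath": "mycontainer/{Year}/{Month}/{Day}",
          "fileName": "data-{Hour}.csv",
          "format": { "type": "TextFormat" },
          "partitionedBy": [
            { "name": "Year",  "value": { "type": "DateTime", "date": "SliceStart", "format": "yyyy" } },
            { "name": "Month", "value": { "type": "DateTime", "date": "SliceStart", "format": "MM" } },
            { "name": "Day",   "value": { "type": "DateTime", "date": "SliceStart", "format": "dd" } },
            { "name": "Hour",  "value": { "type": "DateTime", "date": "SliceStart", "format": "HH" } }
          ]
        },
        "external": true,
        "availability": { "frequency": "Hour", "interval": 1 },
        "policy": {
          "externalData": {
            "retryInterval": "00:01:00",
            "retryTimeout": "01:00:00",
            "maximumRetry": 3
          }
        }
      }
    }
    ```

    With an hourly availability like this, each hour's slice waits (and retries per the policy) until a file matching that hour's name exists, then runs the activity against it.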

    If you set an active period with an end time in the future, then all these new files will be processed by your custom activity.
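
    The end time cannot be left empty, but you can set it far enough in the future that the pipeline effectively never stops; for example (dates here are illustrative):

    ```json
    "start": "2015-06-29T00:00:00Z",
    "end": "2099-12-31T00:00:00Z"
    ```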

    Here are some pointers to

    I hope this helps.

    Please let me know if you have additional questions / this approach is not feasible for you.

    Monday, June 29, 2015 4:35 PM