Azure Data Factory Limitations

  • Question



    We use SQL DW as our Reporting Database. We load data to SQL DW from HDFS files using ADF pipelines.

    1. Does an ADF pipeline have the capability to read a metadata table and accordingly set up datasets, source file paths, and target tables? If yes, do pipelines support multiple levels of orchestration? We have heard that only the copy and stored procedure activities are supported.

    2. Can we have a stored procedure activity and a copy activity together in one pipeline?

    Please clarify.


    Peer Md.

    Wednesday, November 2, 2016 2:46 PM

All replies

  • Hi Mohamed, 

    Thanks for your interest in ADF. I'll be glad to help clarify your questions.

    #1. Do you mean whether ADF supports reading from a manifest file and deploying the datasets accordingly? What would the metadata table look like — do you mean an HCatalog table? Could you also elaborate on what "multiple levels of orchestration" means? In terms of activity types, ADF actually supports many types besides copy and stored procedure, for example Hive, Pig, U-SQL, etc.

    #2. Yes. A typical scenario is to have one copy activity load data from a file system (or HDFS, in your case) into Azure SQL Database or SQL DW, and then a second stored procedure activity in the same pipeline to do some processing against the loaded data. Feel free to let us know if you run into any trouble doing this.
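    As a rough illustration, a pipeline definition for that scenario could look something like the sketch below (ADF v1-style JSON, current as of this thread). The dataset names (`HdfsInputDataset`, `SqlDwStagingDataset`, `SqlDwProcessedDataset`) and the stored procedure name `usp_ProcessStagedData` are placeholders you would replace with your own; chaining the copy activity's output dataset as the stored procedure activity's input is what makes the stored procedure run after the copy completes:

    ```json
    {
        "name": "LoadHdfsToSqlDwPipeline",
        "properties": {
            "activities": [
                {
                    "name": "CopyFromHdfsToSqlDw",
                    "type": "Copy",
                    "inputs": [ { "name": "HdfsInputDataset" } ],
                    "outputs": [ { "name": "SqlDwStagingDataset" } ],
                    "typeProperties": {
                        "source": { "type": "HdfsSource" },
                        "sink": { "type": "SqlDWSink" }
                    }
                },
                {
                    "name": "PostProcessLoadedData",
                    "type": "SqlServerStoredProcedure",
                    "inputs": [ { "name": "SqlDwStagingDataset" } ],
                    "outputs": [ { "name": "SqlDwProcessedDataset" } ],
                    "typeProperties": {
                        "storedProcedureName": "usp_ProcessStagedData"
                    }
                }
            ],
            "start": "2016-11-01T00:00:00Z",
            "end": "2016-12-01T00:00:00Z"
        }
    }
    ```

    Again, this is just a sketch of the shape of the pipeline, not a drop-in definition — linked services and dataset definitions are omitted.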


    Thursday, November 3, 2016 3:40 PM