Incremental data upload from on-premises by ADF

  • Question

  • Hi Experts,

    We are new to the Azure world. We have some queries below and need expert advice.

    We have 6-7 on-premises sources (Oracle ERP, custom applications, and file sources). We need to build an EDW (Data Vault) on Azure SQL DW
    and use ADF for the data movement services.


    1. What is the best approach to move the on-premises data: stage it first in Azure Blob storage and then load it into Azure SQL DW, or copy it directly into staging tables in Azure SQL DW? I know Microsoft recommends staging it in Azure Blob first, but we want to understand the pros and cons of both approaches.

    2. How can we move daily incremental data with ADF to Azure Blob storage / Azure SQL DW?

    3. How do we remove the previous incremental data files from Azure Blob storage (in the case of incremental loads to blob), like a truncate-and-load in traditional systems?

    4. How do we create master-child relationships between ADF pipelines, like we do in other ETL tools?

    5. How do we handle bad data (rejected rows) in ADF? In other ETL tools we redirect the bad rows to separate tables.
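    For question 2, one common pattern is watermark-based incremental extraction: store the last-loaded modification timestamp and copy only rows newer than it on each run. A minimal pure-Python sketch of that filter (the table shape and the `last_modified` column name are hypothetical; in ADF the filter would become the copy activity's source query):

```python
from datetime import datetime

def incremental_rows(rows, watermark):
    """Return rows modified after the previous watermark, plus the
    new watermark (the max modified date among the copied rows)."""
    delta = [r for r in rows if r["last_modified"] > watermark]
    new_watermark = max((r["last_modified"] for r in delta), default=watermark)
    return delta, new_watermark

# Simulated source table; a real run would issue something like
# SELECT * FROM src WHERE last_modified > <stored watermark>
rows = [
    {"id": 1, "last_modified": datetime(2017, 8, 1)},
    {"id": 2, "last_modified": datetime(2017, 8, 8)},
]
delta, wm = incremental_rows(rows, datetime(2017, 8, 5))
```

    The stored watermark is then advanced to `wm` after a successful load, so the next run picks up only rows changed since then.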



    Amit Tomar



    Wednesday, August 9, 2017 8:37 AM

All replies

  • Hi Amit,

    To connect to the on-premises sources (Oracle ERP, custom applications and file sources), you need to install the Azure Data Management Gateway. Note that custom activities cannot run through the gateway. So first create the input dataset, for example on a file share: create the folder and put the files there, and ADF will copy the data to ADLS. You then need a script to delete the processed files from the folder, so that when new data arrives, a new file can be copied into it.

    Also, once the data has been copied to ADLS, you can use custom ADF activities on it.
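    The delete step above can be a small scheduled script: list the blobs in the staging container and remove the ones from earlier loads (with the Azure Storage SDK, that would be a list followed by a `delete_blob` call per stale name). A pure-Python sketch of just the selection logic; the `delta_YYYYMMDD.csv` naming convention is an assumption for illustration:

```python
def blobs_to_delete(blob_names, current_load_date):
    """Select incremental files from earlier loads for deletion,
    keeping only the current load's file. Assumes files are named
    'delta_YYYYMMDD.csv' (a hypothetical convention)."""
    keep = "delta_%s.csv" % current_load_date
    return [b for b in blob_names if b.startswith("delta_") and b != keep]

# Reference files (not matching the delta_ prefix) are left untouched.
blobs = ["delta_20170820.csv", "delta_20170821.csv", "ref_data.csv"]
stale = blobs_to_delete(blobs, "20170821")
```

    Each name in `stale` would then be passed to the storage SDK's delete call before the next load writes its file.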

    Monday, August 21, 2017 9:32 AM