ADF custom activity to push data from Data Lake to an on-premises database using a gateway

  • Question

  • I have a custom C# script for reading Data Lake files. Can anyone please let me know how to connect to an on-premises database using a linked service and insert data into that database from a pipeline with a custom .NET activity?

    1. I've set up an on-premises gateway and defined the input and output linked services and datasets as follows. I need to read data from Data Lake and insert it into an on-premises Oracle database table (using OnPremisesOracleLinkedService) via an ADF custom activity; a sketch of the pipeline I am attempting follows the dataset definitions below.

    i. Output linked service

    { "name": "OnPremisesOracleLinkedService", "properties": { "description": "", "hubName": "*****hub", "type": "OnPremisesOracle", "typeProperties": { "connectionString": "host=;port=;sid=;user id=*;password=**********", "gatewayName": "testDatagateway", "userName": "****", "password": "**********" } } }

    ii. Output dataset (on-premises Oracle table)

    { "name": "OracleTable", "properties": { "published": false, "type": "OracleTable", "linkedServiceName": "OnpremisesOracleLinkedService", "typeProperties": { "tableName": "Testdata" }, "availability": { "frequency": "Day", "interval": 1 } } }

    2. i. Input linked service

    { "name": "Test-DL", "properties": { "hubName": "*******hub", "type": "AzureDataLakeStore", "typeProperties": { "dataLakeStoreUri": "https://*****.azuredatalakestore.net/webhdfs/v1", "accountName": "a*******", "authorization": "**********", "sessionId": "**********", "subscriptionId": "b*********************", "resourceGroupName": "testfeed" } } }

    ii. Data Lake Store input dataset

    { "name": "TestDL-In", "properties": { "published": false, "type": "AzureDataLakeStore", "linkedServiceName": "Test-DL", "typeProperties": { "folderPath": "/040********/" }, "availability": { "frequency": "Hour", "interval": 1 } } }

    Monday, April 3, 2017 6:44 PM

All replies

  • It is not supported to use a Data Management Gateway from a custom activity to access on-premises data sources. Currently, the gateway supports only the copy activity and the stored procedure activity in Data Factory. So I suggest staging your data from ADLS to Azure Storage using a custom activity, and then copying it from Azure Storage to the Oracle table using a copy activity, as in the sketch below.
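    For example, a staged pipeline along those lines might look like the following; the StagedBlob dataset, the Azure Storage and Azure Batch linked services, and the custom activity package details are placeholders to substitute with your own:

    {
        "name": "StageAndCopyPipeline",
        "properties": {
            "description": "Stage ADLS data to blob storage with a custom activity, then copy to on-premises Oracle",
            "activities": [
                {
                    "name": "StageADLSToBlob",
                    "type": "DotNetActivity",
                    "inputs": [ { "name": "TestDL-In" } ],
                    "outputs": [ { "name": "StagedBlob" } ],
                    "linkedServiceName": "AzureBatchLinkedService",
                    "typeProperties": {
                        "assemblyName": "MyDotNetActivity.dll",
                        "entryPoint": "MyDotNetActivityNS.MyDotNetActivity",
                        "packageLinkedService": "AzureStorageLinkedService",
                        "packageFile": "customactivitycontainer/MyDotNetActivity.zip"
                    },
                    "scheduler": { "frequency": "Day", "interval": 1 }
                },
                {
                    "name": "CopyBlobToOracle",
                    "type": "Copy",
                    "inputs": [ { "name": "StagedBlob" } ],
                    "outputs": [ { "name": "OracleTable" } ],
                    "typeProperties": {
                        "source": { "type": "BlobSource" },
                        "sink": { "type": "OracleSink", "writeBatchSize": 10000, "writeBatchTimeout": "00:05:00" }
                    },
                    "scheduler": { "frequency": "Day", "interval": 1 }
                }
            ],
            "start": "2017-04-01T00:00:00Z",
            "end": "2017-04-30T00:00:00Z"
        }
    }

    Because StagedBlob is the output of the first activity and the input of the second, Data Factory chains the two automatically, and the copy activity can then go through the gateway to reach the on-premises Oracle table.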
    Tuesday, April 11, 2017 2:53 AM
  • You should be able to copy directly from Data Lake Store to the on-premises Oracle table with Data Factory, as long as the schemas of the input file and output table are similar. Use the Copy Wizard in Data Factory to generate the pipeline; a sketch of the result is below.
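    The generated pipeline would look roughly like this, assuming your existing TestDL-In and OracleTable datasets (the active period and batch settings are illustrative):

    {
        "name": "CopyADLSToOraclePipeline",
        "properties": {
            "description": "Direct copy from Data Lake Store to on-premises Oracle through the gateway",
            "activities": [
                {
                    "name": "CopyFromADLSToOracle",
                    "type": "Copy",
                    "inputs": [ { "name": "TestDL-In" } ],
                    "outputs": [ { "name": "OracleTable" } ],
                    "typeProperties": {
                        "source": { "type": "AzureDataLakeStoreSource" },
                        "sink": { "type": "OracleSink", "writeBatchSize": 10000, "writeBatchTimeout": "00:05:00" }
                    },
                    "policy": { "concurrency": 1, "retry": 3 },
                    "scheduler": { "frequency": "Day", "interval": 1 }
                }
            ],
            "start": "2017-04-01T00:00:00Z",
            "end": "2017-04-30T00:00:00Z"
        }
    }

    The copy activity is one of the two activity types the gateway supports, so this route reaches the on-premises table without a staging step.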
    Tuesday, April 11, 2017 8:27 AM