locked
Questions/trouble with Data Management Gateway

  • Question

  • Hi All,

    We have some challenges using Data Management Gateway when moving data from on-premises to SQL Azure using Data Factory. Our system has a database of around 600 tables, and we want to move most of this database to the cloud using Data Factory. Therefore we want to use Data Management Gateway together with Data Factory. This means that we would need to build a lot of pipelines and activities to move this data on a schedule. Are Data Factory and Data Management Gateway the right way to go for this, or are there better/other alternatives?

    We also have a D(ev)T(est)A(cceptance)P(roduction) environment, but the Azure subscriptions are separated in such a way that D and T exist in one subscription (we separate by resource group). This means that when we connect Data Management Gateway, we can only connect to one data factory, as we can enter only one key in the gateway. If we want to use the gateway in this environment, we have to install it twice. This should not really be necessary, since it is possible to send the connection string through the gateway to on-premises, but we have several different data factories with different keys. Is there a solution that lets us use multiple keys in one gateway installation? Or is it possible to install multiple gateways on one virtual machine?

    • Edited by PSCNed Monday, November 28, 2016 2:27 PM
    Monday, November 28, 2016 9:16 AM

Answers

  • There's no way to create one activity that copies 600 tables in Data Factory so far. However, by using the Copy Wizard, you can list all tables and select some of them (at most 100 at a time), and the Copy Wizard will create one pipeline with up to 100 activities. So you can cover all tables by repeating this several times.
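    If running the Copy Wizard six times feels tedious, another option is to generate the pipeline JSON yourself and deploy it (e.g. with PowerShell or the .NET SDK). The sketch below is only illustrative: the table list, dataset names, and pipeline names are placeholders, and the JSON follows the general shape of a Data Factory (v1) copy activity, so check it against your own wizard-generated pipelines before relying on it. It chunks 600 tables into batches of 100 and emits one pipeline definition per batch:

    ```python
    import json

    def chunk(items, size):
        """Split a list into consecutive chunks of at most `size` items."""
        return [items[i:i + size] for i in range(0, len(items), size)]

    def build_pipeline(name, tables):
        """Build a minimal pipeline dict with one copy activity per table.
        Dataset names here are hypothetical; use your real dataset names."""
        activities = []
        for table in tables:
            activities.append({
                "name": f"Copy_{table}",
                "type": "Copy",
                "inputs": [{"name": f"OnPremDataset_{table}"}],
                "outputs": [{"name": f"AzureSqlDataset_{table}"}],
                "typeProperties": {
                    "source": {"type": "SqlSource"},
                    "sink": {"type": "SqlSink"}
                }
            })
        return {"name": name, "properties": {"activities": activities}}

    # Stand-in for your 600 real table names.
    tables = [f"Table{i:03d}" for i in range(600)]

    # 600 tables / 100 activities per pipeline -> 6 pipeline definitions.
    pipelines = [build_pipeline(f"CopyPipeline_{n}", batch)
                 for n, batch in enumerate(chunk(tables, 100))]

    # Each dict can be serialized and deployed as a pipeline JSON file.
    first_json = json.dumps(pipelines[0], indent=2)
    ```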

    For the first data population, there are more things to consider. Data Factory will not create a table if it doesn't exist in the target database, so you'd better export all table definitions and import them into SQL Azure first. If there are foreign keys, you need to order the activities so that the dependencies are respected.
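    One way to get a safe load order is a topological sort over the foreign-key graph, so that each parent table is loaded before any table that references it. A minimal Python sketch, using made-up table names (in practice you could pull the real FK pairs from `sys.foreign_keys` in SQL Server):

    ```python
    from graphlib import TopologicalSorter  # Python 3.9+

    # Hypothetical FK map: each table -> the tables it references (its parents).
    fk_parents = {
        "OrderLines": {"Orders", "Products"},
        "Orders":     {"Customers"},
        "Products":   set(),
        "Customers":  set(),
    }

    # static_order() yields predecessors first, i.e. parent tables before the
    # tables that reference them -- a load order with no FK violations.
    load_order = list(TopologicalSorter(fk_parents).static_order())
    ```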

    If your on-premises data store is SQL Server, you may also consider leveraging SQL Server Replication.

    Regarding gateway sharing: a gateway is tightly bound to a single factory for now. There's no plan to support multiple keys on one gateway, but a gateway that can be shared among data factories is being actively designed, so that customers can share one gateway across different subscriptions/factories.

    Tuesday, December 6, 2016 10:00 AM