How to bulk copy tables from one Azure SQL DB to another Azure SQL DB

  • Question

  • Hi team,

    I tried to bulk copy some tables from one Azure SQL DB to another.

    The destination DB is empty; nothing is there yet.

    I followed the steps with this link:

    https://docs.microsoft.com/en-us/azure/data-factory/tutorial-bulk-copy-portal

    The example provided is slightly different in that the destination is SQL DW rather than SQL DB; the rest is the same.

    During the process it failed and the following error was returned:

    {
        "errorCode": "2200",
        "message": "ErrorCode=UserErrorInvalidColumnName,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Column 'id' does not exist in the table 'dbo.test2', ServerName: 'mstbisql01.database.windows.net', DatabaseName: 'TESTDB'.,Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.InvalidOperationException,Message=Cannot access destination table '[dbo].[test2]'.,Source=System.Data,''Type=System.Data.SqlClient.SqlException,Message=Invalid object name 'dbo.test2'.,Source=.Net SqlClient Data Provider,SqlErrorNumber=208,Class=16,ErrorCode=-2146232060,State=1,Errors=[{Class=16,Number=208,State=1,Message=Invalid object name 'dbo.test2'.,},],'",
        "failureType": "UserError",
        "target": "CopyData"
    }

    However, if I change the destination to Azure Blob it works: separate files are created and the data is copied, which means the dynamic settings configured in the steps are correct.

    My guess would be that the destination database does not have the metadata (matching the source tables), so the process failed.

    Can the data be copied to the database without specifying the metadata?

    If that is not the case, how can I achieve this?

    Thanks so much for this.

    Kind regards,

    Tom Sun

    Thursday, May 2, 2019 11:11 AM

All replies

  • Hi,

    Is there a master database on the other SQL server?

    Thursday, May 2, 2019 11:39 AM
  • Hi Tom,

    Can you please go through the similar thread below? It may help you.

    https://stackoverflow.com/questions/6572754/sql-azure-copy-table-between-databases


    Please click "Mark as Answer" if it solved your issue and/or "Vote as helpful" if it helped. This can be beneficial to other community members reading this thread.

    Thursday, May 2, 2019 11:49 AM
  • Yes, it does have a master DB for the destination database.
    Thursday, May 2, 2019 11:50 AM
  • Sorry, I gave the wrong information the first time. It does have a master DB on the destination server.
    Thursday, May 2, 2019 11:55 AM
  • The reason I am using ADF for this job is that the number of tables might change, so I want a dynamic way to do it. Getting the script and pasting it into SSMS to run is therefore not a solution for this case.
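
    For context, the dynamic pattern in that tutorial is a Lookup activity that returns the table list, feeding a ForEach that runs the copy once per table. A rough sketch of the ForEach (the activity names here are my own placeholders, not from the tutorial):

    ```json
    {
      "name": "IterateTables",
      "type": "ForEach",
      "dependsOn": [
        { "activity": "LookupTableList", "dependencyConditions": [ "Succeeded" ] }
      ],
      "typeProperties": {
        "items": {
          "value": "@activity('LookupTableList').output.value",
          "type": "Expression"
        },
        "activities": [
          { "name": "CopyData", "type": "Copy" }
        ]
      }
    }
    ```

    The Copy activity inside the loop picks up its table name from @item(), which is what makes the table list dynamic.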
    Thursday, May 2, 2019 11:58 AM
  • Hi Tom,

    The problem is that the column names are not the same in the source and sink tables. In the copy activity, you need to do the following:

    • Add the source and sink (verify that the dataset connections for both are working by using the "test connection" button).
    • Go to the mappings and click "import schemas"; this will import both the source and sink schemas.
    • You can use the mappings to configure column mappings between the two tables.
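
    For a single table, the imported mapping ends up in the copy activity's translator section, roughly like this (the column names below are only an example):

    ```json
    "translator": {
      "type": "TabularTranslator",
      "mappings": [
        { "source": { "name": "id" },   "sink": { "name": "id" } },
        { "source": { "name": "name" }, "sink": { "name": "name" } }
      ]
    }
    ```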

    Hope this helps.

    Monday, May 6, 2019 10:05 AM
  • Hi,

    Thanks for the answer; however, I do not think that is the cause of the issue.

    Because several tables need to be copied and each table has different metadata, I do not think the mapping activity is going to work.

    Because the sink dataset has a parameter that specifies the table name inside the ForEach activity, clicking the import schema button always returns an error: the value of the connection string cannot be determined until the loop is running.
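
    For reference, my sink dataset is parameterised roughly like this (the names here are illustrative), which is why "import schemas" cannot resolve a concrete table at design time:

    ```json
    {
      "name": "SinkDataset",
      "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
          "referenceName": "DestinationAzureSqlDb",
          "type": "LinkedServiceReference"
        },
        "parameters": { "TableName": { "type": "string" } },
        "typeProperties": {
          "tableName": {
            "value": "@dataset().TableName",
            "type": "Expression"
          }
        }
      }
    }
    ```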

    Kind regards,

    Tom Sun

    Monday, May 6, 2019 11:24 AM
  • Hi Tom,

    Sorry, I missed the multiple tables part. In that case, you can use a custom activity to achieve your desired result. To read more about custom activities, please refer to this doc.

    Monday, May 6, 2019 11:45 AM
  • Thanks for the answer.

    I just gave this another try.

    If I pre-define the metadata in the destination database for each of the tables, the pipeline works and the data can be copied.

    So this confirmed that the issue actually comes from the table metadata.

    Is there a way to resolve the dynamic column mapping on the fly?
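
    For illustration, the kind of metadata I pre-defined could presumably also be pushed through the sink's pre-copy script property, something like the following (the column list below is only an example, not my real schema):

    ```json
    "sink": {
      "type": "SqlSink",
      "preCopyScript": "IF OBJECT_ID('dbo.test2') IS NULL CREATE TABLE dbo.test2 (id INT, name NVARCHAR(100))"
    }
    ```

    But that still requires knowing each table's DDL up front, which is what I am trying to avoid.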

    Kind regards,

    Tom Sun

    Monday, May 6, 2019 11:56 AM
  • I have read the documents listed and it seems the following is the JSON to resolve the issue:

    "activities": [
        {
            "name": "CopyActivityTemplate",
            "type": "Copy",
            "inputs": [
                {
                    "referenceName": "<source dataset name>",
                    "type": "DatasetReference"
                }
            ],
            "outputs": [
                {
                    "referenceName": "<sink dataset name>",
                    "type": "DatasetReference"
                }
            ],
            "typeProperties": {
                "source": {
                    "type": "<source type>",
                    <properties>
                },
                "sink": {
                    "type": "<sink type>",
                    <properties>
                },
                "translator": {
                    "type": "TabularTranslator",
                    "columnMappings": "<column mapping>"
                },
                "dataIntegrationUnits": <number>,
                "parallelCopies": <number>,
                "enableStaging": true/false,
                "stagingSettings": {
                    <properties>
                },
                "enableSkipIncompatibleRow": true/false,
                "redirectIncompatibleRowSettings": {
                    <properties>
                }
            }
        }
    ]

    The next issue is how to specify the column mapping in this JSON. I am not familiar with this, so could you please help with it?

    Monday, May 6, 2019 11:58 AM
  • Hi Tom,

    Please refer to this thread to get insights into dynamic mapping and see if it helps.
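
    The general idea there is to pass the mapping in as a parameter and feed it to the translator as an expression, a sketch of which is below (the "mapping" property on the item is illustrative, and the exact translator schema may differ by ADF version):

    ```json
    "translator": {
      "value": "@json(item().mapping)",
      "type": "Expression"
    }
    ```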

    Tuesday, May 7, 2019 7:14 AM
  • Hi Chirag,

    Sorry for the late reply.

    I tried the trick in that post, but it only works when the metadata for the tables is defined in the destination database.

    So if you want to copy the data and create the table on the fly, it still fails.

    So the question becomes: is it mandatory to define the table metadata first when using the copy activity?

    Could you please provide more advice on that?

    Sunday, May 12, 2019 12:46 PM
  • Hi Tom,

    Sorry for the late reply. For a deeper investigation and immediate assistance on this issue, if you have a support plan, please file a support ticket; otherwise, please send an email to AzCommunity@Microsoft.com with the details below, and we will create a one-time free support ticket for you on the issue.

    Thread URL: https://social.msdn.microsoft.com/Forums/en-US/cd46f87a-a70c-4192-a4b0-297e18844089/how-to-bulk-copy-tables-from-one-azure-sql-db-to-another-azure-sql-db?forum=AzureDataFactory
    Subscription ID: <your subscription id>

    Thanks.

    Monday, May 20, 2019 11:07 AM