Copying multiple tables from SQL Server to Blob Storage

  • Question

  • Hi,

    I've been able to successfully transfer a single table from a SQL Server database to blob storage.

    I would now like to transfer all the tables in a database from SQL to blob storage. While defining my dataset for this, I noticed that there wasn't a way to define all the tables in the same JSON file.

    Does this mean that I have to create a new source file for each table?

    Thanks, 

    Tuesday, November 28, 2017 12:35 AM

Answers

  • In ADF V2 you can use the combination of a ForEach activity and parameterization to loop through all tables in the database and copy them either sequentially or in parallel (up to 20 concurrent executions). You do not need to define a dataset plus copy activity for each table. Please take a look at this tutorial with a step-by-step explanation of how to achieve this: https://docs.microsoft.com/en-us/azure/data-factory/tutorial-bulk-copy
    • Marked as answer by Cheese Pudding Wednesday, November 29, 2017 6:44 AM
    Tuesday, November 28, 2017 3:56 AM
    Moderator
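
    The tutorial's bulk-copy pattern takes the table list as a pipeline parameter and iterates over it with a ForEach activity. As a rough sketch (the table names here are illustrative, not from the thread), the parameter value passed when the pipeline is triggered would look like:

    ```json
    {
        "tableList": [
            { "name": "Customers" },
            { "name": "Orders" }
        ]
    }
    ```

    Each iteration of the ForEach activity then exposes one entry as `item()`, so an expression such as `@{item().name}` resolves to the current table name.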

All replies

  • Hi, this answered my question. However, the blob names are the run IDs of the copy operation that created them.

    How do I change this so that the blob's name is the table it came from?

    {
        "name": "IterateAndCopySQLTables",
        "properties": {
            "activities": [
                {
                    "name": "IterateSQLTables",
                    "type": "ForEach",
                    "typeProperties": {
                        "isSequential": false,
                        "items": {
                            "value": "@pipeline().parameters.tableList",
                            "type": "Expression"
                        },
                        "activities": [
                            {
                                "name": "CopyData",
                                "description": "Copy data from SQL database to Azure Blob Storage",
                                "type": "Copy",
                                "inputs": [
                                    {
                                        "referenceName": "genericTableDataset",
                                        "type": "DatasetReference"
                                    }
                                ],
                                "outputs": [
                                    {
                                        "referenceName": "blobStorageServiceDataset",
                                        "type": "DatasetReference"
                                    }
                                ],
                                "typeProperties": {
                                    "source": {
                                        "type": "SqlSource",
                                        "sqlReaderQuery": "SELECT * FROM dbo.[@{item().name}]"
                                    },
                                    "sink": {
                                        "type": "BlobSink"
                                    }
                                }
                            }
                        ]
                    }
                }
            ],
            "parameters": {
                "tableList": {
                    "type": "Object"
                }
            }
        }
    }

    Thanks,

    Wednesday, November 29, 2017 6:44 AM
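
When no `fileName` is set on the sink blob dataset, the Copy activity auto-generates one, which is why run-ID-style names appear. One way to control the name (a sketch under assumptions: the linked service name `AzureStorageLinkedService` and folder `sqlexport` are placeholders, not from the thread) is to give the sink dataset a `tableName` parameter and reference it in `fileName`:

```json
{
    "name": "blobStorageServiceDataset",
    "properties": {
        "type": "AzureBlob",
        "linkedServiceName": {
            "referenceName": "AzureStorageLinkedService",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "tableName": { "type": "String" }
        },
        "typeProperties": {
            "folderPath": "sqlexport",
            "fileName": "@{dataset().tableName}.csv",
            "format": { "type": "TextFormat" }
        }
    }
}
```

The Copy activity's output dataset reference would then pass the current table name into the parameter, e.g. `"parameters": { "tableName": "@item().name" }` inside the `outputs` entry, so each blob is named after the table it came from.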