Uploading files to an ADLS Gen2 file system without using SSIS

  • Question

  • Could you recommend the standard approach to upload files from on-premises to Azure Data Lake Gen2 file storage without using SSIS, please?



    Monday, August 19, 2019 9:40 PM


  • Hi San,

    There are several ways to upload files from an on-premises file system to Azure Data Lake Gen2:

    • Azure Storage Explorer
    • AzCopy
    • Azure Data Factory

    Azure Storage Explorer is a standalone app that lets you easily work with Azure Storage data on Windows, macOS, and Linux. You can use it to connect to your Azure storage accounts and manage them, including uploading files through its graphical interface.

    Use only the latest version of AzCopy (AzCopy v10):

    Option 1: Use Azure AD

    Run the 'azcopy login' command first if you aren't logged in yet. For example, to download a single file using OAuth authentication:

    azcopy cp "https://[account].blob.core.windows.net/[container]/[path/to/blob]" "/path/to/file.txt"

    Upload a single file the same way; once you are logged in, no SAS token is needed:

    azcopy cp "/path/to/file.txt" "https://[account].blob.core.windows.net/[container]/[path/to/blob]"
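    For a whole directory, the same logged-in session can drive a recursive upload. A minimal sketch, where "myaccount", "myfs", and the paths are all placeholder names, not real resources:

```shell
# Sketch only: "myaccount" and "myfs" are placeholders for your
# storage account and ADLS Gen2 file system.
ACCOUNT="myaccount"
FILESYSTEM="myfs"
SRC="/data/to/upload"

# ADLS Gen2's native endpoint is dfs.core.windows.net; AzCopy v10 can
# target it directly once "azcopy login" has succeeded.
DEST="https://${ACCOUNT}.dfs.core.windows.net/${FILESYSTEM}/incoming"

# echo makes this a dry run; remove it to perform the actual upload.
echo azcopy cp "$SRC" "$DEST" --recursive
```

    The --recursive flag walks the whole source tree, so the directory layout is preserved under /incoming in the file system.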

    Option 2: Use a SAS token

    You can append a SAS token to each source or destination URL that you use in your AzCopy commands.

    This example command recursively copies data from a local directory to a blob container. A fictitious SAS token is appended to the end of the container URL.

    azcopy cp "C:\local\path" "https://account.blob.core.windows.net/mycontainer1/?sv=2018-03-28&ss=bjqt&srt=sco&sp=rwddgcup&se=2019-05-01T05:01:17Z&st=2019-04-30T21:01:17Z&spr=https&sig=MGCXiyEzbtttkr3ewJIh2AR8KrghSy1DGM9ovN734bQF4%3D" --recursive=true
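    Quoting gets easier if you keep the token in a shell variable first. A sketch with a fictitious, truncated token (the account and container names are illustrative):

```shell
# Fictitious SAS token for illustration only; generate a real one in the
# Azure portal or with your account key.
SAS='sv=2018-03-28&ss=bjqt&srt=sco&sp=rwddgcup&sig=REDACTED'
SRC="C:\\local\\path"

# The token travels as the query string of the destination URL.
DEST="https://account.blob.core.windows.net/mycontainer1/?${SAS}"

# echo makes this a dry run; remove it to perform the actual copy.
echo azcopy cp "$SRC" "$DEST" --recursive=true
```

    Quoting the whole URL matters either way: the & characters in the token would otherwise be interpreted by the shell.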

    Azure Data Factory: you can create a copy activity with File System as the source and Azure Data Lake Storage Gen2 as the sink.

    For more details, refer to “Load data into ADLS Gen2 with ADF”.

    Hope this helps.

    Tuesday, August 20, 2019 5:11 AM