Azure Data Lake Store: error copying files to the data lake store


  • I am trying to move our files from HDFS (via the Data Management Gateway in an Azure Virtual Machine) to Azure Data Lake Store. So I created a Data Factory (with linked services, datasets, and a pipeline). The pipeline is created with a Copy activity.

    For one of the linked services we chose "Azure Data Lake Store"; after authorizing it, the final configuration looks like this:

        "name": "AzureDataLakeStoreLinkedService",
        "properties": {
            "hubName": "xxxxxxxx_hub",
            "type": "AzureDataLakeStore",
            "typeProperties": {
                "dataLakeStoreUri": "",
                "authorization": "**********",
                "sessionId": "**********",
                "subscriptionId": "xxxxxxxxxxxxxx",
                "resourceGroupName": "xxxxxxxxxxxx"

    Running the pipeline produces the following error: Copy activity encountered a user error:

    ErrorCode=UserErrorHdfsUnexpectedFileIO,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=The request to 'HDFS' failed due to unexpected error,
    Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=Unable to connect to the remote server,Source=System,
    ''Type=System.Net.Sockets.SocketException,Message=No connection could be made because the target machine actively refused it xxxxxxxxxxxxxxxxxxxxxx,Source=System,'.

    I have set the access groups, but this has not worked.
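
    For comparison, an HDFS linked service in Data Factory (v1) typically looks like the sketch below. This is only an illustrative example, not the asker's actual definition; the host name, user name, and gateway name are placeholders, and the URL points at the WebHDFS endpoint on the NameNode's default HTTP port (50070):

        {
            "name": "HDFSLinkedService",
            "properties": {
                "type": "Hdfs",
                "typeProperties": {
                    "authenticationType": "Anonymous",
                    "userName": "hadoop",
                    "url": "http://<namenode-host>:50070/webhdfs/v1/",
                    "gatewayName": "<your-gateway-name>"
                }
            }
        }

    A "connection actively refused" error usually means the host/port in the `url` property is not reachable from the machine running the gateway.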

    Tuesday, July 5, 2016 7:54 PM

All replies

  • Just to eliminate the simple checks first:

    1) From the Data Management Gateway, are you able to connect to the on-premises instance?

    2) Does the Data Management Gateway show as Registered and Connected, and does it show as Online in the Azure portal under the Data Factory?

    Let us know.


    Sushant Rane

    Tuesday, July 5, 2016 10:48 PM
  • 1. I managed to connect to a cloud database from the Data Management Gateway.
    2. Yes, it is registered and connected.


    Wednesday, July 6, 2016 12:59 PM
  • Hi,

    Could you show us the dataset and linked service definitions for your HDFS data store? Data Factory uses the WebHDFS protocol to access HDFS files, so certain ports need to be open. A quick check you can try: open a browser and go to http://xxxx:50070/webhdfs/v1?op=gethomedirectory to see if it responds. 50070 is the typical port that the WebHDFS protocol uses to access files and directories.

    Furthermore, can you provide us the run ID from the ADF portal so that we can take a further look? Looking forward to your reply.
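
    The browser check above can also be scripted. The sketch below (mine, not from the thread; the host name is a placeholder) builds the WebHDFS `GETHOMEDIRECTORY` URL and reports whether the NameNode answers, so you can run it from the gateway machine itself:

    ```python
    import urllib.request
    import urllib.error

    def webhdfs_home_dir_url(host, port=50070):
        # 50070 is the default NameNode HTTP port; the op name is upper-case
        # in the WebHDFS REST API docs, though servers accept either case.
        return "http://{0}:{1}/webhdfs/v1/?op=GETHOMEDIRECTORY".format(host, port)

    def check_webhdfs(host, port=50070, timeout=5):
        """Return True if the NameNode answers the WebHDFS call, False on connection failure."""
        try:
            with urllib.request.urlopen(webhdfs_home_dir_url(host, port), timeout=timeout) as resp:
                return resp.status == 200
        except (urllib.error.URLError, OSError):
            # "No connection could be made because the target machine actively
            # refused it" (the error in the question) lands here.
            return False

    if __name__ == "__main__":
        # Replace "namenode-host" with your actual NameNode host name.
        print(check_webhdfs("namenode-host"))
    ```

    If this returns False from the gateway machine, the problem is network reachability (firewall, wrong host/port in the linked service) rather than Data Factory itself.
    
    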



    Keep involved!

    Monday, July 18, 2016 5:31 AM