Error while loading huge files to Data Lake

    Question

  • I have a lot of data in zip files. Each file is around 20 GB.

    These files are located on-premises, and I have a gateway set up that can access this location.

    My intent is to do a simple copy of these zip files to Azure Data Lake Store using Data Factory.

    The pipeline runs for a while until it fails with the error below. I am not sure what this error means. Can anyone help?

    Copy activity encountered a user error at Sink side: GatewayNodeName=S7-NCHIRALA,ErrorCode=UserErrorAdlsFileWriteFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=Writing to 'AzureDataLakeStore' failed. 'One or more errors occurred.',Source=Microsoft.DataTransfer.ClientLibrary,''Type=System.Net.WebException,Message=The remote server returned an error: (400) Bad Request.,Source=System,'.

    Tuesday, May 16, 2017 7:47 PM

All replies

  • If you have specified the target ADLS file name as '*.zip', you might get this error, because '*' is not a supported character for a final file name in ADLS.

    Can you please try specifying a different name? A rough sketch of what the sink dataset could look like is below.
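
    For illustration only, here is a minimal sketch of an ADF (v1) Azure Data Lake Store output dataset with an explicit, literal file name instead of '*.zip'. The dataset name, linked service name, folder path, and file name are all hypothetical placeholders for whatever you use in your pipeline:

        {
          "name": "OutputAdlsDataset",
          "properties": {
            "type": "AzureDataLakeStore",
            "linkedServiceName": "AdlsLinkedService",
            "typeProperties": {
              "folderPath": "raw/zips",
              "fileName": "archive-20170516.zip"
            },
            "availability": {
              "frequency": "Day",
              "interval": 1
            }
          }
        }

    Depending on the copyBehavior set on the copy activity's sink, you may also be able to omit "fileName" entirely so the source file names are carried over, rather than hard-coding a single name.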

    Thanks.

    Wednesday, May 24, 2017 11:53 AM