Logic Apps Azure Data Lake Upload File action: large files fail to upload with 413 status code

  • Question

  • I am trying to upload a file to Azure Data Lake using the Azure Data Lake Upload File action in Logic Apps. It works fine for small files of about 20 MB, but files of 28 MB or larger fail with status code 413 (Request Entity Too Large). Chunking is also enabled on the Upload File action. Is there a solution for this?

    



    • Edited by PK254126 Tuesday, July 16, 2019 6:51 AM
    Tuesday, July 16, 2019 6:45 AM

Answers

  • Hi,

    Thanks for the response. 

    I have found a workaround. My scenario involves getting a file from SharePoint Online and uploading it to Azure Data Lake. In the earlier setup, which had the issue above, I was using the SharePoint trigger "When a file is created or modified in a folder" (which returns the file content) to get the file from SharePoint, and the Data Lake Upload File action to upload it to Azure Data Lake. This setup failed for files larger than 27 MB with 413 (Request Entity Too Large) in the Upload File action, even with chunking enabled on that action. After some troubleshooting, I found a workaround: use the other SharePoint trigger, "When a file is created or modified in a folder (properties only)", which returns the file's metadata instead of its content. After getting the metadata, I used the SharePoint "Get file content" action to retrieve the file content and uploaded that to Azure Data Lake, which worked fine.
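
    A rough sketch of this workaround's shape in workflow-definition JSON (a sketch only: the trigger and action names, site URL, library, and property names below are illustrative placeholders, not taken from this thread; the point is the properties-only trigger followed by "Get file content" feeding the upload):

```json
{
  "triggers": {
    "When_a_file_is_created_or_modified_(properties_only)": {
      "type": "ApiConnection",
      "recurrence": { "frequency": "Minute", "interval": 3 },
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['sharepointonline']['connectionId']"
          }
        },
        "method": "get",
        "path": "/datasets/@{encodeURIComponent('https://contoso.sharepoint.com/sites/MySite')}/tables/@{encodeURIComponent('Documents')}/onupdatedfileitems"
      }
    }
  },
  "actions": {
    "Get_file_content": {
      "type": "ApiConnection",
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['sharepointonline']['connectionId']"
          }
        },
        "method": "get",
        "path": "/datasets/@{encodeURIComponent('https://contoso.sharepoint.com/sites/MySite')}/files/@{encodeURIComponent(triggerBody()?['{Identifier}'])}/content"
      }
    },
    "Upload_file": {
      "runAfter": { "Get_file_content": [ "Succeeded" ] },
      "type": "ApiConnection",
      "inputs": {
        "host": {
          "connection": {
            "name": "@parameters('$connections')['azuredatalake']['connectionId']"
          }
        },
        "method": "post",
        "path": "/webhdfs/v1/@{encodeURIComponent(triggerBody()?['{FilenameWithExtension}'])}",
        "body": "@body('Get_file_content')"
      },
      "runtimeConfiguration": {
        "contentTransfer": { "transferMode": "Chunked" }
      }
    }
  }
}
```

    The key difference from the failing setup is that the trigger output is small metadata rather than the file body; the large content only enters the run through the "Get file content" action, whose output the chunked upload can handle.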

    Wednesday, July 17, 2019 6:26 AM

All replies

  • Hello PK254126,

    I just tried with a 33 MB file on my side and it worked fine when Allow chunking is enabled for the Upload file action.

    You can upload large files using the Data Lake connector as mentioned here.

    It fails with a 413 error when Allow chunking is disabled.
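
    For reference, the Allow chunking toggle in the designer corresponds to the contentTransfer setting under the action's runtimeConfiguration in the workflow definition. A minimal sketch (the connection name, path, and body below are illustrative placeholders; only the runtimeConfiguration block is the point):

```json
{
  "Upload_file": {
    "type": "ApiConnection",
    "inputs": {
      "host": {
        "connection": {
          "name": "@parameters('$connections')['azuredatalake']['connectionId']"
        }
      },
      "method": "post",
      "path": "/webhdfs/v1/@{encodeURIComponent('output/myfile.bin')}",
      "body": "@triggerBody()"
    },
    "runtimeConfiguration": {
      "contentTransfer": { "transferMode": "Chunked" }
    }
  }
}
```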


    Tuesday, July 16, 2019 9:59 AM