Gen2 REST API - Update limit?

    Question

  • I'm using the Create/Update endpoints to add files to a Data Lake folder. I can upload a 92 MB file, but I get a 413 (Request Entity Too Large) response when I try to upload a 111 MB or 117 MB file.

    Using the Storage Explorer, I can drop those larger files into the folder.

    1. Could you please tell me what the size limit is for the Upload API endpoint (and/or where that is documented)? 

    2. Also, please let me know if you have any recommended workarounds so that I can use the API to upload these files.

    Thank you.
    Monday, April 1, 2019 1:50 AM

All replies

  • Hello,

    I’m working with the product team and will get back to you when I have more information.

    Tuesday, April 2, 2019 10:27 AM
    Moderator
  • Hello,

    ADLS Gen2 has the same limits as Block Blobs. For details regarding block and file sizes, see the table here: https://docs.microsoft.com/en-us/azure/storage/common/storage-scalability-targets#azure-blob-storage-scale-targets.

    Where the table refers to ‘Blocks’, a block is equivalent to a single ‘Append’ API invocation (each Append call creates a new block), so a single invocation is limited to 100 MB. However, the write pattern supports calling Append many times per file (even in parallel), up to a maximum of 50,000 blocks, and then calling ‘Flush’ (the equivalent of PutBlockList). This is how the maximum file size of 4.75 TB is achieved.
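
    For anyone who lands here later, a minimal sketch of that Append/Flush pattern in Python (using the requests library against the Path - Create and Path - Update REST endpoints) may help. The account, filesystem, path, and token values below are placeholders, and authentication and error handling are simplified, so treat this as a sketch rather than a reference implementation:

        import os
        import requests

        ACCOUNT = "myaccount"               # placeholder storage account name
        FILESYSTEM = "myfilesystem"         # placeholder filesystem (container) name
        REMOTE_PATH = "folder/bigfile.bin"  # placeholder destination path
        TOKEN = os.environ["AZURE_TOKEN"]   # assumes a pre-acquired OAuth bearer token

        BASE = f"https://{ACCOUNT}.dfs.core.windows.net/{FILESYSTEM}/{REMOTE_PATH}"
        AUTH = {"Authorization": f"Bearer {TOKEN}"}
        CHUNK = 100 * 1024 * 1024  # stay at or under the 100 MB per-Append block limit

        def upload(local_path: str) -> None:
            # Create (or overwrite) the destination file (Path - Create).
            requests.put(
                BASE,
                params={"resource": "file"},
                headers={**AUTH, "Content-Length": "0"},
            ).raise_for_status()

            position = 0
            with open(local_path, "rb") as f:
                while True:
                    chunk = f.read(CHUNK)
                    if not chunk:
                        break
                    # Each Append writes one uncommitted block at the given byte offset.
                    requests.patch(
                        BASE,
                        params={"action": "append", "position": str(position)},
                        headers=AUTH,
                        data=chunk,
                    ).raise_for_status()
                    position += len(chunk)

            # Flush commits the appended blocks; position must equal the final file length.
            requests.patch(
                BASE,
                params={"action": "flush", "position": str(position)},
                headers={**AUTH, "Content-Length": "0"},
            ).raise_for_status()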

    Hope this helps.

    Wednesday, April 3, 2019 4:42 AM
    Moderator
  • Thank you. Yes, that is what I was looking for.

    It's proving a bit painful to get all the positions and byte counts correct as I read through my stream and write the data out in separate calls (really, I'm not that bad at math :-) ), but I'm getting close.


    Wednesday, April 3, 2019 6:50 PM
  • Hello,

    Glad to know that it helped.

    Thursday, April 4, 2019 5:40 AM
    Moderator