Web API 2 REST API: uploading a stream to blob storage without a temp file?

  • Question

  • Is it possible to use the request content stream of a multipart message as the source for the UploadFromStreamAsync method on blob storage?

    I'm trying to do something like this:

    await blockBlob.UploadFromStreamAsync(await requestMessage.Content.ReadAsStreamAsync());

    It works for smaller files (~212 MB), but when I run it against a larger file (~2.5 GB) my service returns an HTTP 502 Bad Gateway. On the client I'm sending from, I'm also setting:

    _client.DefaultRequestHeaders.TransferEncodingChunked = true;
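For context, a server-side sketch of the approach being described, using the classic WindowsAzure.Storage SDK (the controller name, container name, and connection-string placeholder are illustrative, not from the original post). Note that the stream is awaited rather than blocked on with `.Result`:

```csharp
using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Illustrative Web API controller that streams the request body
// straight into a block blob without writing a temp file.
public class UploadController : ApiController
{
    public async Task<HttpResponseMessage> Post(string blobName)
    {
        var account = CloudStorageAccount.Parse("<connection string>");
        CloudBlobContainer container = account.CreateCloudBlobClient()
                                              .GetContainerReference("uploads");
        CloudBlockBlob blockBlob = container.GetBlockBlobReference(blobName);

        // Await the content stream instead of calling .Result,
        // then hand it directly to the blob client.
        using (var stream = await Request.Content.ReadAsStreamAsync())
        {
            await blockBlob.UploadFromStreamAsync(stream);
        }

        return Request.CreateResponse(HttpStatusCode.Created);
    }
}
```

Be aware that Web API buffers request bodies by default, so truly streaming large uploads typically also requires a custom `IHostBufferPolicySelector` that disables input buffering for this route.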

    Wednesday, March 4, 2015 1:23 AM

Answers

  • So you do not want the client to upload to Azure Storage directly? One possible problem with that approach is scalability. Will your service be able to handle all the network traffic if there are multiple uploads happening?

    Instead, would it be possible to expose an API to the client that they call to notify your service after they upload the blob directly to the Blob service?

    Wednesday, March 11, 2015 6:09 PM

All replies

  • Hi,

    From the error message alone it is hard to determine the root cause of this issue. I suggest using Fiddler to trace the request and inspect the error details. Generally, this method can be used to upload larger files. You could also refer to Kevin's blog about asynchronous parallel block blob uploads:

    http://blogs.msdn.com/b/kwill/archive/2013/03/06/asynchronous-parallel-block-blob-transfers-with-progress-change-notification-2-0.aspx

    Hope this helps.

    Regards,

    Will



    Wednesday, March 4, 2015 9:53 AM
  • Just to clarify the question; you have an ASP.NET service and you want users to upload files to this service, which will stream that content to Azure Storage as a block blob. Please correct me if I am wrong.

    If my understanding is correct, the recommended best practice for this scenario is to use Shared Access Signatures (SAS) to let the user upload directly to Azure Storage. In this solution, your service only handles generating the SAS tokens and sending them to the client.

    For more information on using SAS, please refer to Shared Access Signatures, Part 1: Understanding the SAS Model
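The SAS approach described above can be sketched as follows with the classic WindowsAzure.Storage SDK (the container name and the 15-minute token lifetime are assumptions for illustration):

```csharp
using System;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

// Illustrative helper: the service generates a short-lived, write-only
// SAS URI and returns it to the client, which then uploads directly
// to Azure Storage.
static string GetUploadUri(string connectionString, string blobName)
{
    CloudStorageAccount account = CloudStorageAccount.Parse(connectionString);
    CloudBlobContainer container = account.CreateCloudBlobClient()
                                          .GetContainerReference("uploads");
    CloudBlockBlob blob = container.GetBlockBlobReference(blobName);

    var policy = new SharedAccessBlobPolicy
    {
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(15)
    };

    // The client appends the token to the blob URI and PUTs the file there.
    return blob.Uri + blob.GetSharedAccessSignature(policy);
}
```

This keeps the large-file traffic off the service entirely; the service only signs tokens.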

    Wednesday, March 4, 2015 7:13 PM
  • Yes, this is what we want to achieve: an API exposed by our REST service that will upload files to specific containers in blob storage. Additionally, we want to write custom metadata properties on the blobs and kick off certain workflows once the blobs have been uploaded (the plan is to post messages to a Service Bus queue as the triggering mechanism).
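For the custom-metadata part of this requirement, the blob client supports attaching metadata after the blob is written; a minimal sketch (the key names here are illustrative, not from the original post):

```csharp
using System.Threading.Tasks;
using Microsoft.WindowsAzure.Storage.Blob;

// Sketch: attach custom metadata to a blob that has already been uploaded.
static async Task TagBlobAsync(CloudBlockBlob blob)
{
    blob.Metadata["uploadedBy"] = "serviceA";   // illustrative key/value
    blob.Metadata["workflow"] = "ingest";       // illustrative key/value
    await blob.SetMetadataAsync();
}
```

Note that this works whether the client uploads through the service or directly via a SAS URI, since the service can tag the blob once it is notified that the upload completed.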
    Wednesday, March 4, 2015 11:52 PM