HttpClient and IInputStream (partial request)

    Question

  • Hi,

    I'm trying to implement an HTTP upload method that encrypts the data before uploading it. I can make it work if I encrypt the whole file first and then upload it, but I'd like to perform the encryption in parallel while the data is being sent.

    I need HttpClient to request the HTTP request body in chunks, so I can encrypt the second chunk while the first one is being sent.

    Well, I tried using HttpStreamContent as the request content and implemented a custom IInputStream:

        public class CryptoStream : IInputStream
        {
    
            public IAsyncOperationWithProgress<IBuffer, uint> ReadAsync(IBuffer buffer, uint count, InputStreamOptions options)
            {
                throw new NotImplementedException();
            }
    
            public void Dispose()
            {
                throw new NotImplementedException();
            }
        }
    

    The problem is that ReadAsync is called only once, with options == InputStreamOptions.ReadAhead, and I need it to be called multiple times so I can return the encrypted chunks one by one. Is there any property on HttpClient or HttpStreamContent to indicate that the request content should be read in chunks?
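
    To make the goal concrete, here is a rough sketch of the kind of stream I'm after (untested; encryptChunk is just a placeholder for the real encryption, and this assumes an encryption mode where the ciphertext chunk has the same length as the plaintext chunk):

        using System;
        using System.Runtime.InteropServices.WindowsRuntime;
        using Windows.Foundation;
        using Windows.Storage.Streams;
    
        public class CryptoStream : IInputStream
        {
            private readonly IInputStream source;                // plaintext source
            private readonly Func<byte[], byte[]> encryptChunk;  // placeholder for the real encryption
    
            public CryptoStream(IInputStream source, Func<byte[], byte[]> encryptChunk)
            {
                this.source = source;
                this.encryptChunk = encryptChunk;
            }
    
            public IAsyncOperationWithProgress<IBuffer, uint> ReadAsync(
                IBuffer buffer, uint count, InputStreamOptions options)
            {
                return AsyncInfo.Run<IBuffer, uint>(async (cancellationToken, progress) =>
                {
                    // Each call reads one plaintext chunk and returns it encrypted.
                    IBuffer plain = await source.ReadAsync(buffer, count, options).AsTask(cancellationToken);
                    if (plain.Length == 0)
                    {
                        return plain; // empty buffer: end of stream
                    }
    
                    byte[] cipher = encryptChunk(plain.ToArray());
                    return cipher.AsBuffer();
                });
            }
    
            public void Dispose()
            {
                source.Dispose();
            }
        }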

    Thanks


    Alvaro Rivoir

    Monday, December 08, 2014 12:42 AM

Answers

  • ReadAsync() with InputStreamOptions.ReadAhead is called multiple times on the stream you pass to HttpClient.

    Each time ReadAsync() is called, you only need to provide the bytes requested; generally that is 65536 bytes (64 KB).

    Look at this example:

    HttpClient httpClient = new HttpClient();
    var response = await httpClient.PostAsync(
        new Uri("http://localhost"),
        new HttpStreamContent(new SlowStream()));

    Where SlowStream has a fake 1-second delay and is implemented like this:

    public class SlowStream : IInputStream
    {
        private uint iterations = 0;

        public IAsyncOperationWithProgress<IBuffer, uint> ReadAsync(
            IBuffer buffer,
            uint count,
            InputStreamOptions options)
        {
            return AsyncInfo.Run<IBuffer, uint>(async (cancellationToken, progress) =>
            {
                // Introduce an artificial delay.
                await Task.Delay(1000);

                if (iterations++ > 5)
                {
                    // Return a partially filled buffer to signal the stream is over.
                    buffer.Length = 1;
                    return buffer;
                }

                byte[] array = new byte[buffer.Capacity];
                for (int i = 0; i < array.Length; i++)
                {
                    array[i] = 120;
                }

                return array.AsBuffer();
            });
        }

        public void Dispose()
        {
        }
    }

    Once you return the first 65536 bytes, they are sent immediately to the server.

    Notice that this stream only implements IInputStream, so the Content-Length is unknown at the beginning of the request and the request is sent with Transfer-Encoding: chunked.

    If you have problems with chunked requests, make your stream implement the IRandomAccessStream interface too.
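
    For reference, a minimal sketch of such a wrapper could look like the following (untested; SeekableUploadStream, inner, and knownLength are only illustrative names, the write-side members are left unsupported, and a real implementation should also advance Position inside ReadAsync). The point is that a known Size lets HttpClient compute the content length and send a Content-Length header instead of a chunked request:

    using System;
    using Windows.Foundation;
    using Windows.Storage.Streams;

    public class SeekableUploadStream : IRandomAccessStream
    {
        private readonly IInputStream inner;   // the stream that actually produces the bytes
        private readonly ulong knownLength;    // total body length, computed up front
        private ulong position;

        public SeekableUploadStream(IInputStream inner, ulong knownLength)
        {
            this.inner = inner;
            this.knownLength = knownLength;
        }

        // A known Size allows a Content-Length header instead of Transfer-Encoding: chunked.
        public ulong Size { get { return knownLength; } set { throw new NotSupportedException(); } }

        public bool CanRead { get { return true; } }
        public bool CanWrite { get { return false; } }
        public ulong Position { get { return position; } }

        public IAsyncOperationWithProgress<IBuffer, uint> ReadAsync(
            IBuffer buffer, uint count, InputStreamOptions options)
        {
            // Delegate to the wrapped stream; a full implementation would also
            // advance 'position' by the number of bytes actually returned.
            return inner.ReadAsync(buffer, count, options);
        }

        public void Seek(ulong newPosition)
        {
            // HttpClient may seek back to the start on a retry or redirect.
            position = newPosition;
        }

        public IInputStream GetInputStreamAt(ulong pos) { throw new NotSupportedException(); }
        public IOutputStream GetOutputStreamAt(ulong pos) { throw new NotSupportedException(); }
        public IRandomAccessStream CloneStream() { throw new NotSupportedException(); }

        public IAsyncOperationWithProgress<uint, uint> WriteAsync(IBuffer buffer) { throw new NotSupportedException(); }
        public IAsyncOperation<bool> FlushAsync() { throw new NotSupportedException(); }

        public void Dispose() { inner.Dispose(); }
    }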

    • Marked as answer by Alvaro Rivoir Saturday, December 13, 2014 5:00 PM
    Wednesday, December 10, 2014 6:37 AM

All replies

  • Hi Alvaro,

    Er, I'm not sure we can do this; the only options of InputStreamOptions are ReadAhead and Partial. Have you tried Partial instead?

    If you load the full stream at the beginning and split it into chunks before encrypting them, would that approach be difficult to implement?

    As for whether there is any property in HttpClient or HttpStreamContent to indicate that the request content should be read in chunks: I did not find any property that can do this.

    --James



    Monday, December 08, 2014 8:30 AM
    Moderator
  • Hi James,

    Thanks for your reply.

    I can't control the InputStreamOptions; it is controlled by HttpClient. That's why I asked about a property to change this behavior.

    Loading the full stream wouldn't help. In fact, I'm already encrypting the full stream before sending any bytes, because right now that is the only alternative, but then the total time is the time to encrypt plus the time to upload. Since the encryption is CPU-heavy and the POST uses the network connection, the two could run in parallel: if I could encrypt one chunk while the previous chunk is being sent, I could reduce the total time drastically.
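
    For example, something along these lines is what I have in mind: keep one encrypted chunk in flight, so the next chunk is encrypted on the thread pool while HttpClient is pushing the previous one onto the network (rough, untested sketch; readPlainChunkAsync and encryptChunk are placeholders):

        using System;
        using System.Runtime.InteropServices.WindowsRuntime;
        using System.Threading.Tasks;
        using Windows.Foundation;
        using Windows.Storage.Streams;
    
        public class PipelinedCryptoStream : IInputStream
        {
            private readonly Func<uint, Task<byte[]>> readPlainChunkAsync; // placeholder: reads the next plaintext chunk
            private readonly Func<byte[], byte[]> encryptChunk;            // placeholder: CPU-bound encryption
            private Task<byte[]> nextEncrypted;                            // encryption of the upcoming chunk
    
            public PipelinedCryptoStream(
                Func<uint, Task<byte[]>> readPlainChunkAsync,
                Func<byte[], byte[]> encryptChunk)
            {
                this.readPlainChunkAsync = readPlainChunkAsync;
                this.encryptChunk = encryptChunk;
            }
    
            public IAsyncOperationWithProgress<IBuffer, uint> ReadAsync(
                IBuffer buffer, uint count, InputStreamOptions options)
            {
                return AsyncInfo.Run<IBuffer, uint>(async (cancellationToken, progress) =>
                {
                    // Start encrypting the first chunk on the first call.
                    if (nextEncrypted == null)
                    {
                        nextEncrypted = EncryptNextAsync(count);
                    }
    
                    // Take the chunk that was prepared while the previous one was being sent...
                    byte[] current = await nextEncrypted;
    
                    // ...and immediately start encrypting the next one, so it runs
                    // on the thread pool while HttpClient sends 'current'.
                    if (current.Length > 0)
                    {
                        nextEncrypted = EncryptNextAsync(count);
                    }
    
                    return current.AsBuffer(); // an empty buffer ends the stream
                });
            }
    
            private async Task<byte[]> EncryptNextAsync(uint count)
            {
                byte[] plain = await readPlainChunkAsync(count);
                if (plain.Length == 0)
                {
                    return plain; // end of the source
                }
                return await Task.Run(() => encryptChunk(plain));
            }
    
            public void Dispose()
            {
            }
        }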

    Regards


    Alvaro Rivoir

    Monday, December 08, 2014 7:56 PM