getting events while an upload to storage is happening

    Question

  • I've got code like the following and want to know if there is a way I can get events (feedback) as a large blob is pushed.  That is, if the blob takes a long time to go up, I want to have notifications as it's going up.

    var sasCreds = new StorageCredentialsSharedAccessSignature(_sasContainerString);

    CloudBlob blob = new CloudBlobClient(_storageUrl, sasCreds)
        .GetBlobReference(string.Format("{0}/{1}", _targetContainer, fileData.Name));

    try
    {
        blob.FetchAttributes();
    }
    catch (StorageClientException e)
    {
        // Someone may have deleted the blob in the meantime
        if (e.ErrorCode == StorageErrorCode.BlobNotFound)
        {
            throw new ApplicationException("Concurrency Violation", e);
        }
        throw;
    }

    BlobProperties blobProperties = blob.Properties;

    //
    // For directories create an empty data stream
    //
    if (dataStream == null)
    {
        dataStream = new MemoryStream();
    }

    SetupMetadata(blob.Metadata, fileData, relativePath);
    blobProperties.ContentType = LookupMimeType(Path.GetExtension(fileData.Name));

    // Specify an optimistic concurrency check to prevent races with other endpoints syncing at the same time.
    var opts = new BlobRequestOptions
    {
        AccessCondition = AccessCondition.IfNotModifiedSince(expectedLastModified)
    };

    try
    {
        blob.UploadFromStream(dataStream, opts);
    }
    
    

    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    • Edited by Peter Kellner MVP Sunday, October 17, 2010 8:48 PM I can't spell for sh..
    Friday, October 15, 2010 9:24 PM

Answers

  • Hi Peter,

    Sorry for the delay, but has this issue been resolved? We are looking at improving the client library to provide more information. Meanwhile, let us know whether the above issue has been resolved; if not, we can work with you on this.

    I would expect any blob > 32 MB to be uploaded using block upload. Later versions of the storage client library allow you to set the threshold above which blocks are used rather than a single upload (see here). One thing to watch out for is the timeout: the client library uses a 90-second timeout by default, which may not be sufficient for larger blobs. You would have to adjust the timeout based on the blob size.

    Thanks,

    Jai

    Sunday, March 27, 2011 2:29 AM
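    A minimal sketch of the tuning Jai describes, assuming the v1.x Microsoft.WindowsAzure.StorageClient library; the property names below come from that version's `CloudBlobClient` and are worth verifying against the SDK release you actually use:

    ```csharp
    // Tune the client so large blobs go up as individual blocks, with a
    // longer per-request timeout. The timeout applies to each HTTP request,
    // so once uploads are block-based it only has to cover one block.
    CloudBlobClient client = new CloudBlobClient(_storageUrl, sasCreds);

    // Blobs larger than this threshold are uploaded block-by-block.
    client.SingleBlobUploadThresholdInBytes = 4 * 1024 * 1024;

    // Size of each uploaded block.
    client.WriteBlockSizeInBytes = 4 * 1024 * 1024;

    // Upload several blocks in parallel.
    client.ParallelOperationThreadCount = 4;

    // Raise the default 90-second per-request timeout.
    client.Timeout = TimeSpan.FromMinutes(10);
    ```
    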

All replies

  • With a large blob it might be worth handling the upload of the individual blocks of the blob yourself, using the methods provided in CloudBlockBlob. Block-level operations are supported by shared access signatures. Doing this would give you much better visibility into the upload status of the blob and allow you to do parallel block uploads.

    Friday, October 15, 2010 10:07 PM
    Answerer
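    A sketch of what that manual block upload might look like, assuming the v1.x Microsoft.WindowsAzure.StorageClient library; `UploadInBlocks` is a hypothetical helper, not an SDK member, and the progress reporting is just a console write for illustration:

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Text;
    using Microsoft.WindowsAzure.StorageClient;

    static class BlockUploader
    {
        // Uploads dataStream as 4 MB blocks, reporting progress after each block.
        public static void UploadInBlocks(CloudBlockBlob blob, Stream dataStream)
        {
            const int blockSize = 4 * 1024 * 1024;
            var blockIds = new List<string>();
            var buffer = new byte[blockSize];
            long totalBytes = dataStream.Length;
            long bytesSent = 0;
            int blockNumber = 0;
            int bytesRead;

            while ((bytesRead = dataStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                // Block IDs must be Base64-encoded and the same length for every block.
                string blockId = Convert.ToBase64String(
                    Encoding.UTF8.GetBytes(blockNumber.ToString("d6")));

                blob.PutBlock(blockId, new MemoryStream(buffer, 0, bytesRead), null);
                blockIds.Add(blockId);

                bytesSent += bytesRead;
                blockNumber++;
                Console.WriteLine("Uploaded {0:N0} of {1:N0} bytes ({2:P0})",
                    bytesSent, totalBytes, (double)bytesSent / totalBytes);
            }

            // Commit the block list; the blob is not readable until this succeeds.
            blob.PutBlockList(blockIds);
        }
    }
    ```

    Blocks could also be pushed in parallel from multiple threads, since each PutBlock is an independent request; only the final PutBlockList has to wait for all of them.
    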
  • Long term I may do that, but short term I'm wondering if there are any events I can subscribe to that would let me see the data.
    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Sunday, October 17, 2010 8:03 PM
  • You might try looking at the CloudBlobClient.ResponseReceived event. I don't actually know if it does what you want but it is documented as: Occurs when a response is received from the server. If you are lucky the event could be raised each time a block is uploaded (transparently) as part of your blob upload.
    Sunday, October 17, 2010 8:23 PM
    Answerer
  • Thanks Neil,

    I can't tell you exactly why I'm using

    CloudBlob blob = new CloudBlobClient(_storageUrl, sasCreds)

    instead of CloudBlobClient, but I do know that if I change to CloudBlobClient (instead of CloudBlob), I lose my properties reference and the UploadFromStream method.

    ResponseReceived is only on CloudBlobClient.   :(


    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Sunday, October 17, 2010 8:50 PM
  • Peter -

    You can continue using CloudBlob precisely as you are doing but handle the event on the CloudBlobClient used by the CloudBlob to communicate with Azure Storage. I have never tried using this event but it should only take a few minutes to try it out and see if you get useful information.

    Sunday, October 17, 2010 9:04 PM
    Answerer
  • Neil,

    I think I'm missing something here. My syntax:

    CloudBlob blob = new CloudBlobClient(_storageUrl, sasCreds)
            .GetBlobReference(string.Format("{0}/{1}", _targetContainer, fileData.Name));
    
    Never gives me access to a CloudBlobClient that I can hang a delegate off of.  Can you suggest some syntax ( += ) that gives me a delegate?

     


    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Sunday, October 17, 2010 10:47 PM
  • Peter -

    You are actually newing up a CloudBlobClient on which you invoke GetBlobReference() - but you never keep a local reference to it. You can save the CloudBlobClient object to a variable and then attach a handler to it as follows:

    CloudBlobClient cloudBlobClient = new CloudBlobClient(_storageUrl, sasCreds);
    CloudBlob blob = cloudBlobClient.GetBlobReference(string.Format("{0}/{1}", _targetContainer, fileData.Name));
    
    cloudBlobClient.ResponseReceived += ResponseReceivedHandler;
    
    Note that I have never used this event so it might not give you what you want but it should be easy to try it out.
    Monday, October 18, 2010 2:10 AM
    Answerer
  • ResponseReceived does seem to be doing what I want, but I can't figure out anything in the ResponseReceivedEventArgs that tells me what is going on (how many bytes transferred, how many to go, etc.).

    Any clue how to get the data out of that I need?


    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Tuesday, October 19, 2010 12:33 AM
  • Peter -

    I think you can use the RequestHeaders["Content-Length"] and the StatusCode from the ResponseReceivedEventArgs to get what you want. On creating a new blob, the StatusCode is Created - as it is for a successful upload of each block and for putting the block list.

    Here is some sample code that uploads a simple blob using a shared access signature. I would not be surprised if the code could be improved:

    static public void WriteBlobWithSharedAccessSignature(CloudBlobClient cloudBlobClient, String containerName, String blobName)
    {
    	SharedAccessPolicy sharedAccessPolicy = new SharedAccessPolicy();
    	sharedAccessPolicy.Permissions = SharedAccessPermissions.Write;
    	sharedAccessPolicy.SharedAccessStartTime = DateTime.UtcNow.AddMinutes(-10);
    	sharedAccessPolicy.SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(40);
    
    	CloudBlobContainer cloudBlobContainer = new CloudBlobContainer(containerName, cloudBlobClient);
    	CloudBlockBlob cloudBlockBlob = cloudBlobContainer.GetBlockBlobReference(blobName);
    	String sharedAccessSignature = cloudBlockBlob.GetSharedAccessSignature(sharedAccessPolicy);
    
    	// Only the shared access signature is used after this point.
    	String blobUri = String.Format("{0}{1}/{2}", AzureStorageConstants.BlobEndPoint, containerName, blobName);
    
    	CloudBlobClient sasCloudBlobClient = new CloudBlobClient(AzureStorageConstants.BlobEndPoint, new StorageCredentialsSharedAccessSignature(sharedAccessSignature));
    	sasCloudBlobClient.ResponseReceived += ResponseReceived;
    
    	CloudBlockBlob sasCloudBlockBlob = sasCloudBlobClient.GetBlockBlobReference(blobUri);
    	sasCloudBlockBlob.UploadText("The Dark End of the Street");
    }
    
    static private void ResponseReceived(object sender, ResponseReceivedEventArgs e)
    {
    	String contentLength = e.RequestHeaders["Content-Length"];
    	HttpStatusCode statusCode = e.StatusCode;
    }
    
    
    Tuesday, October 19, 2010 1:51 AM
    Answerer
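    For what it's worth, a hedged sketch of turning that handler into a running byte count, assuming the client is uploading in blocks so each 201 Created response corresponds to one block PUT (the final Put Block List request also returns 201, so its small XML body gets added to the total):

    ```csharp
    using System;
    using System.Net;
    using System.Threading;
    using Microsoft.WindowsAzure.StorageClient;

    static class UploadProgress
    {
        static long _bytesUploaded;

        // Attach with: cloudBlobClient.ResponseReceived += ResponseReceived;
        public static void ResponseReceived(object sender, ResponseReceivedEventArgs e)
        {
            // Each successful block (or whole-blob) PUT comes back as 201 Created.
            if (e.StatusCode == HttpStatusCode.Created)
            {
                long bytes;
                if (long.TryParse(e.RequestHeaders["Content-Length"], out bytes))
                {
                    // The handler may fire on worker threads when blocks upload
                    // in parallel, hence the interlocked add.
                    long total = Interlocked.Add(ref _bytesUploaded, bytes);
                    Console.WriteLine("{0:N0} bytes uploaded so far", total);
                }
            }
        }
    }
    ```
    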
  • I'm seeing behavior that is strange and not what I'm looking for. If the file is under about 100 MB, it seems to send it all at once. Over that, strange things happen.

    For example, if I put up a 500 MB file, it seems to run for a very long time, then gives me a status code of System.Net.HttpStatusCode.PreconditionFailed.

    With a smaller file, it sometimes breaks it up into 4 MB chunks.  If I specify the block size, it first seems to push the full file, gives a System.Net.HttpStatusCode.PreconditionFailed, then starts doing it in chunks.

    I wish there were more documentation. I'm not happy about guessing at behavior in a production product.

     


    Peter Kellner http://peterkellner.net Microsoft MVP • ASPInsider
    Tuesday, October 19, 2010 11:12 PM