CloudBlockBlob failing to download 5 GB data

  • Question

  • Hi all,

    I have created a web application to download an ISO file stored in Azure Storage. The size of the ISO file is around 5 GB. When I run the application, it downloads only 200 MB of data; the rest is not downloaded.

    How can I fix this issue? Please help me.

    The code I am using is:

     protected void btn_Download_Click(object sender, EventArgs e)
     {
         string downloadfile = ddl_popfile.SelectedItem.ToString();
         AccountFileTransfer = CloudStorageAccount.Parse("DefaultEndpointsProtocol=http;AccountName=" + ACCOUNTNAME + ";AccountKey=" + ACCOUNTKEY);
         if (AccountFileTransfer != null)
         {
             BlobClientFileTransfer = AccountFileTransfer.CreateCloudBlobClient();
             ContainerFileTransfer = BlobClientFileTransfer.GetContainerReference(CONTAINER);
             ContainerFileTransfer.CreateIfNotExist();
         }

         var blob = ContainerFileTransfer.GetBlockBlobReference(downloadfile);
         var sasUrl = blob.Uri.AbsoluteUri;

         CloudBlockBlob blockBlob = new CloudBlockBlob(sasUrl);
         var blobSize = 200 * 1024 * 1024;
         int blockSize = 1024 * 1024;
         Response.Clear();
         Response.ContentType = "APPLICATION/OCTET-STREAM";
         System.String disHeader = "Attachment; Filename=\"" + blockBlob.Name + "\"";
         Response.AppendHeader("Content-Disposition", disHeader);
         for (long offset = 0; offset < blobSize; offset += blockSize)
         {
             using (var blobStream = blockBlob.OpenRead())
             {
                 if ((offset + blockSize) > blobSize)
                     blockSize = (int)(blobSize - offset);
                 byte[] buffer = new byte[blockSize];
                 blobStream.Read(buffer, 0, buffer.Length);
                 Response.BinaryWrite(buffer);
                 Response.Flush();
             }
         }
         Response.End();
     }

    Saturday, January 17, 2015 6:32 AM

All replies

  • Hi,

    I am confused by the code below.

    for (long offset = 0; offset < blobSize; offset += blockSize)
    {
        using (var blobStream = blockBlob.OpenRead())
        {
            if ((offset + blockSize) > blobSize)
                blockSize = (int)(blobSize - offset);
            byte[] buffer = new byte[blockSize];
            blobStream.Read(buffer, 0, buffer.Length);
            Response.BinaryWrite(buffer);
            Response.Flush();
        }
    }

    1) blobStream.Read(buffer, 0, buffer.Length); seems to always read the same first 1 MB (bytes 0 to 1024*1024), because OpenRead() is called inside the loop and re-opens the stream at position 0 on every iteration.

    2) Each iteration reads 1 MB. Once we have read 199 MB, then 199 MB + 1 MB = blobSize; after that iteration, offset increases to 200 MB, and 200 MB = blobSize, so the loop ends. It therefore seems that this code ( if ((offset + blockSize) > blobSize)  blockSize = (int)(blobSize - offset); ) will never execute under any condition.

    I tested this code on my side. I tried to download an 80 MB file, but the downloaded file was 200 MB, so please check your logic.
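    A corrected version of the loop might look like the sketch below (assuming the legacy Microsoft.WindowsAzure.Storage SDK): open the stream once outside the loop, take the real blob length instead of a hard-coded size, and honor the byte count that Read actually returns.

    ```csharp
    // Sketch only -- assumes the legacy Microsoft.WindowsAzure.Storage SDK.
    blockBlob.FetchAttributes();                  // populates Properties.Length
    long blobSize = blockBlob.Properties.Length;  // real size, not hard-coded
    byte[] buffer = new byte[1024 * 1024];        // 1 MB chunks

    using (var blobStream = blockBlob.OpenRead()) // open ONCE, outside the loop
    {
        int bytesRead;
        // Read returns how many bytes were actually read; 0 means end of stream.
        while ((bytesRead = blobStream.Read(buffer, 0, buffer.Length)) > 0)
        {
            Response.OutputStream.Write(buffer, 0, bytesRead);
            Response.Flush();
        }
    }
    ```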

    Best Regards,

    Jambor



    Monday, January 19, 2015 7:08 AM
  • Thanks for pointing me to the code.

    In the above code I hard-coded the blob size to 200 MB; I was just testing downloading a 200 MB file from Azure Storage. I divided this blob into chunks of 1 MB each, so my code is working fine for a 200 MB file.

    If you want to download an 80 MB file, you could use:

    var blobSize = 80 * 1024 * 1024; // 80 MB file
    int blockSize = 1024 * 1024; // 1 MB chunk

    First of all, I tried to test it with a 200 MB file.

    Please guide me if I am doing anything wrong.

    Thanks.

    Monday, January 19, 2015 12:38 PM
  • Hi Vishwajeet,

    Are you still having this problem with downloading large files, or was it resolved? From the code, it's not clear why the download would fail (other than the hard-coding to 200 MB, which would of course restrict the download to 200 MB). Typically you shouldn't have issues downloading large blobs.

    Thanks

    Wednesday, March 4, 2015 12:39 AM
  • The above code works fine with small files, but when I try to download a file several GB in size (in my case 3 GB), I get the error "Connection was closed by remote host". I also checked my internet connection; there were only one or two request timeouts. Every time I try to download a block blob of 3 GB or more, I get the same error.

    Wednesday, March 4, 2015 6:13 AM
  • Thanks for the reply, we'll take a look. Can you post the code you're actually using, since the first code snippet seems to have a bug in the offset handling, as Jambor pointed out?
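    In the meantime, one thing worth trying (a sketch only, assuming the legacy Microsoft.WindowsAzure.Storage SDK) is downloading the blob in fixed ranges with DownloadRangeToStream, so a dropped connection only costs one range rather than the whole 3 GB transfer. The chunk size of 4 MB and the retry count of 3 below are illustrative values, not recommendations from the SDK.

    ```csharp
    // Sketch only -- assumes the legacy Microsoft.WindowsAzure.Storage SDK.
    blockBlob.FetchAttributes();
    long blobSize = blockBlob.Properties.Length;
    const int chunkSize = 4 * 1024 * 1024;   // 4 MB ranges; tune as needed

    Response.BufferOutput = false;           // stream to the client, don't buffer 3 GB
    for (long offset = 0; offset < blobSize; offset += chunkSize)
    {
        long length = Math.Min(chunkSize, blobSize - offset);
        // Buffer each range in memory so a failed attempt can be retried
        // without writing partial data to the response.
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                using (var ms = new MemoryStream())
                {
                    blockBlob.DownloadRangeToStream(ms, offset, length);
                    ms.Position = 0;
                    ms.CopyTo(Response.OutputStream);
                }
                break;
            }
            catch (StorageException)
            {
                if (attempt >= 3) throw;     // give up after a few attempts
            }
        }
        Response.Flush();
    }
    ```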

    Wednesday, March 4, 2015 6:17 AM