Huge File upload and download using WCF in Azure

    Question

  • Hi,

    My Azure application has a requirement to upload and download files of up to 1 GB in size. I would like to know how to upload and download such large files (i.e. 1 GB) to and from cloud blob storage using WCF.

    Also, are there any limitations on uploading files to Azure blob storage?

    Thanks in advance,

    Sujith.


    Monday, May 16, 2011 6:05 AM

Answers

  • Hello Sujith,

    Steve Marx wrote a blog post describing how the Put Block and Put Block List operations can be used to upload huge files. If you don't want the client to connect directly to blob storage and would rather upload the data through a WCF service, you can create two service methods that wrap the Put Block and Put Block List operations.

    Here is a code sample, for test purposes, that uses the Storage Client library and a WCF service to upload huge files. You can refer to it and implement your own version.

    First, please create a WCF service (I use an ASP.NET application to host the WCF service) with two methods, PutBlock and PutBlockList:

        public class Upload : IUpload
        {
            private CloudBlockBlob GetBlockBlob(string fileName)
            {
                var storageAccount = CloudStorageAccount.Parse("UseDevelopmentStorage=true");
                CloudBlobClient blobStorage = storageAccount.CreateCloudBlobClient();

                CloudBlobContainer container = blobStorage.GetContainerReference("files");
                container.CreateIfNotExist();

                return container.GetBlockBlobReference(fileName);
            }

            // Uploads one block of the file. The content is passed as a base64
            // string because it travels inside the WCF message body.
            public bool PutBlock(string fileName, string blockId, string base64Content)
            {
                byte[] bytes = Convert.FromBase64String(base64Content);

                using (MemoryStream memoryStream = new MemoryStream(bytes))
                {
                    try
                    {
                        CloudBlockBlob blob = GetBlockBlob(fileName);
                        blob.PutBlock(blockId, memoryStream, null);
                    }
                    catch (Exception)
                    {
                        return false;
                    }
                }
                return true;
            }

            // Commits the uploaded blocks, in order, to form the final blob.
            public bool PutBlockList(string fileName, string[] blockIds)
            {
                try
                {
                    CloudBlockBlob blob = GetBlockBlob(fileName);
                    blob.PutBlockList(blockIds);
                }
                catch (Exception)
                {
                    return false;
                }
                return true;
            }
        }

    To allow large request content, please add the maxStringContentLength attribute in the config file:

        <system.serviceModel>
            <behaviors>
                <serviceBehaviors>
                    <behavior name="">
                        <serviceMetadata httpGetEnabled="true" />
                        <serviceDebug includeExceptionDetailInFaults="false" />
                    </behavior>
                </serviceBehaviors>
            </behaviors>
            <bindings>
                <basicHttpBinding>
                    <binding>
                        <readerQuotas maxStringContentLength="2147483647" />
                    </binding>
                </basicHttpBinding>
            </bindings>
            <serviceHostingEnvironment multipleSiteBindingsEnabled="true" />
        </system.serviceModel>

    And here is the client code that calls the WCF service:

            static void Main(string[] args)
            {
                int bufferSize = 40 * 1024; // 40 KB per block.
                byte[] bufferBytes = new byte[bufferSize];

                List<string> blockIds = new List<string>();

                string filePath = "D:\\gpcoverlay-2459.zip"; // Please use your own file path.
                string fileName = Path.GetFileName(filePath);

                using (FileStream fileStream = File.OpenRead(filePath))
                {
                    // Round up so a final partial block is counted.
                    int blockCount = (int)Math.Ceiling((double)fileStream.Length / bufferSize);
                    Console.WriteLine("Block count: " + blockCount);

                    for (int i = 0; i < blockCount; i++)
                    {
                        // Read may return fewer bytes for the last block, so only
                        // the bytes actually read are sent.
                        int bytesRead = fileStream.Read(bufferBytes, 0, bufferSize);

                        // Block IDs must all have the same length; a GUID string
                        // encoded as base64 satisfies that requirement.
                        string currentBlockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(Guid.NewGuid().ToString()));
                        blockIds.Add(currentBlockId);

                        using (ServiceReference1.UploadClient client = new ServiceReference1.UploadClient())
                        {
                            if (client.PutBlock(fileName, currentBlockId, Convert.ToBase64String(bufferBytes, 0, bytesRead)))
                            {
                                Console.WriteLine("Block {0} uploaded. {1}%", currentBlockId, ((i + 1) * 100 / blockCount));
                            }
                        }
                    }
                }

                using (ServiceReference1.UploadClient client = new ServiceReference1.UploadClient())
                {
                    if (client.PutBlockList(fileName, blockIds.ToArray()))
                    {
                        Console.WriteLine("File {0} uploaded.", fileName);
                    }
                }

                Console.ReadLine();
            }

    After the upload completes, you can check the uploaded file in the "files" blob container using a storage-explorer-style tool. For downloading files, you can take a similar approach, but use the CloudBlob.OpenRead, Stream.Seek, and Stream.Read methods to download the binary data in chunks and combine them back into the huge file.
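    As a rough sketch of that download direction (the method name GetChunk and its parameters are my own, hypothetical names; it assumes the same GetBlockBlob helper and "files" container as the upload sample above), a service method could return one chunk of the blob at a time:

    ```csharp
    // Hypothetical WCF operation: returns one chunk of the blob, starting at
    // 'offset' and at most 'count' bytes long.
    public byte[] GetChunk(string fileName, long offset, int count)
    {
        CloudBlockBlob blob = GetBlockBlob(fileName);

        using (Stream blobStream = blob.OpenRead())
        {
            // Position the stream at the requested offset, then read up to
            // 'count' bytes from the blob.
            blobStream.Seek(offset, SeekOrigin.Begin);

            byte[] buffer = new byte[count];
            int bytesRead = blobStream.Read(buffer, 0, count);

            // The final chunk of the blob may be shorter than 'count'.
            Array.Resize(ref buffer, bytesRead);
            return buffer;
        }
    }
    ```

    On the client side you would call such a method in a loop, advancing the offset by the number of bytes returned and appending each chunk to a local FileStream, until a call returns fewer bytes than requested.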

    If you need further assistance, please let me know.

    Thanks,


    Wengchao Zeng
    Please mark the replies as answers if they help or unmark if not.
    If you have any feedback about my replies, please contact msdnmg@microsoft.com.
    Microsoft One Code Framework
    • Marked as answer by Sujith.S Wednesday, May 18, 2011 12:43 PM
    Wednesday, May 18, 2011 9:05 AM

All replies

  • Hi Sujith,

    Each storage account allows you to store 100 TB of data across all of your blobs, tables, and queues. The maximum size of a block blob is 200 GB.

    Block blobs are composed of blocks, each of which is identified by a block ID. You create or modify a block blob by uploading a set of blocks and committing them by their block IDs. If the file you are uploading or downloading is large (more than 64 MB in size), you must break it into blocks, each of which can be at most 4 MB in size.

    Visit http://msdn.microsoft.com/en-us/library/ee691964.aspx for more information.


    Amit Jain

    http://www.cerebrata.com/

    Monday, May 16, 2011 6:57 AM
  • Hi Amit,

    Thanks for your information about limitations of blob storage.

    I want to know how to upload and download files to/from blob storage using WCF.

    Thanks in advance,

    Sujith


    S.Sujith
    Tuesday, May 17, 2011 4:34 AM
  • Hi Wengchao Zeng,

    Thank you very much. I will try it and let you know if I need further clarification.

    Thanks,

    Sujith


    S.Sujith
    Wednesday, May 18, 2011 12:17 PM
  • Hi Wengchao Zeng,

    First, thank you very much for your guidance on uploading huge files to blob storage. I am new to the Azure platform.

    Now I am facing a problem downloading huge files (images and text files) from blob storage. Please provide some sample code if possible so that I can check it against my needs.

    Thanks in advance,

    Sujith.


    S.Sujith
    Tuesday, May 24, 2011 10:26 AM
  • Hi Sujith,

    Since you have raised a new thread, http://social.msdn.microsoft.com/Forums/en-US/windowsazuredata/thread/cfeb7e4b-9cca-4719-aed1-a92e86088f1f, for this question, I'll answer it in that thread.

    Thanks,


    Wengchao Zeng
    Please mark the replies as answers if they help or unmark if not.
    If you have any feedback about my replies, please contact msdnmg@microsoft.com.
    Microsoft One Code Framework
    Wednesday, May 25, 2011 8:41 AM
  • Hi Wengchao Zeng,

    Thank you very much for your reply.

    I will check with my needs and will reply to you.

    Regards,

    Sujith.S


    S.Sujith
    Friday, May 27, 2011 5:24 AM