Cannot use the AzCopy tool to upload large files to Azure

  • Question

  • Hi All,

    We have purchased the global Azure product. I am from China, and we need to upload our database backup files to Azure; the total size is about 350 GB. We know there is a tool called AzCopy.exe.

    Uploading a 1 KB file with the tool works just fine, but when we test the upload speed with a 200 MB file, the tool always shows 0 KB (no speed) and fails after 15 minutes.

    So we have a problem uploading data from our local environment to Azure, and we do not know why the speed stays at 0 when using the AzCopy tool. Can anybody give us some suggestions? We would prefer to keep using AzCopy.


    Jason Huo

    Wednesday, September 9, 2015 1:56 PM


All replies

  • Hi Jason,

    It looks like you have a time-out issue. This can occur when you are copying large files.

    You can resume the transfer by re-running the same .\azcopy.exe command, or you can use a script like the one below:

    # Assumes $Result already holds the console output of the initial
    # .\AzCopy.exe run, and that the sixth line of that output contains
    # the failed-transfer count after a colon (e.g. "Transfer failed: 3").
    $Journal = "$env:LocalAppData\Microsoft\Azure\AzCopy"
    [int]$Failed = $Result[5].Split(":")[1].Trim()
    $i = 1
    while ($Failed -gt 0) {
        $i++
        $Result = .\AzCopy.exe /Z:$Journal    # resume from the journal file
        $Result
        [int]$Failed = $Result[5].Split(":")[1].Trim()
        $i
    }

    Please refer to the following link for more details about the script: http://social.technet.microsoft.com/wiki/contents/articles/26528.resuming-timed-out-uploadsdownloads-tofrom-azure-blob-storage-with-azcopy.aspx

    Alternatively, you can also use Azure Storage Explorer to upload/download the files.


    Thursday, September 10, 2015 10:27 AM
  • Nagamalar Nagarajan already answered your question; here is just a supplement:

    If the time-out issue keeps occurring on your machine, your network connection to global Azure may simply be poor. Consider checking your proxy settings, creating the storage account in a region close to China such as East Asia, or specifying a smaller /NC value on the AzCopy command line (the default value is the number of CPU cores * 8).
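As a rough sketch of that default (assuming the "CPU cores * 8" rule quoted above; the variable names and the suggested smaller value are illustrative, not part of AzCopy itself):

```python
import os

# Default /NC per the reply above: number of CPU cores * 8.
cores = os.cpu_count() or 1
default_nc = cores * 8

# On a slow or lossy link to a distant region, a much smaller explicit
# value (e.g. /NC:2 or /NC:4) keeps parallel chunks from competing for
# the limited bandwidth and timing out.
suggested_nc = max(2, cores)  # a conservative starting point

print(default_nc, suggested_nc)
```

You would then pass the smaller number explicitly, e.g. `/NC:2`, on the AzCopy command line.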

    For further information about AzCopy, please refer to http://aka.ms/azcopy.

    Thursday, September 10, 2015 2:47 PM
  • I'm not sure whether you still have this issue, but as far as I know, when you copy a single file the largest you can copy is the maximum block blob size of 195 GB. I suggest you export your database dump to multiple files so that no single file is over 195 GB.
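That 195 GB ceiling follows from the block blob limits in force at the time (a sketch; the 50,000-block and 4 MB block-size constants are the then-current service limits, not figures stated in this thread):

```python
import math

# Block blob limits as of 2015: at most 50,000 blocks of 4 MB each.
MAX_BLOCKS = 50_000
BLOCK_SIZE_MB = 4

max_block_blob_gb = MAX_BLOCKS * BLOCK_SIZE_MB / 1024  # 195.3125 GB

# For the 350 GB backup in the question, splitting the dump so each
# file stays under this ceiling needs at least this many files:
files_needed = math.ceil(350 / max_block_blob_gb)

print(max_block_blob_gb, files_needed)  # 195.3125 2
```

So splitting the 350 GB backup into two (or more) export files is enough to stay under the block blob limit.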

    A second option is to create a VHD and mount it as a drive on your DB server, then export your database dump to that disk. Unmount the drive and use AzCopy to copy the VHD file. You can then attach that VHD to the new server as a drive and restore/import your database. A VHD is a PAGE BLOB, and the limit there is 1 TB, so your database will fit comfortably.

    Another option is to use an Azure File share to upload your file; you can then download it to any VM within the same region.

    • Edited by PM76 Friday, December 18, 2015 10:28 AM
    Friday, December 18, 2015 10:21 AM