Azure Archive Operations and pricing RRS feed

  • Question

  • We are looking to move approximately 1 PB of data to Archive Storage. One of the options we have been considering is GPV2 storage with Standard Performance and Archive access. I have some questions about pricing. I used the pricing calculator at:


    It states that 100,000 write operations cost $1.10. The write operations are PutBlob, PutBlock, PutBlockList, AppendBlock, SnapshotBlob, CopyBlob, and SetBlobTier (when it moves a blob from Hot to Cool, Cool to Archive, or Hot to Archive).
    It states that 100,000 read operations cost $55.00. The read operations are GetBlob and SetBlobTier (when it moves a blob from Archive to Cool, Cool to Hot, or Archive to Hot).

    I do not have Azure experience, and I will have to explain the costs to our client. There are pages describing the above operations, but I have not found worked examples of the operations involved in writing data.

    For example, if I wanted to upload one hundred 100 MB files, what would be the operations involved (PutBlob, PutBlock)? How many operations would be involved, and what would the price be at $1.10 per 100,000 operations?

    Likewise, what operations would be involved in reading one hundred 100 MB files? How many operations would be involved, and what would the price be at $55.00 per 100,000 operations?
    Friday, March 13, 2020 10:56 PM

All replies

  • Please refer to https://social.msdn.microsoft.com/Forums/en-US/b2fea34d-2fa2-404b-9f5b-0eae43c5828e/azure-storage-operations-and-pricing-query?forum=windowsazuredata


    Friday, March 13, 2020 11:05 PM
  • @LorenZ_DA Just checking in to see if the above answer helped. If this answers your query, do click “Mark as Answer” and up-vote it, which might be beneficial to other community members reading this thread. And if you have any further queries, do let us know.
    Monday, March 16, 2020 4:30 AM
  • Hi Marcin and SumathMarigowda,

    The link is very helpful. I need help determining the number of operations it will take, in order to give our client a good idea of the cost of storing and retrieving data from Archive. Forgive me in that I am just learning this as well. In the link, SumathMarigowda wrote:

    • A single GetBlob request to the blob service = 1 transaction
    • PutBlob with 1 request to the blob service = 1 transaction
    • Large blob upload that results in 100 requests via PutBlock, and then 1 PutBlockList for commit = 101 transactions
    • Listing through a lot of blobs using 5 requests total (due to 4 continuation markers) = 5 transactions

    The data that will be archived is large image files as well as contract documents. From my reading, block blobs would probably be optimal for archival. Is this right?

    In a simple scenario, I would like to upload a 100 MB imagery file. I would have to have an empty blob in a container and use SetBlobTier to set it to either Hot or Cool. I would then copy the file and then use SetBlobTier again to change it to Archive. How would Azure handle the copy? How many blocks would the file be broken into? I have read that each PutBlock counts as one transaction and that the final PutBlockList counts as one transaction. How many operations would be required? If the price is $1.10 per 100,000 operations, what kind of cost can I estimate? Also, what would be the cost of changing the tier from Hot or Cool to Archive?
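    To make the transaction counting from the earlier reply concrete, here is a small sketch (not official Azure guidance; the 4 MB block size is an assumption, since clients choose their own block size, and a blob under the single-put size limit can instead be uploaded with one PutBlob call) that estimates the billable write operations for a block-based upload:

```python
import math

def upload_operations(file_mb: float, block_mb: float = 4.0) -> int:
    """Estimate billable write operations to upload one blob in blocks.

    Assumes the client splits the file into fixed-size blocks
    (one PutBlock each) and commits them with a single PutBlockList.
    """
    put_blocks = math.ceil(file_mb / block_mb)
    put_block_list = 1
    return put_blocks + put_block_list

# One 100 MB file with 4 MB blocks: 25 PutBlock + 1 PutBlockList = 26
print(upload_operations(100))  # 26

# Cost of that one upload at $1.10 per 100,000 write operations
print(round(1.10 * upload_operations(100) / 100_000, 6))  # 0.000286
```

    A file small enough for a single PutBlob would instead be 1 operation; the block-based path above is the worst case for this file size.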

    The more expensive task is reading the data. Suppose that after 180 days the client wants to read the data. The blobs within the container would have to be moved from Archive to Hot or Cool, right? It will then take time to rehydrate the data. What will that cost? How will each file be handled when reading the data, and how many GetBlob operations will be required? What other operations will be needed? Looking at the Azure pricing site, it says the cost is $55.00 per 100,000 operations. As reading is supposed to be much more expensive, I suppose a large number of operations will be involved.

    For organizing the data, containers would be necessary. Any help on container creation/deletion costs would also be appreciated.

    • Edited by LorenZ_DA Tuesday, March 17, 2020 3:38 PM
    Tuesday, March 17, 2020 4:16 AM
  • I would recommend contacting the Azure Pricing/Billing team for more specialized assistance with this query. The billing and subscription management team will provide detailed information with an executive summary and supporting details of the analysis. It is free support, and it is the best choice for your scenario.


    If the proposed answer was useful, please remember to "Upvote" and "Mark as Answer" it; this will benefit other community members.

    Wednesday, March 18, 2020 2:18 PM
  • I browsed that page but could not find any free support. The site states that if you do not have a support plan, you should refer to the free resources at the bottom of the page, and none of them addresses my questions. I may be missing something. Would you have any directions for contacting the Azure Pricing/Billing team?
    Wednesday, March 18, 2020 10:47 PM
  • Hello Marcin,

    As I am in Canada, I have been referring to the Azure pricing calculator here:


    This should give adequate information for an estimate for the client. My apologies: I am writing the business case and do not have Azure experience. The client would like to see pricing examples before making a decision.

    I would like to run two scenarios by you, and I would appreciate confirmation if they are right and a correction if I am wrong.

    Writing Data to Archive

    As stated, this will be the cheapest. In this example we have one hundred 100 MB files on a local filesystem that we want copied to Azure Archive. With Storage REST API version 2019-02-02 we can upload them directly to Archive, so no Data Write would be needed. According to the pricing calculator, the cost is $1.10 per 100,000 operations. The one hundred files would be saved as one hundred blobs. Each blob would be broken into 4 MB blocks.

    To calculate the price, we need to find the number of operations. For each blob, the operation count would be 100 MB / 4 MB + 1 = 26 (25 PutBlock calls plus 1 PutBlockList). For 100 of these files, the count would be 2,600. The final price would be $1.10 × 2,600 / 100,000 = $0.0286. This seems very cheap to me. Is this correct?
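    As a sanity check on the arithmetic above, here is a short sketch under the same assumptions (4 MB blocks, one PutBlockList per blob, $1.10 per 100,000 write operations):

```python
FILES = 100
FILE_MB = 100
BLOCK_MB = 4
WRITE_PRICE_PER_100K = 1.10  # USD, from the pricing calculator

ops_per_blob = FILE_MB // BLOCK_MB + 1   # 25 PutBlock + 1 PutBlockList = 26
total_ops = FILES * ops_per_blob         # 2,600 operations
cost = WRITE_PRICE_PER_100K * total_ops / 100_000

print(total_ops, round(cost, 4))  # 2600 0.0286
```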

    Reading Data from Archive

    This is supposed to be more expensive. In this example, more than 180 days have passed and we want to copy the same one hundred 100 MB files from Archive to a local filesystem. I have read that this would involve a combination of data read and data retrieval charges. Data retrieval is needed in order to rehydrate the data; this involves changing the blobs from Archive to Hot storage. The pricing calculator states that data retrieval costs $0.022 per GB. As this is 10 GB, the price would be $0.22. Copying the data to a local filesystem will involve read operations. According to the pricing calculator, read operations are $55.00 per 100,000 operations. The cost would be: $55.00 × 2,600 / 100,000 = $1.43.

    The final cost is $0.22 + $1.43 = $1.65.

    Is this correct? I was expecting higher prices.
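    One caveat worth flagging on the read side: the 2,600 figure assumes the download is chunked into as many requests as the upload. Per the transaction-counting reply earlier in the thread, a single GetBlob request is one transaction, so if each blob is fetched with one request, the read count could be as low as 1 GetBlob plus 1 SetBlobTier (the rehydration call, which the pricing page also bills as a read operation) per blob. A sketch of both bounds, using the calculator prices from this thread (the per-blob request counts are assumptions, not confirmed Azure behavior):

```python
FILES = 100
TOTAL_GB = 10                 # one hundred 100 MB files
RETRIEVAL_PER_GB = 0.022      # USD, data retrieval (rehydration)
READ_PRICE_PER_100K = 55.00   # USD per 100,000 read operations

retrieval = TOTAL_GB * RETRIEVAL_PER_GB   # $0.22 for 10 GB

# Upper bound: downloads chunked like the upload (26 requests per blob)
chunked_ops = FILES * 26                  # 2,600
# Lower bound: 1 GetBlob + 1 SetBlobTier (rehydrate) per blob
simple_ops = FILES * 2                    # 200

for ops in (chunked_ops, simple_ops):
    read_cost = READ_PRICE_PER_100K * ops / 100_000
    print(ops, round(retrieval + read_cost, 2))
# 2600 1.65
# 200 0.33
```

    Either way, the per-operation cost is dwarfed by the per-GB retrieval charge at this scale, which may be the more useful point for the client.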

    • Edited by LorenZ_DA Thursday, March 26, 2020 10:40 PM
    Thursday, March 26, 2020 10:16 PM
  • @LorenZ_DA Apologies for the delay in responding here! I also see you have posted a similar query on the SO forum.

    If you are still finding any difficulties, please let us know.

    Wednesday, April 15, 2020 5:22 PM
  • Since we have not heard back from you: if this issue still needs attention, I would like to work more closely on it and would recommend you contact support. If you have a support plan, I request that you file a support ticket; otherwise, please let us know and we will try to help you get one-time free technical support. In that case, could you send an email to AzCommunity[at]Microsoft[dot]com referencing this thread as well as your subscription ID? Please mention "ATTN subm" in the subject field. Thank you for your cooperation on this matter; I look forward to your reply.

    Thursday, May 7, 2020 10:18 AM