How Azure Backup storage and retention work

  • Question

  • Hi, I've read that Azure Backup is incremental. How is it managed?

    This is important for calculating the cost.

    Let's use a daily backup with 7 days of retention, 1000 GB of VM data, and 5 GB of daily change as an example.

    - Day 1: full backup of 1000 GB

    - Day 2 to Day 7: incrementals of 5 GB each, so I reach 1030 GB of storage and the retention reaches 7 days

    - Day 8: full backup of 1030 GB (total amount of data is 2060 GB, because if the first full were deleted I wouldn't have 7 days of retention)

    - Day 9 to 14: incrementals of 5 GB, so the total amount of data is 2090 GB and the retention is 14 days (even though I chose only 7)

    - Day 15: incremental of 5 GB and deletion of the first full backup, so I have 7 days of retention and the data goes down to 1065 GB

    Is this right, or is the process different?
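
    Here is the day-by-day arithmetic I am assuming, as a quick Python sketch (purely illustrative; the "second full backup on day 8" is my guess about the mechanism, not documented Azure behaviour):

        # Day-by-day storage under my assumed model: full on day 1,
        # a second full on day 8 (because the day-1 full cannot be deleted
        # while incrementals still depend on it), and the whole first
        # chain pruned on day 15.
        FULL_GB, DELTA_GB = 1000, 5

        points = {1: FULL_GB}                       # day 1: full backup
        for day in range(2, 8):                     # days 2-7: incrementals
            points[day] = DELTA_GB
        points[8] = FULL_GB + 6 * DELTA_GB          # day 8: assumed second full (1030 GB)
        for day in range(9, 16):                    # days 9-15: incrementals
            points[day] = DELTA_GB

        def stored_gb(up_to_day, pruned_days=()):
            return sum(size for day, size in points.items()
                       if day <= up_to_day and day not in pruned_days)

        print(stored_gb(7))                            # 1030 GB after the first week
        print(stored_gb(14))                           # 2090 GB while both chains exist
        print(stored_gb(15, pruned_days=range(1, 8)))  # 1065 GB once days 1-7 are pruned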

    Cheers

    Friday, March 4, 2016 5:10 PM

All replies

  • Hello,

     

    We are working on the query and will get back to you soon on this. I apologize for the inconvenience and appreciate your time and patience in this matter.

     

    Best Regards,

    Kamalakar K

    Saturday, March 5, 2016 2:16 PM
  • Hi Marco,

    The backup vault stores the backed-up data that has been transferred up to the point of a cancellation.
    Azure Backup uses a checkpoint mechanism, so the backup data gets check-pointed occasionally during the backup and the next backup process can validate the integrity of the files.
    The next backup triggered would be incremental over the data that has been backed up previously.
    This provides better utilization of bandwidth, so that you do not need to transfer the same data repeatedly.

    This means that after the first backup, the data backed up from day 2 to day 7 would only be the new data added in those 6 days, not the data that has already been backed up.
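
    To illustrate what "incremental over the data that has been backed up previously" means in practice, here is a generic block-diff sketch in Python (purely conceptual; this is not Azure Backup's actual on-the-wire mechanism):

        # Conceptual sketch: hash fixed-size blocks of the disk and compare
        # against the previous recovery point; only changed or new blocks
        # need to be transferred. (Not Azure Backup's real implementation.)
        import hashlib

        BLOCK_SIZE = 4 * 1024 * 1024   # 4 MiB blocks, arbitrary choice

        def block_hashes(data):
            return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
                    for i in range(0, len(data), BLOCK_SIZE)]

        def changed_blocks(previous, current):
            prev, cur = block_hashes(previous), block_hashes(current)
            return [i for i, h in enumerate(cur)
                    if i >= len(prev) or prev[i] != h]

        # Only the block indices returned here are uploaded on the next backup.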

    Regards,
    Malar.

    Monday, March 7, 2016 7:54 AM
  • Hi Malar,

    OK for days 2 to 7, but what happens on day 8? The backup has to delete the first backup because it is out of the retention time, but it is a full backup, so it cannot be deleted.

    Cheers

    Marco

    Monday, March 7, 2016 2:43 PM
  • Hello Marco,

    Azure Backup does a full backup only once; the rest of the backups are always incremental. Also, Azure Backup does compression. Hence, in your example, the first 1000 GB full backup can take less than 1000 GB, and there is no capability for a second full backup.

    Further read @ FAQ

    Best Regards

    Sadiqh Ahmed

    ________________________________________________________________________________________________________________

    If a post answers your question, please click Mark As Answer on that post and Vote as Helpful.


    Monday, March 7, 2016 3:34 PM
  • Incremental backups need a full backup to be restored. If only the first backup is a full, what happens after a year with a 5-day retention policy? It is impossible that Azure doesn't take a second full backup; otherwise it could never delete the chain of incrementals taken during the year.

    I've already read the FAQs, and the Q&A most similar to my question is this:

    Q9. If each recovery point is like a full point, does it impact the total billable backup storage?
    A9. Typical long term retention point products store backup data as full points. However, these are storage inefficient but are easier and faster to restore. Incremental copies are storage efficient but require you to restore a chain of data which impacts your recovery time. Azure Backup’s unique storage architecture gives you the best of both worlds by optimally storing data for fast restores and incurring low storage costs. This approach ensures that your (ingress and egress) bandwidth is efficiently used, storage is kept to the minimum and the time taken to recover is kept to the minimum.

    But that is a very generic answer. What does "Azure Backup’s unique storage architecture gives you the best of both worlds by optimally storing data for fast restores and incurring low storage costs" mean? How does this "unique storage architecture" work? Does any documentation about it exist?

    Cheers

    Marco


    Wednesday, March 9, 2016 3:15 PM
  • Hello Marco,

    I think the best way to think about this is the Express full concept, where changed data is merged into the baseline data set to create a new (Express) full backup set. Your understanding is that the full backup is created on day 1 and, with 7-day retention, the entire ‘full’ baseline data set has to be pruned on day 8. However, that’s not how it works. Part of (or likely *most of*) that original data set from day 1 will be retained on day 8. By merging the ‘day 1’ data with the ‘day 2’ changes, a new ‘full’ is created representing the entire data set as it looked on day 2, and that is the new baseline for the day 3 through day 8 incrementals. On day 9, the baseline becomes the data set as it looked on day 3, and the process continues.
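
    A rough sketch of that merge idea in Python (my own simplification, just to make the roll-forward concrete; the data structures are not Azure Backup internals):

        # "Express full" roll-forward: when the oldest day falls out of
        # retention, merge its changes into the baseline instead of breaking
        # the incremental chain. Unchanged blocks are never stored twice.
        RETENTION_DAYS = 7

        baseline = {"blockA": "day1", "blockB": "day1"}   # full data set as of day 1
        incrementals = []                                  # per-day change dictionaries

        def add_backup(day_changes):
            incrementals.append(day_changes)
            # Keep at most RETENTION_DAYS recovery points (baseline + incrementals).
            while len(incrementals) + 1 > RETENTION_DAYS:
                oldest = incrementals.pop(0)
                baseline.update(oldest)   # baseline now represents the next day

        for day in range(2, 10):                           # days 2-9: one changed block per day
            add_backup({"blockB": "day%d" % day})

        # After day 9 the baseline represents day 3: blockB's day-1 copy is gone,
        # but blockA was stored once on day 1 and never again.
        print(baseline, len(incrementals))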

    Hope this answers your query!

    Best Regards

    Sadiqh Ahmed

    ________________________________________________________________________________________________________________

    If a post answers your question, please click Mark As Answer on that post and Vote as Helpful.

    Thursday, March 10, 2016 11:02 AM
  • Hi Sadiqh,

    Thanks for your explanation. As I understand it, this method limits the data transfer because only the size of an incremental is transferred, but the data stored (which we pay for) is like a full backup every day, reduced by compression.

    It would be nice to have a document where this is explained in depth, but I cannot find one.

    Cheers

    Marco

    Thursday, March 10, 2016 11:23 AM
  • Hi,

    This is exactly the same question I have.

    Is it possible to monitor the used capacity within the backup vault?

    Regards

    Susi

    Tuesday, March 15, 2016 1:17 PM