On-Prem Archive to Azure

    Question

  • Hello,

    I am trying to understand the various Azure Storage options available to support the following scenario.

    I have around 27 TB of archived data, which consists of files and folders. Users access this data in a read-only state on-prem. On a monthly basis we have various procedures that write additional data to this server/storage for closed projects. The content needs to remain on-prem for read-only access. Every month we perform full backups to tape and send them off-site in case we lose the on-prem dataset in a disaster.

    I would like to eliminate the use of tapes altogether and send the data to Azure. We are currently using DPM and sending production data to Azure. The archive data I am looking to send to Azure is for protection only. Reading of the archive data will take place on-prem.

    I am trying to figure out what option would best suit this archive data. I am unsure if the blob storage option, cold storage, would work for this data. Should I look at a StorSimple appliance? Also, what recovery options are available? In the event of a disaster affecting my on-prem data, if I need to pull all of this data back, what options would allow Azure to restore the data to physical drives and ship them to me, and what is the cost associated with that?

    Based on some of the calculators, I am unclear on what "Put Operations" refers to. I am trying to get a solid understanding of the costs associated with moving this data from tape to Azure storage.

    The other thing is sending 27+ TB to Azure. Is there a more cost-effective option to seed this data to Azure?

    I greatly appreciate any suggestions and input.

    Thank You

    Monday, February 20, 2017 4:09 PM

All replies

  • You can use the Microsoft Azure Import/Export service to transfer a substantial amount of data (27+ TB in your case) to Azure Blob storage by shipping HDDs to an Azure data center.

    You can recover your data stored in Blob storage and have it delivered to your on-premises location in the unfortunate event of a disaster at your on-premises site (for smaller restores, the data can also be pulled back over the network; see the sketch at the end of this reply).

    See this article for detailed information on Azure Import/Export Pricing.

    Alternatively, StorSimple can also be a good option, depending on the kind of infrastructure set up on-premises. You may review the requirements for a StorSimple Virtual Array here.

    Also, refer to the architecture guide for archiving on-premises data to the Azure cloud for a broader understanding of the solutions offered.
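
    If a network restore is practical for part of the data, the blobs can also be pulled back programmatically. The following is only a minimal sketch, assuming the azure-storage Python SDK and placeholder values for the account, key, container name, and restore path:

        # Minimal sketch of a network restore: download every blob in the
        # container back into a local folder tree. The account, key,
        # container name, and restore path below are placeholders.
        import os
        from azure.storage.blob import BlockBlobService

        service = BlockBlobService(account_name="<account>", account_key="<key>")
        restore_root = r"D:\Restore"                  # placeholder path

        for blob in service.list_blobs("archive"):    # "archive" is a placeholder container name
            target = os.path.join(restore_root, blob.name.replace("/", os.sep))
            os.makedirs(os.path.dirname(target), exist_ok=True)
            service.get_blob_to_path("archive", blob.name, target)
            print("Restored", blob.name)

    For pulling the full 27+ TB back after a disaster, however, the Import/Export export option (having the data shipped to you on drives) is usually more practical than downloading everything over the network.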

    • Proposed as answer by Md Shihab Tuesday, February 21, 2017 7:31 AM
    • Marked as answer by IT_Vision Friday, February 24, 2017 7:34 PM
    Tuesday, February 21, 2017 7:28 AM
  • Hello Md,

    Thank you very much for the response. Since we already have an on-prem virtual machine as our file server hosting the archive content, is there really any benefit to implementing a StorSimple Virtual Array as a file server?

    I am thinking that I can utilize my existing VM and use the AzCopy command to send the data to Azure cool Blob storage (rough sketch below).

    From a Storage "Blob" perspective, is it correct to say I would have a single "Storage Blob" to hold this archive content? My reason for asking is that I also see various commands for things like downloading a single blob or downloading all blobs. I am just trying to grasp the concept of "Blobs".
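
    Something along these lines is what I have in mind for the copy itself, either with AzCopy or with a small script using the azure-storage Python SDK. This is only a rough sketch; the account name, key, container name, and local path are placeholders, and it assumes the storage account was created as a Blob storage account with the Cool access tier:

        # Rough sketch: walk the on-prem archive share and upload each file
        # as its own block blob. Account, key, container, and path are
        # placeholders; the account is assumed to use the Cool access tier.
        import os
        from azure.storage.blob import BlockBlobService

        service = BlockBlobService(account_name="<account>", account_key="<key>")
        service.create_container("archive")           # no-op if it already exists

        local_root = r"D:\Archive"                    # placeholder path
        for dirpath, _dirs, files in os.walk(local_root):
            for name in files:
                local_path = os.path.join(dirpath, name)
                # Use the relative path as the blob name so the on-prem
                # folder structure is preserved inside the container.
                blob_name = os.path.relpath(local_path, local_root).replace("\\", "/")
                # Each file becomes one blob; each upload is billed as at
                # least one PUT (write) operation.
                service.create_blob_from_path("archive", blob_name, local_path)
                print("Uploaded", blob_name)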

    Thank You

    Tuesday, February 21, 2017 2:26 PM
  • Some of the benefits of using the StorSimple Virtual Array are that it provides cloud backup, fast restore, item-level recovery and disaster recovery.

    You wouldn't just have a single blob if you're storing different items from your on-premises data. For example, if you have ten text files on-premises and you copy all of them into your storage account, you can place them in a single container or across multiple containers, and each file would be a separate blob (see the sketch below). You can refer to this document for more information on Blob storage.

    Since your data set is quite large, have you considered using the Microsoft Azure Import/Export service?
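
    To make the blob idea a bit more concrete, here is a small illustrative sketch (azure-storage Python SDK, with placeholder account, key, and container names) that lists what ends up in a container after such a copy: one blob per file, with the folder path kept as part of the blob name.

        # Illustration only: list the blobs in a container, one per uploaded
        # file. Account name, key, and container name are placeholders.
        from azure.storage.blob import BlockBlobService

        service = BlockBlobService(account_name="<account>", account_key="<key>")

        for blob in service.list_blobs("archive"):
            # Prints names like "2016/ProjectA/report.txt"; the folder path is
            # simply part of the blob name, there is no separate folder object.
            print(blob.name, blob.properties.content_length)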

    • Marked as answer by IT_Vision Friday, February 24, 2017 7:34 PM
    Thursday, February 23, 2017 8:12 AM