Windows Azure deployment data loss

    Question

  • Hi,

     

    We have published a web application and used the Windows Azure hosting service for deployment.

    We have file upload functionality. We added a directory to the solution where we store all the uploaded files.

    This functionality was working fine, but I have now found that at some point the Windows Azure instance was automatically refreshed. Because of that, we lost all the uploaded documents.

    We are not using the storage services for uploading documents. Because our application was already developed, we only deployed it to the cloud, using the Windows Azure portal.

    Could anyone please help me with this issue?

    How can I get back my uploaded documents, and how can we avoid this scenario in the future?

    Saturday, October 01, 2011 11:19 AM

Answers

  • I'm sorry to be the one to tell you this, but I don't think you'll be able to get back your uploaded documents. To keep this scenario from recurring, you'll need to store your data differently.

    The local disk storage of Compute VMs (whether Web Role, Worker Role, or VM Role) is not persistent. It can go away at any time. The data center has the right to move and re-create your VMs whenever it deems it necessary. This could happen in response to a hardware failure, or simply because the data center needs to be reorganized. When this happens, you lose your VM disk files and go back to your deployment image. It is only a matter of time before this happens. This is normal behavior for cloud computing compute instances.

    How do you keep your data in the cloud persistent? It's best to use a persistent, managed service such as Windows Azure Storage or a SQL Azure database. That gives you storage you can depend on, backed by triple redundancy.
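
    As an illustration (this sketch is not from the original thread), a minimal blob upload with the Windows Azure SDK 1.x StorageClient library might look like the following; the connection string, container name, and file paths are all hypothetical:

    ```csharp
    using System;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.StorageClient;

    class UploadExample
    {
        static void Main()
        {
            // Hypothetical connection string; substitute your own
            // storage account name and key.
            CloudStorageAccount account = CloudStorageAccount.Parse(
                "DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey");

            CloudBlobClient client = account.CreateCloudBlobClient();

            // Create the container on first use; later calls are no-ops.
            CloudBlobContainer container = client.GetContainerReference("uploads");
            container.CreateIfNotExist();

            // Store the user's file as a blob instead of on the local VM disk,
            // so it survives instance recycling.
            CloudBlob blob = container.GetBlobReference("documents/report.docx");
            blob.UploadFile(@"C:\temp\report.docx");
        }
    }
    ```

    In the upload handler of the web application, you would write the posted file to a blob like this rather than saving it under the site's folder on the VM.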

    If you are unable to move to persistent storage, another approach that may work (if the data files are read-only) is to add the data files to your deployment package and upgrade your VM instances with the package.


    David Pallmann GM Application Development, Neudesic Windows Azure MVP
    Saturday, October 01, 2011 11:57 AM
  • Hi,

    Please set "Copy to Output Directory" to "Copy always" for all those files. But note that this approach does not work if you need to let your clients upload files dynamically. To support that scenario, you have to use Windows Azure storage.
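
    For reference, this is roughly what that setting produces in the web project's .csproj file (the folder and file names here are hypothetical):

    ```xml
    <!-- Include the data files as Content and copy them to the output
         directory, so they end up inside the .cspkg deployment package. -->
    <ItemGroup>
      <Content Include="UploadedFiles\seed-data.xml">
        <CopyToOutputDirectory>Always</CopyToOutputDirectory>
      </Content>
    </ItemGroup>
    ```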
     
    Windows Azure Drive storage will help you migrate existing code to Windows Azure, as it allows you to use the standard NTFS API to work with files, while uploading the files to blob storage automatically in the background. But note that only a single instance can use a writable drive at a time. If you need to let multiple instances write to Azure storage simultaneously, you'll have to use blob storage directly.
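
    As a rough sketch (again, not from the original thread), mounting a Windows Azure Drive from role code with the SDK 1.x CloudDrive API could look like this; the local resource name, page blob name, and sizes are hypothetical, and this only runs inside a role with a live storage account:

    ```csharp
    using System;
    using System.IO;
    using Microsoft.WindowsAzure;
    using Microsoft.WindowsAzure.ServiceRuntime;
    using Microsoft.WindowsAzure.StorageClient;

    class DriveExample
    {
        static void Run(CloudStorageAccount account)
        {
            // Initialize the local drive cache from a LocalResource
            // declared in the service definition (name is hypothetical).
            LocalResource cache = RoleEnvironment.GetLocalResource("DriveCache");
            CloudDrive.InitializeCache(cache.RootPath, cache.MaximumSizeInMegabytes);

            // The drive is backed by a page blob in blob storage.
            CloudDrive drive = account.CreateCloudDrive("drives/uploads.vhd");
            drive.Create(64); // size in MB; throws if the drive already exists

            // Mount returns a local path (a drive letter). Only one instance
            // at a time can mount the drive writable.
            string root = drive.Mount(25, DriveMountOptions.None);
            File.WriteAllText(Path.Combine(root, "hello.txt"), "hello");
        }
    }
    ```

    Code that uses File/Directory APIs against the mounted path then keeps working unchanged, while the data lives in a page blob.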

     

    Best Regards,

    Ming Xu.


    Please mark the replies as answers if they help or unmark if not.
    If you have any feedback about my replies, please contact msdnmg@microsoft.com.
    Microsoft One Code Framework
    Wednesday, October 05, 2011 11:37 AM
    Moderator

All replies

  • I'm sorry to be the one to tell you this, but I don't think you'll be able to get back your uploaded documents. To keep this scenario from recurring, you'll need to store your data differently.

    The local disk storage of Compute VMs (whether Web Role, Worker Role, or VM Role) is not persistent. It can go away at any time. The data center has the right to move and re-create your VMs whenever it deems it necessary. This could happen in response to a hardware failure, or simply because the data center needs to be reorganized. When this happens, you lose your VM disk files and go back to your deployment image. It is only a matter of time before this happens. This is normal behavior for cloud computing compute instances.

    How do you keep your data in the cloud persistent? It's best to use a persistent, managed service such as Windows Azure Storage or a SQL Azure database. That gives you storage you can depend on, backed by triple redundancy.

    If you are unable to move to persistent storage, another approach that may work (if the data files are read-only) is to add the data files to your deployment package and upgrade your VM instances with the package.


    David Pallmann GM Application Development, Neudesic Windows Azure MVP
    Saturday, October 01, 2011 11:57 AM
  • Thank you David.
    Monday, October 03, 2011 1:06 PM
  • Hi David,

    >>>If you are unable to move to persistent storage, another approach that may work (if the data files are read-only) is to add the data files to your deployment package and upgrade your VM instances with the package.

    I have tried adding the folder and files to our web application project, and we publish using the cloud project.

    But the folders and files are not included in the deployment package (.cspkg and .cscfg).

     

    Could you please help me resolve this issue?

     

    Wednesday, October 05, 2011 7:15 AM
  • Hi,

    Please set "Copy to Output Directory" to "Copy always" for all those files. But note that this approach does not work if you need to let your clients upload files dynamically. To support that scenario, you have to use Windows Azure storage.
     
    Windows Azure Drive storage will help you migrate existing code to Windows Azure, as it allows you to use the standard NTFS API to work with files, while uploading the files to blob storage automatically in the background. But note that only a single instance can use a writable drive at a time. If you need to let multiple instances write to Azure storage simultaneously, you'll have to use blob storage directly.

     

    Best Regards,

    Ming Xu.


    Please mark the replies as answers if they help or unmark if not.
    If you have any feedback about my replies, please contact msdnmg@microsoft.com.
    Microsoft One Code Framework
    Wednesday, October 05, 2011 11:37 AM
    Moderator