Moving a Batch of Files from Azure Storage to an Azure File Share

  • Question

  • Hi,

    I have a scenario where I need to copy files older than 3 days from my Azure VM to an Azure file share. I used AzCopy in my scenario, and it looks like below:

    $Date = (Get-Date).ToString('yyyy-MM-dd')
    $logname = "Log_$Date"
    $Logfile = "filesystem::G:\ds\$logname.log"

    # Start each run with a fresh log file
    If (Test-Path $Logfile)
    {
        Remove-Item $Logfile
    }

    # Appends a timestamped line to the log file
    Function LogWrite
    {
        Param ([string]$logstring)
        $DateTime = (Get-Date -Format s)
        Add-Content $Logfile -Value "${DateTime}:$logstring"
    }

    # Only files last written before this cut-off are copied
    $time = (Get-Date).AddDays(-2)
    $Sourcepath = "filesystem::\\Mdfg\e$\Src"

    $files = Get-ChildItem -Path:$Sourcepath -Recurse |
        Where-Object { ! $_.PSIsContainer -and $_.LastWriteTime -lt $time } |
        Select-Object FullName

    foreach ($file in $files)
    {
        # $file stringifies as @{FullName=<path>}, so split out the path portion
        $fileName = $file -split ("=")
        $fileName1 = $fileName[1] -split ("}")
        $fileNamePath = $fileName1[0]

        try
        {
            $outputFile = Split-Path $fileNamePath -Leaf
            $outputPath = Split-Path $fileNamePath

            # Rebuild the destination URL so the file share mirrors the source folder structure
            $Dst = "https://xx.file.core.windows.net/xx/"
            if ($outputPath -ne $Sourcepath.Replace("filesystem::", ""))
            {
                $DstTemp = $outputPath.Replace($Sourcepath.Replace("filesystem::", ""), "")
                $Dst = $Dst + $DstTemp.Replace('\', '/').TrimStart('/') + '/'
            }

            # One AzCopy call per file; /XO and /XN skip files whose destination copy is older or newer
            $TransferResult = & "C:\Program Files (x86)\Microsoft SDKs\Azure\AzCopy\AzCopy.exe" /Source:$outputPath /Dest:$Dst /DestKey:xxxxx== /Pattern:$outputFile /XO /XN
            ## Write-Host $TransferResult

            # Parse the transfer/success/skipped/failed counts from AzCopy's summary output
            [int]$fileTransfer = $TransferResult[2].Split(":")[1].Trim()
            [int]$fileSuccess = $TransferResult[3].Split(":")[1].Trim()
            [int]$fileSkipped = $TransferResult[4].Split(":")[1].Trim()
            [int]$fileFailed = $TransferResult[5].Split(":")[1].Trim()

            if ($fileTransfer -gt 0 -and $fileSuccess -gt 0)
            {
                LogWrite "Successfully copied $fileNamePath"
                # Delete the source file only after a successful copy
                $fileNamePath = "filesystem::" + $fileNamePath
                Remove-Item -Path $fileNamePath
                $filenamepath1 = $fileNamePath.Replace("filesystem::", "")
                LogWrite "Successfully deleted $filenamepath1"
            }
            elseif ($fileTransfer -eq 0 -and $fileSuccess -eq 0 -and $fileSkipped -eq 0 -and $fileFailed -eq 0)
            {
                LogWrite "File is already present: $fileNamePath"
            }
            else
            {
                LogWrite "There is a problem with the copy; please find the log below"
                LogWrite $TransferResult
            }
        }
        catch
        {
            $errorMessage = $error[0].Exception.Message
            LogWrite $errorMessage
        }
    }

    But I have around 50K files, and while the script works, it takes more than 10 hours to copy and then delete them.

    Is there an alternate way? Using other cmdlets is also fine, or any suggestion you could give after looking at the script?
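
    For example, would something along these lines work better? This is only a rough, untested sketch on my side, assuming the Azure.Storage PowerShell module is installed; the storage account name, key and share name below are placeholders.

    # Rough sketch only - Azure.Storage module assumed; account, key and share are placeholders
    Import-Module Azure.Storage

    $ctx    = New-AzureStorageContext -StorageAccountName "xx" -StorageAccountKey "xxxxx=="
    $share  = "xx"
    $root   = '\\Mdfg\e$\Src'
    $cutoff = (Get-Date).AddDays(-3)

    Get-ChildItem -Path $root -Recurse |
        Where-Object { ! $_.PSIsContainer -and $_.LastWriteTime -lt $cutoff } |
        ForEach-Object {
            # Path of the file relative to the source root
            $relative   = $_.FullName.Substring($root.Length).TrimStart('\')
            $folder     = Split-Path $relative -Parent
            $remotePath = $relative.Replace('\', '/')

            # Azure Files needs every directory level to exist, so create them one by one
            if ($folder) {
                $current = ""
                foreach ($part in ($folder -split '\\')) {
                    $current = if ($current) { "$current/$part" } else { $part }
                    New-AzureStorageDirectory -ShareName $share -Path $current -Context $ctx -ErrorAction SilentlyContinue | Out-Null
                }
            }

            # Upload the file, then delete the local copy
            Set-AzureStorageFileContent -ShareName $share -Source $_.FullName -Path $remotePath -Context $ctx -Force
            Remove-Item -Path $_.FullName
        }

    That would at least avoid starting a new AzCopy process for every file, but I have not measured how it compares on throughput.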

    Thanks,

    Sujith.



    Thursday, August 10, 2017 2:18 PM

All replies

  • You may refer to the article below to move data from the Azure VM to an Azure file share:

    Migrating Data to Microsoft Azure Files
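
    For example, a common pattern for this (not necessarily the exact steps in the article) is to mount the file share over SMB with the storage account key and let Robocopy do the age filtering and the move in one pass. A rough sketch, with placeholder storage account name, share name and key:

    # Mount the Azure file share as a drive letter (account, share and key are placeholders)
    net use Z: \\mystorageaccount.file.core.windows.net\myshare /u:AZURE\mystorageaccount xxxxx==

    # Move files older than 3 days while keeping the folder structure:
    # /MINAGE:3 skips files newer than 3 days, /MOV deletes each source file after a successful copy
    robocopy \\Mdfg\e$\Src Z:\ /E /MINAGE:3 /MOV /R:2 /W:5 /LOG:G:\ds\robocopy.log

    # Disconnect the drive when done
    net use Z: /delete

    Because Robocopy skips files that are already identical on the destination, the job can be re-run safely.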

    Do click on "Mark as Answer" on the post that helps you, this can be beneficial to other community members.

    • Proposed as answer by vikranth s Sunday, August 13, 2017 12:40 PM
    Thursday, August 10, 2017 7:38 PM
  • Hi Sujith,

    The best way to copy a large amount of data is to use DFSR. First, pre-stage the data with Robocopy; once the data has been copied from the local file server to Azure, start DFSR and wait for the initial sync. Once the sync is complete, you can use a DFS namespace to point users to the Azure share and later retire the on-premises file share. The speed of the copy will depend on how much throttling you apply and on your network bandwidth (a sample pre-stage command is sketched below).

    This method will not have any downtime for the end users.
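
    As a rough illustration of the pre-stage step only (the server names, paths and throttle values below are placeholders to tune against your bandwidth):

    # Pre-stage the data with Robocopy before enabling DFSR
    # /IPG:50 inserts a 50 ms gap between packets to throttle bandwidth use;
    # on a fast link you could drop /IPG and use /MT:8 instead (the two cannot be combined)
    robocopy \\fileserver\share \\stagingserver\share /E /COPY:DAT /DCOPY:T /IPG:50 /R:2 /W:5 /LOG+:C:\logs\prestage.log

    Once the pre-stage pass has finished, the DFSR initial sync only has to reconcile the differences, which is much faster than seeding everything over the wire.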

    • Proposed as answer by vikranth s Sunday, August 13, 2017 12:40 PM
    Friday, August 11, 2017 2:28 PM