Back up a farm

  • Question

  • I am trying to back up a farm using Central Administration. Errors occur on the SharePoint_Config object in the OnBackup event: some .bak files (e.g. 000000B6.bak, 000000B7.bak, 000000BA.bak, 000000BC.bak ...) cannot be found under the folder spbr0002.

    Does anybody have any ideas? Thanks!
    Tuesday, February 23, 2010 5:21 PM


All replies

  • If your farm spans multiple servers, make sure you are using a UNC path to a share, e.g. \\backupserver\sharepointbackups.

    If you use a local path such as c:\backup, the SP and SQL servers will each try to put their files on their OWN local boxes.

    Tuesday, March 23, 2010 9:24 PM
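  • The distinction above can be checked mechanically. A minimal Python sketch (the `is_unc` helper is illustrative, not part of any SharePoint tooling):

    ```python
    def is_unc(path: str) -> bool:
        """Return True for UNC paths like \\\\server\\share, False for local drive paths."""
        return path.startswith("\\\\")

    # A local path means every server writes to its OWN E: drive:
    print(is_unc(r"E:\MOSS2010\backup"))                 # False -> files scattered across boxes
    # A UNC path points every server at the same shared folder:
    print(is_unc(r"\\backupserver\sharepointbackups"))   # True -> one common location
    ```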
  • Yes, use a UNC path and make sure that location is SHARED so that the SP and SQL servers can all write to it.
    SharePoint Architect || My Blog
    • Marked as answer by hulu0808 Monday, June 21, 2010 4:56 PM
    Wednesday, March 24, 2010 2:58 AM
  • When backing up a single web application, only one .bak file (e.g. 000000BB.bak) cannot be found; the other .bak files are there. During a farm backup, more .bak files are missing. So I don't think the path is the problem.
    Wednesday, March 24, 2010 12:58 PM
  • Which user or group should be granted access to the backup folder so that the SP and SQL servers can write to it? Only a few .bak files are missing, not all of them. Is it really a permissions problem?
    Wednesday, March 24, 2010 1:06 PM
  • Can you post at least part of the backup.log file (just need the error message and stack trace).
    Wednesday, March 24, 2010 2:12 PM
  • I am using the German version; I have translated the log into English below. There are a few warnings and one error in the log.

    [24.03.2010 13:09:12] Verbose: [WSS_Content] SQL Server connection string: Data Source=10.156.84.207;Initial Catalog=WSS_Content;Integrated Security=True;Enlist=False;Connect Timeout=15.
    [24.03.2010 13:09:12] Verbose: [WSS_Content] SQL command started at: 24.03.2010 13:09:12. This command may take a while to complete, and no notification is shown.
    [24.03.2010 13:09:12] Verbose: [WSS_Content] SQL Server command: BACKUP DATABASE [WSS_Content] TO DISK=@db_loc WITH NAME=@db_name, STATS=5, NOINIT, NOSKIP, NOFORMAT, NOREWIND
     @db_name=WSS_Content, @db_loc=E:\MOSS2010\backup\Webanwendung\spbr0000\000000BB.bak
    [24.03.2010 13:09:12] Verbose: [WSS_Content] The SQL command timeout is set to 5.60 hours.
    [24.03.2010 13:09:12] Verbose: Starting object: ExpirationProcessing.
    [24.03.2010 13:09:13] Progress: [ExpirationProcessing] 50 percent complete.
    [24.03.2010 13:09:14] Verbose: Starting object: SchedulingUnpublish.
    [24.03.2010 13:09:15] Warning: [WSS_Content] Cannot open backup device 'E:\MOSS2010\backup\Webanwendung\spbr0000\000000BB.bak'. Operating system error 3 (The system cannot find the path specified.).
    BACKUP DATABASE is terminating abnormally.
    [24.03.2010 13:09:15] Progress: [SchedulingUnpublish] 50 percent complete.
    [24.03.2010 13:09:15] Verbose: Starting object: job-solution-daily-resource-usage.
    [24.03.2010 13:09:17] Debug: [WSS_Content]    at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
       at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj)
       at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj)
       at System.Data.SqlClient.SqlDataReader.ConsumeMetaData()
       at System.Data.SqlClient.SqlDataReader.get_MetaData()
       at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString)
       at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async)
       at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, DbAsyncResult result)
       at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)
       at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)
       at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior)
       at Microsoft.SharePoint.Administration.Backup.SPSqlBackupRestoreHelper.RunCommand(SqlCommand sqlCommand, SPBackupRestoreInformation args, Boolean throwOnRestart, Boolean& restart, SPSqlBackupRestoreConnection connection)
    [24.03.2010 13:09:17] Progress: [job-solution-daily-resource-usage] 50 percent complete.
    [24.03.2010 13:09:17] Warning: [WSS_Content] Error in the SQL command. The command must be restarted. It will be restarted a total of three times before an exception is thrown.

    [24.03.2010 13:09:51] Verbose: [WSS_Content] Attempting to re-establish a SQL Server connection.
    [24.03.2010 13:09:51] Verbose: [WSS_Content] SQL command started at: 24.03.2010 13:09:51. This command may take a while to complete, and no notification is shown.
    [24.03.2010 13:09:51] Verbose: [WSS_Content] SQL Server command: BACKUP DATABASE [WSS_Content] TO DISK=@db_loc WITH NAME=@db_name, STATS=5, NOINIT, NOSKIP, NOFORMAT, NOREWIND
     @db_name=WSS_Content, @db_loc=E:\MOSS2010\backup\Webanwendung\spbr0000\000000BB.bak
    [24.03.2010 13:09:51] Verbose: [WSS_Content] The SQL command timeout is set to 5.60 hours.
    [24.03.2010 13:09:51] Verbose: Starting object: VariationsCreateSite.
    [24.03.2010 13:09:52] FatalError: Error on the WSS_Content object in the OnBackup event. For more information, see the error log in the backup directory.
     SqlException: Cannot open backup device 'E:\MOSS2010\backup\Webanwendung\spbr0000\000000BB.bak'. Operating system error 3 (The system cannot find the path specified.).
    BACKUP DATABASE is terminating abnormally.

    Wednesday, March 24, 2010 3:04 PM
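  • The warning in the log describes a bounded-retry pattern: restart the failing SQL command up to three times, then surface the exception. A generic Python sketch of that pattern (illustrative only, not SharePoint's actual code):

    ```python
    def run_with_retries(command, attempts: int = 3):
        """Run `command`; on failure retry until `attempts` runs are used up,
        then re-raise the last exception, as the backup log describes."""
        last_error = None
        for attempt in range(1, attempts + 1):
            try:
                return command()
            except OSError as e:
                last_error = e   # e.g. OS error 3: path not found on the SQL box
        raise last_error

    # A command that always fails, like BACKUP DATABASE writing to a missing path:
    calls = []
    def failing_backup():
        calls.append(1)
        raise OSError(3, "The system cannot find the path specified")

    try:
        run_with_retries(failing_backup)
    except OSError:
        print(f"gave up after {len(calls)} attempts")   # gave up after 3 attempts
    ```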
  • You are using "e:\moss2010\backup" as your backup path. You should use a UNC path instead: use Windows to share that location out, so you can use \\boxname\backup.
    • Marked as answer by hulu0808 Monday, June 21, 2010 4:56 PM
    Friday, March 26, 2010 6:47 PM
  • Thank you! It works!

    I still have a question. I tried to back up the whole farm, but it reports an error that it needs more space: over 600 GB. Does it really need that much space?

    Monday, March 29, 2010 1:05 PM
  • That depends: how much content do you have? You can use the following command in the SharePoint 2010 Management Shell to see how large your content databases are:

     Get-SPContentDatabase | format-table Name, DiskSizeRequired -auto

    You still have things like your service application databases, and your index file to account for. But I'm guessing your content databases make up the bulk of what you're trying to back up.

    tk

    Monday, March 29, 2010 3:01 PM
  • You can also use stsadm -o backup -force to have it skip the check for disk space.

    The problem is that there are two different disk-space measurements you can use:

    1. The space reserved by SQL Server on disk;

    2. The actual disk space taken up by SQL Server.

    The second is always smaller than the first, but it can be expensive to calculate, and the actual backup size is usually smaller than both of these. On average the estimate is off by about 10%.

    Monday, March 29, 2010 4:31 PM
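  • That pre-flight check and the -force bypass can be pictured with a small Python sketch (the numbers are made up and this is not SharePoint's actual logic, just the idea described above):

    ```python
    def space_check(reserved_bytes: int, free_bytes: int, force: bool = False) -> bool:
        """Pass the pre-backup check only if the target volume can hold the
        RESERVED size (measurement 1); force=True skips the check entirely,
        as stsadm -o backup -force does."""
        if force:
            return True
        return free_bytes >= reserved_bytes

    GB = 1024 ** 3
    reserved = 600 * GB   # measurement 1: space reserved by SQL on disk
    actual = 500 * GB     # measurement 2: actual space used (always smaller)
    free = 550 * GB       # free space on the backup target

    print(space_check(reserved, free))              # False: room for actual, not reserved
    print(space_check(reserved, free, force=True))  # True: check skipped, backup proceeds
    ```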
  • I've improved on this PowerShell snippet. Use this instead:

    Get-SPDatabase | sort-object disksizerequired -desc | format-table Name, @{Label ="Size in MB"; Expression = {$_.disksizerequired/1024/1024}}

    That will get all of your databases, not just content databases, and it will give you the numbers in megabytes, which is much easier to read than bytes.

    tk

    Monday, March 29, 2010 8:28 PM
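  • The pipeline above sorts by DiskSizeRequired and converts bytes to megabytes. The same transformation in Python, with made-up sample databases standing in for Get-SPDatabase output:

    ```python
    # (name, size-in-bytes) pairs standing in for Get-SPDatabase objects
    databases = [
        ("Search_Service_DB", 2 * 1024**3),
        ("WSS_Content", 25 * 1024**3),
        ("SharePoint_Config", 6 * 1024**3),
    ]

    # Sort descending by size and report megabytes, like the format-table expression
    report = [
        (name, size / 1024 / 1024)
        for name, size in sorted(databases, key=lambda d: d[1], reverse=True)
    ]
    for name, mb in report:
        print(f"{name}: {mb:.0f} MB")   # largest database first
    ```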
  • I have checked the content databases; they are not too large (around 20-30 GB). But the Config database, SharePoint_Config, is very large (560 GB). Is that normal?
    Tuesday, March 30, 2010 1:26 PM
  • 560 GB is awfully large; 560 MB would not be out of line. If it truly is 560 GB, I'd check which tables are large with a "Disk Usage by Top Tables" report. Also, how did you check the Config database size, with PowerShell or through SQL? They give different numbers: some include white space, some include transaction logs, etc.

    tk

    Tuesday, March 30, 2010 3:21 PM
  • Todd, is it even possible for the CONFIG database to be that large in any normal scenario? How could that be? It's not even a content database, and his content is only 20-30 GB per CDB. This sounds like a big problem...
    SharePoint Architect || My Blog
    Tuesday, March 30, 2010 3:28 PM
  • I won't say impossible, but I would certainly say improbable. That's why I asked for clarification. :) I have an SP2010 Config DB that's currently 1.2 GB, but that's a far cry from 560 GB.

    tk

    Tuesday, March 30, 2010 4:38 PM
  • Config database:

    With PowerShell, per your suggestion (Get-SPDatabase | sort-object disksizerequired -desc | format-table Name, @{Label ="Size in MB"; Expression = {$_.disksizerequired/1024/1024}}): 570218 MB

    With SQL Server Management Studio: 6001 MB

    Backup through Central Administration fails with an error: not enough space, it needs 641.57 GB.

    Backup from the command line without the disk-space check (stsadm -o backup -Directory \\xxxx -backupmethod full -force) works, but with an error about the Search service (I currently have a problem with search). The log files total only 365 MB.

    Why does the report say it needs 641 GB?

    Wednesday, March 31, 2010 4:33 PM