Monday, January 28, 2013 6:43 AM
I am trying to provision SQL Server for Data Sync to sync a 12GB table to Azure, using Sync Framework v2.1. When I set up and synced smaller tables, the framework worked like a charm. However, when I try to sync this one huge 12GB table, it fails on setup with the following error. I tried shrinking the log file and then increasing its limit to 20GB, but that still does not fix the issue. Also, the setup alone has been running for almost 30 minutes. I understand that the sync itself would take time for a large table, but why the setup process? Is that expected? Any pointers / suggestions are hugely appreciated.
Provisioning SQL Server for sync 1/27/2013 10:02:34 PM
System.Data.SqlClient.SqlException (0x80131904): The transaction log for database 'ICMODS' is full due to 'ACTIVE_TRANSACTION'.
at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection)
at System.Data.SqlClient.TdsParser.Run(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, Tds
at System.Data.SqlClient.SqlCommand.RunExecuteNonQueryTds(String methodName, Boolean async)
at System.Data.SqlClient.SqlCommand.InternalExecuteNonQuery(DbAsyncResult result, String methodName, Boolean sendToPipe)
at Microsoft.Synchronization.Data.SqlServer.SqlSyncTrackingTableHelper.PopulateFromBaseTable(SqlTransaction trans)
at Microsoft.Synchronization.Data.SqlServer.SqlSyncTableProvisioning.Apply(SqlTransaction trans)
at Microsoft.Synchronization.Data.SqlServer.SqlSyncScopeProvisioning.ApplyScope(SqlConnection connection)
at Microsoft.Synchronization.Data.SqlServer.SqlSyncScopeProvisioning.ApplyInternal(SqlConnection connection)
- Edited by CalvinTS Monday, January 28, 2013 7:04 AM
Monday, January 28, 2013 12:50 PM (Moderator)
The provisioning process will populate the tracking tables (the _tracking tables) for each table in the scope. If you have a million rows in a table, there will be a corresponding million rows in its tracking table. A 12GB table will generate a lot of transaction log activity, and you might have to set the transaction log file size higher.
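To illustrate why the log fills up: the stack trace shows the failure inside SqlSyncTrackingTableHelper.PopulateFromBaseTable, which runs inside a single SqlTransaction. Conceptually, that step behaves roughly like the statement below (the table and column names here are hypothetical, not the framework's actual DDL):

```sql
-- Illustrative sketch only: provisioning copies one row per base-table row
-- into the _tracking table, inside one transaction. Every insert is fully
-- logged, and the log cannot be truncated until the transaction commits,
-- which is why a 12GB base table can exhaust a 20GB log.
INSERT INTO dbo.BigTable_tracking (Id)
SELECT Id FROM dbo.BigTable;
```

Because it is one ACTIVE_TRANSACTION spanning the whole table, shrinking the log beforehand actually works against you; the log needs room for the entire population step at once.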
- Marked As Answer by CalvinTS Monday, January 28, 2013 6:14 PM
Monday, January 28, 2013 6:13 PM
Thanks June. However, what I find hard to understand is that the time it takes to set up about 20 tables totaling 2GB was around 6 to 7 minutes. So I would expect a single 12GB table to take about 40 minutes at most. But that does not seem to be the case; the setup does not complete even after an hour. Any thoughts?
I have a follow-up question on the same topic concerning the use of the Filter clause, but I will open a new thread for that. Thanks again.
- Edited by CalvinTS Monday, January 28, 2013 8:37 PM
Tuesday, January 29, 2013 1:29 AM (Moderator)
That's not necessarily linear. Your 20 smaller tables would have required less memory/IO and less transaction log activity than a single 12GB table, which needs to be loaded into memory, inserted into the tracking table, and have the corresponding log activity. The 12GB table would have required more locks as well.
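One practical mitigation is to pre-grow the log before provisioning rather than letting it autogrow mid-transaction. A sketch, assuming the database name from the error message; the logical log file name (ICMODS_log here) is an assumption, so check sys.database_files for the actual name first:

```sql
-- Check current log usage across databases.
DBCC SQLPERF(LOGSPACE);

-- Find the actual logical log file name and size for the database.
SELECT name, size * 8 / 1024 AS size_mb
FROM ICMODS.sys.database_files
WHERE type_desc = 'LOG';

-- Pre-grow the log so the provisioning transaction has room to complete.
-- The file name below is illustrative; substitute the name returned above.
ALTER DATABASE ICMODS
MODIFY FILE (NAME = ICMODS_log, SIZE = 30GB);
```

Pre-sizing also avoids the repeated autogrow pauses during the long-running transaction, which may account for part of the non-linear slowdown described above.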
Wednesday, January 30, 2013 6:03 PM
Thanks June, for the clarification.