Sync Services for ADO.NET Devices (Batching) Problem - Stuck in what appears to be an infinite loop.

    Question

  • Hello,

    I'm trying to implement batching from a WCF service to a Windows Mobile device using custom tracking.  I took the code example from syncguru's web site and tried it, but it seems to be stuck in a loop somehow.  In my program I'm syncing only certain groups of tables at different times throughout the program.  Have I done something wrong, or is there a better example out there?  The SQL procedure and anchor command are below.  Any help would be greatly appreciated.  Thanks.

    CREATE procedure dbo.sp_new_batch_anchor (
            @sync_last_received_anchor timestamp,
            @sync_batch_size int,
            @sync_max_received_anchor timestamp output,
            @sync_new_received_anchor timestamp output,
            @sync_batch_count int output)
    as
        if @sync_batch_size <= 0 or @sync_batch_size is null
            set @sync_batch_size = 1000

        if @sync_max_received_anchor is null
            set @sync_max_received_anchor = @@DBTS -- min_active_rowversion() - 1

        -- simplest form of batching
        if @sync_last_received_anchor is null or @sync_last_received_anchor = 0
        begin
            set @sync_new_received_anchor = @sync_batch_size
            if @sync_batch_count <= 0
                set @sync_batch_count = (@sync_max_received_anchor / @sync_batch_size) + 1
        end
        else
        begin
            set @sync_new_received_anchor = @sync_last_received_anchor + @sync_batch_size
            if @sync_batch_count <= 0
                set @sync_batch_count = (@sync_max_received_anchor / @sync_batch_size) -
                                        (@sync_new_received_anchor / @sync_batch_size) + 1
        end

        -- check if this is the last batch
        if @sync_new_received_anchor >= @sync_max_received_anchor
        begin
            set @sync_new_received_anchor = @sync_max_received_anchor
            if @sync_batch_count <= 0
                set @sync_batch_count = 1
        end
    go

    Dim anchorCmd As New SqlCommand
    anchorCmd.CommandType = CommandType.StoredProcedure
    anchorCmd.CommandText = "sp_new_batch_anchor"
    anchorCmd.Parameters.Add("@" + SyncSession.SyncMaxReceivedAnchor, SqlDbType.Timestamp).Direction = ParameterDirection.InputOutput
    anchorCmd.Parameters.Add("@" + SyncSession.SyncLastReceivedAnchor, SqlDbType.Timestamp)
    anchorCmd.Parameters.Add("@" + SyncSession.SyncBatchSize, SqlDbType.Int)
    anchorCmd.Parameters.Add("@" + SyncSession.SyncNewReceivedAnchor, SqlDbType.Timestamp).Direction = ParameterDirection.Output
    anchorCmd.Parameters.Add("@" + SyncSession.SyncBatchCount, SqlDbType.Int).Direction = ParameterDirection.InputOutput
    Me.SelectNewAnchorCommand = anchorCmd
    Me.BatchSize = 25

    I forgot to include that I also set the batch size, after assigning the anchor command.
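For reference, the first-sync branch of the procedure above reduces to the following arithmetic (all values here are hypothetical, and Python is used purely for illustration; the real procedure only recomputes the count when the incoming @sync_batch_count is <= 0):

```python
def first_sync_anchor(max_anchor, batch_size=None):
    """Mirrors the procedure's first-sync branch (NULL/zero last anchor)."""
    if batch_size is None or batch_size <= 0:
        batch_size = 1000                        # procedure's default
    new_anchor = batch_size                      # first batch window
    batch_count = (max_anchor // batch_size) + 1
    # last-batch clamp from the end of the procedure
    if new_anchor >= max_anchor:
        new_anchor = max_anchor
        batch_count = 1 if batch_count <= 0 else batch_count
    return new_anchor, batch_count

# With Me.BatchSize = 25 and a large database-wide @@DBTS of, say,
# 1_000_000, the very first call already reports 40,001 batches:
print(first_sync_anchor(1_000_000, 25))
```

Note that because the max anchor defaults to @@DBTS, the batch count is sized to the whole database's rowversion range, not to any one table.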

    CEDeveloper


    • Edited by CEDeveloper Wednesday, June 12, 2013 3:48 PM Forgot something
    Wednesday, June 12, 2013 3:45 PM

All replies

  • Forgot to add that this would be the initial sync of 3 tables that consist of fewer than 50 rows of data.  Hopefully that information will be helpful.  Thanks.


    CEDeveloper

    Wednesday, June 12, 2013 4:38 PM
  • Ok.  I did a little more digging and started logging the data that was passed into the anchor command stored procedure; I also passed in the sync_table_name value.  It looks like the synchronization never makes it past the first table.  I looked at the destination database, and the data was pulled down for the first table, but not for the other 2.  It appears the anchor command calculated the batch count based on all of the rows in the database and is trying to process that one table that many times, which is a lot.  My question is: does the synchronization process only one table at a time?  If it does, do I need to program the stored procedure to calculate a batch count based on the number of rows in the table it is currently working on?  Sorry if this is a stupid question; this is my first project involving synchronization.  Thanks for any help.
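To show what I mean, here is the procedure's batch-count formula with hypothetical numbers (Python used only for illustration):

```python
def batches_remaining(max_anchor, last_anchor, batch_size):
    """Batch-count formula from the anchor procedure; the first-sync
    branch is equivalent to calling this with last_anchor = 0."""
    new_anchor = last_anchor + batch_size
    return (max_anchor // batch_size) - (new_anchor // batch_size) + 1

# If the max anchor is the database-wide rowversion (@@DBTS), a small
# 50-row table still inherits a batch count sized to the whole database:
db_wide_max = 400_000                            # hypothetical @@DBTS value
print(batches_remaining(db_wide_max, 0, 25))     # 16000 batches

# A per-table max anchor keeps the count proportional to that table:
table_max = 50                                   # hypothetical per-table high-water mark
print(batches_remaining(table_max, 0, 25))       # 2 batches
```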

       


    CEDeveloper

    Wednesday, June 12, 2013 7:54 PM
  • Yes, one table at a time.
    Thursday, June 13, 2013 12:48 AM
    Moderator
  • Thanks.  I went through and recoded the stored procedure that gets the values for the anchor command for batching, and designed it around the table to determine the max anchor value, new anchor value, and batch count.  This seems to be working well, although now I'm getting a different error: "Unable to read data from the transport connection.  An existing connection was forcibly closed by the remote host."  Do you have any idea what may be causing this issue?  Thanks for your help.


    CEDeveloper

    Thursday, June 13, 2013 7:12 PM
  • Enable WCF tracing so you can see what's causing the error.
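A minimal WCF tracing configuration would look something like the fragment below (the listener path is a placeholder; adjust switchValue to taste), placed in the service's app.config or web.config:

```xml
<!-- Illustrative config fragment: enables WCF activity tracing -->
<system.diagnostics>
  <sources>
    <!-- "System.ServiceModel" is the standard WCF trace source -->
    <source name="System.ServiceModel"
            switchValue="Information, ActivityTracing"
            propagateActivity="true">
      <listeners>
        <!-- Writes traces to a .svclog file viewable in Service Trace Viewer -->
        <add name="traceListener"
             type="System.Diagnostics.XmlWriterTraceListener"
             initializeData="C:\logs\WcfTrace.svclog" />
      </listeners>
    </source>
  </sources>
  <trace autoflush="true" />
</system.diagnostics>
```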
    Friday, June 14, 2013 1:08 AM
    Moderator
  • Ok.  I enabled tracing on both the server and the client, and neither log showed an error.  The only thing I've noticed is that the group of tables being synced when the sync fails takes a parameter.  Looking at the server trace log, I don't see that parameter and its value in the commands it's calling to do the sync.  I checked whether the builder shows the parameter being added before calling ToSyncAdapter(), and it does have it set correctly.  I also checked the client code to make sure I am adding the parameter to the sync agent's configuration before I call Synchronize, and I am.  Could this be the problem, and what could be causing the parameter to not be there?  Thanks for your help.

    CEDeveloper

    Saturday, June 15, 2013 6:18 PM
  • is that a WCF trace or a Sync Framework Trace?
    Monday, June 17, 2013 1:09 AM
    Moderator
  • It was a Sync Framework trace.  I finally figured out that problem: it was the application pool associated with the WCF service.  I removed all of the recycling settings that were set on it; it was apparently recycling during the sync process.

    I do have a question about batching, though.  Where I have filters applied to some of the tables I'm trying to sync, there seem to be a lot of batches being pulled for a table that may produce little or no data.  For example, I'm syncing our customer table but filtering by state.  The way I have the batch anchor command procedure set up, it determines the number of batches from the entire customer table of, let's say, 1000 rows of data.  My batch size is set at 100, so my command says we need 10 batches of data.  With the filter applied for a certain state, the client really only needs 20 rows of data, so my sync provider will go back to the database 10 times to retrieve those 20 rows, when really only 1 trip was needed.  I think that is why the application pool was recycling: the program was making lots of trips to the database because of the small batch size, and the database I'm working with is really large.  Is there a way to code for this, or am I way off in left field?  Thanks for all your help, JuneT.
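The round-trip arithmetic I'm describing, with the hypothetical numbers from the example (Python used only for illustration):

```python
import math

def batch_count(total_rows, batch_size):
    """Number of round trips when the batch count is derived
    from a given row total."""
    return math.ceil(total_rows / batch_size)

# Anchor proc computes batches from the whole 1000-row customer table...
trips_unfiltered = batch_count(1000, 100)    # 10 trips to the database

# ...but the state filter only matches 20 rows, so basing the count
# on the filtered row count would need a single trip:
trips_filtered = batch_count(20, 100)        # 1 trip

print(trips_unfiltered, trips_filtered)
```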

    CEDeveloper



    Monday, June 17, 2013 1:46 PM
  • Put them in separate sync groups.
    Tuesday, June 18, 2013 5:14 AM
    Moderator
  • Hey JuneT.  The tables I'm syncing right now have not been added to any SyncGroup.  I think I have it figured out for the most part now.  I ended up storing the filter clauses in the database, pulled them out based on the table that required a new anchor value, and determined the new anchor value from the rows that would be returned by the filter.  One more question: it is still taking a long time to do this first sync.  It looks like the sync first inserts the rows, but then goes back and performs updates on all of the rows, although the data wouldn't have changed since the initial insert.  Have I configured something wrong, or is that normal?  Thanks for your help.

    CEDeveloper

    Wednesday, June 19, 2013 7:26 PM