Improving WCF Data Service Performance

  • Question

Hi,

I have a SQL CE DB on the client side and a SQL Server DB on the server side. The schemas of the two are not identical, so I'm using some mappers.

The objective is to update the server DB periodically (once every 10 seconds or so). The client DB is updated frequently.

I have a WCF Data Service hooked up to EF on the server side, hosted in IIS. No stored procedures.

Here is my code:


For Each ClientObj As PosClientModel.Transaction In Changset

   ' Map the client-side transaction to its server-side shape and queue it.
   TranObject = TransactionMapper.GetServerObject(ClientObj)
   PosServerSystem.AddToTransactions(TranObject)

   ' Map and queue the associated detail records.
   TranListObjects = TransactionDetailMapper.GetServerObject(ClientObj)
   For Each TranDetail In TranListObjects
      PosServerSystem.AddToTranDetails(TranDetail)
   Next

Next

' Push all queued inserts to the service.
PosServerSystem.SaveChanges()

The problem is that the inserts are simply too slow for the application: it takes an average of 2-3 seconds to insert a Transaction and its associated Detail records (tested with two Detail records per header).

The Transaction table has 8 columns and the Detail table has 9. No heavy data, just char, numeric, and date fields.

The WCF Data Service is being accessed over the public internet, and the client will probably use asynchronous methods in the future.

But apart from these, are there any other performance improvements that can be considered?


    Sunday, August 22, 2010 8:26 AM

Answers

  • Hi,

    Without knowing where your code spends most of the time, it's impossible to suggest solutions. 2-3 seconds is a lot, so I would probably suspect the network, but you should be able to tell by looking at the server or the client CPU consumption and/or disk IO rates.

In general, if you update/add multiple entities you can use batching, which will send just one request to the service instead of many requests (one for each modified/added entity). To do this, just pass SaveChangesOptions.Batch to the SaveChanges method.
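To make the suggestion concrete, here is a minimal sketch in the thread's own VB, assuming the generated PosServerSystem context from the question:

```vb
' Queue entities with AddToTransactions/AddToTranDetails as before, then:
' all pending inserts are sent to the data service in a single HTTP batch
' request instead of one request per added entity. The enum lives in
' System.Data.Services.Client and is SaveChangesOptions (plural).
PosServerSystem.SaveChanges(SaveChangesOptions.Batch)
```

The batched changes are sent as one changeset that the service processes together, so the per-request network round-trip overhead disappears.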

    Thanks,


    Vitek Karas [MSFT]
    • Marked as answer by azwaanameer Tuesday, August 24, 2010 1:30 PM
    Tuesday, August 24, 2010 12:54 AM
    Moderator

All replies

  • Hi,

    Thanks for your response.

More than 95% of the time is spent in the SaveChanges call. I was under the impression that since I'm calling SaveChanges only once after adding multiple entities in the For loop, this would automatically result in a batch insert.

However, after your response I changed this line to the following:

    PosServerSystem.SaveChanges(SaveChangesOptions.Batch)

And this immediately resulted in much better performance. Now inserting a record takes less than 1 second, and the more records that are batched, the less time each record takes (due to reduced per-request overhead).

For now this is acceptable performance.

    Thanks

Azwaan

Tuesday, August 24, 2010 1:30 PM