Updating table data - performance issue

  • Question

  • I would like to know the best approach for updating data in a SQL Server database table for the following circumstance.

    We have a legacy system (not running on SQL Server) whose database can NOT be read directly.  We have a "reporting" SQL Server database whose tables mimic those within the legacy system.  This reporting database is effectively "read only" as far as the reporting application is concerned, but obviously needs to have up-to-date data.

    Within the legacy system, if someone makes a change to the data then this is logged. The log file contains:
    • The name of the table affected
    • The type of change (update, insert, delete)
    • The latest values for all the fields
    I can access this log and use it to update my SQL Server database tables.

    However, because the log file contains a series of update, insert and delete commands, it is of course vital that these are run in chronological sequence - I obviously can't run all the inserts followed by all the updates etc as this would lead to corruption of the data.
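    The ordering requirement can be illustrated with a minimal sketch (plain Python, with the table as an in-memory dict keyed by primary key; all names here are hypothetical):

```python
# Replay log entries against an in-memory "table" (a dict keyed by primary key).
# Each entry is (operation, key, row); the log carries the full latest row.
def apply_log(table, entries):
    for op, key, row in entries:
        if op == "insert":
            table[key] = row
        elif op == "update":
            table[key] = row
        elif op == "delete":
            table.pop(key, None)
    return table

log = [
    ("insert", 1, {"name": "a"}),
    ("update", 1, {"name": "b"}),
    ("delete", 1, None),
    ("insert", 1, {"name": "c"}),   # same key re-inserted after the delete
]

# In chronological order, row 1 ends up with name "c".
print(apply_log({}, log))  # {1: {'name': 'c'}}

# Grouped by type (all inserts, then updates, then deletes), row 1 vanishes.
grouped = [e for e in log if e[0] == "insert"] + \
          [e for e in log if e[0] == "update"] + \
          [e for e in log if e[0] == "delete"]
print(apply_log({}, grouped))  # {}
```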

    This means that (as far as I can see) I can't take advantage of single-operation, non-logged bulk operations.

    So, I think that I have two alternative approaches.
    1. Using an ADO.NET command object, I send a series of parameterised command statements, one for every single row in the log file.  This would result in a very large number of slow network calls.
    2. I retrieve a DataTable from the destination database (containing all existing records), apply every row of the log file to it locally, and then make a single (slow) network call to update the database table.  The problem with this is that I don't yet know just how big these datasets might be (probably not too big to load into memory, but that's a guess).
    Are these the only approaches available to me?  What would be the preferred approach?
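    Approach 1 can at least be cheapened by reusing one parameterised command per operation inside a single transaction, which amortises the per-row overhead. A sketch below uses the stdlib sqlite3 module as a stand-in for SQL Server (with ADO.NET the equivalent would be a prepared SqlCommand reused inside a SqlTransaction; the schema and log values are invented for illustration):

```python
import sqlite3

# Stand-in for the destination SQL Server table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")

# Hypothetical log entries, in chronological order.
log = [
    ("insert", 1, "alice"),
    ("insert", 2, "bob"),
    ("update", 2, "bobby"),
    ("delete", 1, None),
]

with conn:  # one transaction for the whole batch
    for op, key, name in log:
        if op == "insert":
            conn.execute("INSERT INTO customer (id, name) VALUES (?, ?)", (key, name))
        elif op == "update":
            conn.execute("UPDATE customer SET name = ? WHERE id = ?", (name, key))
        elif op == "delete":
            conn.execute("DELETE FROM customer WHERE id = ?", (key,))

print(conn.execute("SELECT id, name FROM customer").fetchall())  # [(2, 'bobby')]
```

    This still pays one round trip per row over a real network, but the single transaction and reused statements remove most of the per-call cost.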

    Many thanks in advance


    Wednesday, January 14, 2009 4:05 PM

All replies

  • Consider looking at SqlClient's Table-Valued Parameters.
    Table-valued parameters provide an easy way to marshal multiple rows of data from a client application to SQL Server without requiring multiple round trips or special server-side logic for processing the data.

    You can load the data into a DataTable and pass that DataTable as the parameter value.

    My thought is you can call a stored procedure on the server side which would switch on one of the parameter values to make the appropriate insert/update/delete command.
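    The shape of that idea can be sketched without SQL Server: ship all log rows in one bulk step (here, executemany into a staging table via stdlib sqlite3), then have one server-side pass apply them in log order, switching on the operation column - the role the stored procedure would play with a real table-valued parameter. Table and column names below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("""CREATE TABLE staging (
    seq INTEGER PRIMARY KEY,   -- preserves chronological order
    op TEXT, cust_id INTEGER, name TEXT)""")

# One bulk transfer of the whole log (the TVP analogue).
log = [
    (1, "insert", 1, "alice"),
    (2, "insert", 2, "bob"),
    (3, "delete", 1, None),
    (4, "update", 2, "bobby"),
]
conn.executemany("INSERT INTO staging VALUES (?, ?, ?, ?)", log)

# "Stored procedure": a single pass over the staged rows, in order.
rows = conn.execute("SELECT op, cust_id, name FROM staging ORDER BY seq").fetchall()
for op, cust_id, name in rows:
    if op == "insert":
        conn.execute("INSERT INTO customer VALUES (?, ?)", (cust_id, name))
    elif op == "update":
        conn.execute("UPDATE customer SET name = ? WHERE id = ?", (name, cust_id))
    elif op == "delete":
        conn.execute("DELETE FROM customer WHERE id = ?", (cust_id,))
conn.commit()

print(conn.execute("SELECT id, name FROM customer").fetchall())  # [(2, 'bobby')]
```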

    This posting is provided "AS IS" with no warranties, and confers no rights. Please remember to mark the replies as answers if they help.
    Friday, January 16, 2009 2:32 AM