Updating fields from multiple processes simultaneously

  • Question

  • I'm writing a system that will have multiple clients updating statistics fields in a table when processing a file.

    For example, the row in the table will have file_id, records_processed and total_processing_time as some of the fields.

    Each client, on completing its batch of records, will need to add the batch size to the records_processed field, and also add the number of seconds and milliseconds it took to the total_processing_time field.

    Quite a simple task, but I'm concerned about deadlocking, since we could have 40 processes trying to update the same fields at the same time, and we really don't want a process to update the wrong value.  Having a process sit and wait isn't an option either; the update just needs to happen so the process can move on.

    Any ideas on how best to do this?  The updates don't have to be done in real time, so we were considering a message queue and a small processing service to do the updates.  It'd be nice to have something more elegant, though.
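    For reference, the increment described above can be expressed as a single atomic UPDATE statement, so no client ever reads a stale value and writes it back. A minimal sketch using SQLite and the column names from the question (the table name and helper function are illustrative):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE file_stats (
    file_id INTEGER PRIMARY KEY,
    records_processed INTEGER NOT NULL DEFAULT 0,
    total_processing_time REAL NOT NULL DEFAULT 0.0)""")
conn.execute("INSERT INTO file_stats (file_id) VALUES (1)")

def record_batch(conn, file_id, batch_size, seconds):
    # Single-statement read-modify-write: the database applies the
    # increment under its own lock, so two clients can never
    # overwrite each other's additions.
    conn.execute(
        "UPDATE file_stats "
        "SET records_processed = records_processed + ?, "
        "    total_processing_time = total_processing_time + ? "
        "WHERE file_id = ?",
        (batch_size, seconds, file_id))
    conn.commit()

record_batch(conn, 1, 500, 12.5)
record_batch(conn, 1, 250, 6.25)
print(conn.execute(
    "SELECT records_processed, total_processing_time "
    "FROM file_stats WHERE file_id = 1").fetchone())
# -> (750, 18.75)
```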
    Wednesday, October 17, 2007 3:16 PM

All replies

  • A message queue is probably a good idea, since it converts the multi-user task into sequential pipeline execution. But if the data lives inside the database, you need a transaction to make updates safe in a multi-user environment even if you are not using a queue; that way updates to the same data will never happen at the same time from multiple users. When multiple update requests arrive at the server and the row is locked by a transaction, the other requests will wait until they can be applied or until they time out. If the requests are too frequent, this can lead to timed-out updates, so I believe you need to run a load test to see whether the strategy meets your expectations.
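    The sequential-pipeline idea can be sketched as an in-process queue drained by a single writer, so the database only ever sees one updater and clients never block on locks. A sketch in Python with SQLite (table and column names follow the question; the queue payload shape is an assumption):

```python
import queue
import sqlite3
import threading

conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("""CREATE TABLE file_stats (
    file_id INTEGER PRIMARY KEY,
    records_processed INTEGER NOT NULL DEFAULT 0,
    total_processing_time REAL NOT NULL DEFAULT 0.0)""")
conn.execute("INSERT INTO file_stats (file_id) VALUES (1)")

updates = queue.Queue()  # clients enqueue (file_id, batch_size, seconds)

def writer():
    # The only thread that touches the stats table: updates are applied
    # strictly one at a time, so concurrent clients cannot deadlock.
    while True:
        item = updates.get()
        if item is None:          # sentinel value: shut down
            break
        file_id, batch_size, seconds = item
        conn.execute(
            "UPDATE file_stats "
            "SET records_processed = records_processed + ?, "
            "    total_processing_time = total_processing_time + ? "
            "WHERE file_id = ?",
            (batch_size, seconds, file_id))
        conn.commit()

t = threading.Thread(target=writer)
t.start()

# Forty "clients" fire-and-forget their batch results and move on.
for _ in range(40):
    updates.put((1, 100, 1.5))
updates.put(None)
t.join()

print(conn.execute(
    "SELECT records_processed, total_processing_time "
    "FROM file_stats WHERE file_id = 1").fetchone())
# -> (4000, 60.0)
```

    In a real deployment the in-process queue would be replaced by the message queue and small processing service the question mentions; the key property is the same, a single consumer serializing the writes.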

    Thursday, October 18, 2007 10:42 AM