Process cannot access the file because it is being used by another process

    Question

  • I have an application that I am triggering to run from a SQL Server. The server basically loops through a queue and triggers the application to write data to a txt file. However, because the queue is so large, I am triggering multiple instances of the same application to write to files. Because of this there is a slim chance (1 in 500) that one instance has the file open when another tries to open it. Thus I get this error:
    The process cannot access the file '\\wfdip\DIP2\PSBlotter\DIP\P&S-_-07-31-2009.TXT' because it is being used by another process.
    Does anyone have any suggestions on how to make my application wait until the file can be opened instead of erroring out?
    • Edited by homeguards Monday, August 03, 2009 2:55 PM
    Monday, August 03, 2009 2:54 PM

Answers

  • Thanks for the quick reply, do you happen to have any example code of what that might look like?  I am unsure of how to catch a specific error.


    Something like this:

    // Requires "using System.IO;" and "using System.Threading;" at the top of the file.
    int waitTime = 500;   // milliseconds to wait between attempts
    int repeats = 5;      // maximum number of attempts

    while (repeats > 0)
    {
        try
        {
            WriteToFile();
            repeats = 0;   // success, so stop retrying
        }
        catch (IOException)
        {
            // The file is still locked by another instance; wait and try again.
            // Note that if every attempt fails, the last exception is swallowed here.
            Thread.Sleep(waitTime);
            repeats--;
        }
    }


    http://blog.voidnish.com
    • Marked as answer by homeguards Monday, August 03, 2009 3:45 PM
    Monday, August 03, 2009 3:34 PM

All replies

  • The simple way is to catch the exception, wait a few milliseconds, then try again, repeating until you get access.

    A more involved but possibly cleaner way is to have some sort of mechanism where there's a shared list of open files (accessible to all instances of the log-writing app). An instance would check this list to see if the file is locked, and if so wait on a handle or subscribe to a custom lock-released event. How you maintain this list is up to you; you could even write a simple WCF service to do that, I guess.
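    As a rough sketch of the "wait on a handle" part, assuming all the instances run on the same machine, you could use a named Mutex as the shared handle (the mutex name and WriteToFile() below are just placeholders for your own code):

    // Requires "using System.Threading;".
    // Every instance opens the same named mutex, so only one of them can hold
    // the file at a time; the others block in WaitOne() until it is released.
    using (Mutex fileLock = new Mutex(false, @"Global\PSBlotterFileLock"))
    {
        fileLock.WaitOne();            // wait until no other instance holds the lock
        try
        {
            WriteToFile();             // your existing open-and-write routine
        }
        finally
        {
            fileLock.ReleaseMutex();   // let the next waiting instance in
        }
    }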
    http://blog.voidnish.com
    Monday, August 03, 2009 3:05 PM
  • Thanks for the quick reply, do you happen to have any example code of what that might look like?  I am unsure of how to catch a specific error.
    Monday, August 03, 2009 3:25 PM
  • Hi,

    One way is to catch the IOException and start a FileSystemWatcher with the required notify filters and your file name as the filter for the watcher.

    For example, you can subscribe to the Changed event, call your normal function (which opens and writes to the file) in the event handler, and then unsubscribe from the event.
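    A rough sketch of that (the field and method names here are just placeholders, and WriteToFile() stands for your existing open-and-write routine):

    // Requires "using System.IO;".
    private FileSystemWatcher watcher;

    private void TryWrite(string path)
    {
        try
        {
            WriteToFile();
        }
        catch (IOException)
        {
            // The file is locked, so watch it and retry when the other process writes to it.
            watcher = new FileSystemWatcher(Path.GetDirectoryName(path), Path.GetFileName(path));
            watcher.NotifyFilter = NotifyFilters.LastWrite;
            watcher.Changed += OnChanged;
            watcher.EnableRaisingEvents = true;
        }
    }

    private void OnChanged(object sender, FileSystemEventArgs e)
    {
        watcher.Changed -= OnChanged;     // unsubscribe so the handler runs only once
        watcher.EnableRaisingEvents = false;
        // The file may still be locked at this instant, so you may want to
        // combine this with the retry loop suggested earlier in the thread.
        WriteToFile();
    }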

    Regards,
    Vinil;
    Monday, August 03, 2009 3:26 PM
  • On the left side you have a large number of items.

    In the middle, a number of application instances.

    On the right you have a single resource (one file) that has to be written to by the left, but you're delegating to the middle and wanting to synchronize access to the right.

    If you provide a synchronization mechanism for the app instances to write to the file, you have just moved the problem into the app and gained nothing. Most of the mechanisms you might try for achieving that synchronization (like the suggestion above) are likely to result in starved, orphaned processes.

    I think you can get rid of the middle man by using a .NET Stored Procedure to write the queue (I'm not 100% sure about that; I don't know if you can do I/O in a .NET SP).

    You might have to move this queue to another instance of SQL Server (on another machine) which does nothing but process these log requests.

    Alternatively, you might write to multiple files. If you can postpone merging the logs until they actually need to be read, this might work well.
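    A rough sketch of the per-instance-file idea (the directory, naming pattern, and the line variable are only illustrative; the merge runs only when somebody actually needs the combined log):

    // Requires "using System;", "using System.Diagnostics;", "using System.IO;" and "using System.Linq;".
    string dir = @"\\wfdip\DIP2\PSBlotter\DIP";

    // Each instance appends to its own file, so no two processes ever share a handle.
    string instanceFile = Path.Combine(dir,
        string.Format("P&S-{0}.TXT", Process.GetCurrentProcess().Id));
    File.AppendAllText(instanceFile, line + Environment.NewLine);   // 'line' is the data to write

    // Only when the combined log is needed do you merge the pieces on demand.
    string[] merged = Directory.GetFiles(dir, "P&S-*.TXT")
                               .SelectMany(f => File.ReadAllLines(f))
                               .ToArray();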

    Monday, August 03, 2009 3:47 PM