Multithreading Analysis

  • General discussion

  • Hi all,

        I need some expert help with my scenario and code.

    I am receiving files from external devices, which arrive as XML files in one folder. From this folder I process the data into the database. When a file arrives, a FileSystemWatcher adds it to a generic list. On a timer I loop through this list and check whether each file is locked; if it is not, I process the data and remove the file from the list. The processing runs via the ThreadPool.QueueUserWorkItem method.

    Please analyse the code and tell me what the disadvantages of this style are, if any. The code is still running at my clients' sites, but occasionally I get a deadlock, and sometimes the last XML file is processed before the previous one.

    This is a Windows service, and the code is below:

    protected override void OnStart(string[] args)
        {
          try
          {
              watcher = new FileSystemWatcher(readPath, "*.xml");
              watcher.Created += new FileSystemEventHandler(OnFileCreated);
    
              tmrProcess = new System.Timers.Timer(1000);
              tmrProcess.Elapsed += new System.Timers.ElapsedEventHandler(tmrProcess_Elapsed);
    
    
              //Create Generic List which includes the xml filenames
              lstFiles = new List<String>();
    
              // Begin watching
              watcher.EnableRaisingEvents = true;
              tmrProcess.Enabled = true;
          }
          catch (Exception er)
          {
            EventLog.WriteEntry("service1", er.Message, EventLogEntryType.Error);
            this.Stop();
          }
        }
    
    
        private void OnFileCreated(object sender, FileSystemEventArgs e)
        {
          try
          {
            lstFiles.Add(e.FullPath);
          }
          catch (Exception er)
          {
            EventLog.WriteEntry("Service1", er.Message, EventLogEntryType.Warning);
          }
        }
    
        void tmrProcess_Elapsed(object sender, System.Timers.ElapsedEventArgs e)
        {
          try
          {
            tmrProcess.Enabled = false;
            for (int i = 0; i < lstFiles.Count; i++)
            {
              String fileName = lstFiles[i];
              FileInfo file = new FileInfo(fileName);
              if (!IsFileLocked(file))
              {
                lstFiles.Remove(fileName);
                ThreadPool.QueueUserWorkItem(new WaitCallback(ProcessData), fileName);
              }
            }
    
            tmrProcess.Enabled = true;
          }
          catch (Exception er)
          {
            EventLog.WriteEntry("Service1", er.Message, EventLogEntryType.Information);
    
          }
        }
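
    The IsFileLocked helper called in the timer loop is not shown in the post. A common implementation (a sketch, assuming a FileInfo parameter to match the call site above) attempts to open the file with an exclusive share:

```csharp
using System;
using System.IO;

// Sketch of an IsFileLocked helper (assumed signature, matching the call
// site above): returns true while another process still holds the file.
static bool IsFileLocked(FileInfo file)
{
    try
    {
        using (file.Open(FileMode.Open, FileAccess.Read, FileShare.None))
        {
            return false; // exclusive open succeeded: nobody else has it
        }
    }
    catch (IOException)
    {
        return true; // still being written, or locked by another process
    }
}
```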
    


    In ProcessData I read the data from the file and call a function, say DataTransaction(), which uses a stored procedure to process the data. Sometimes I get a deadlock from the DataTransaction function.

    Wednesday, July 20, 2011 6:58 AM

All replies

  • You would need to post the deadlock information for it to be analysed and resolved.

    My suggestion is to remove the call to QueueUserWorkItem and replace it with a single-threaded implementation, unless you are facing a backlog of processing.

    As you did not post ProcessData, I cannot comment on whether the SQL or the C# code is the bottleneck. If it is the C#, stick with your QueueUserWorkItem implementation and place a mutex around the call to the SQL function within ProcessData.
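
    A minimal sketch of that suggestion, assuming a ProcessData of the shape described in the original post (the DbGate lock object is hypothetical, and the commented DataTransaction call is the poster's function):

```csharp
using System;
using System.Threading;

class Processor
{
    // Hypothetical gate: only one pool thread at a time may run the
    // database work, which removes the parallelism behind SQL deadlocks.
    private static readonly object DbGate = new object();

    public static void ProcessData(object state)
    {
        string fileName = (string)state;
        // ... read and parse the XML file here (safe to do in parallel) ...

        lock (DbGate)
        {
            // DataTransaction(fileName);  // serialized stored-procedure call
        }
    }
}
```

    (A C# lock statement acts as an in-process mutex; a System.Threading.Mutex is only needed when the exclusion must span processes.)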

     

    You also have a race condition where a virus scanner may open the file after it is written to the directory and closed, but this problem is not mentioned in your post.

    Wednesday, January 4, 2012 12:27 PM
  • Thanks, friend. I will post the error message as soon as possible.

    Currently it is running fine, but it sometimes shows a timeout exception from the stored procedure, because the procedure does heavy work: it inserts data into a log table, validates the data, and inserts it into the main table. In the next version I will try to split the stored procedure so that simple logging is handled in this service and the main table insertion is done by a SQL job. What is your suggestion?

     

    ProcessData is C# code only, and I will post it tomorrow. I don't have in-depth knowledge of multithreading, so QueueUserWorkItem suited me best, and it is running successfully now without any major issues (except some rare cases like these timeout and deadlock errors).

    Wednesday, January 4, 2012 4:27 PM
  • If you are getting timeouts in SQL, you will need to increase the command timeout: http://msdn.microsoft.com/en-us/library/system.data.common.dbcommand.commandtimeout(v=VS.100).aspx

    or improve the performance of your stored procedure, or make sure there is no contention for resources.

    If you are deadlocking in your stored procedure (chosen as deadlock victim...), then you need to rework the stored procedure so that you don't get deadlocks. This can be done by controlling the order in which locks are taken, e.g. always lock A, then B, then C, and never C, B, A.
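
    The same ordering rule can be illustrated in C# (a sketch with two hypothetical locks; in the stored procedure the equivalent resources would be the tables it touches):

```csharp
using System;
using System.Threading;

class Accounts
{
    private static readonly object LockA = new object();
    private static readonly object LockB = new object();

    // Every code path acquires LockA before LockB. Two threads can then
    // never each hold one lock while waiting for the other, which is the
    // cycle a deadlock requires.
    public static void UpdateBoth(Action work)
    {
        lock (LockA)
            lock (LockB)
            {
                work();
            }
    }
}
```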

     

    Wednesday, January 4, 2012 9:26 PM