Performance Improvements from Using ThreadPool Are Lost on Another Machine

    Question

  • Hi All,

    I took up the task of improving the performance of a multi-threaded system. By switching to a ThreadPool to run the threads, I saw a measurable improvement in the threads' output.

    The development was done on a machine that has 8 GB of RAM and an Intel Core i5-2430M CPU @ 2.40GHz. But when I moved the application to the test server, which has 4 GB of RAM and an Intel Xeon E5405 CPU @ 2.00GHz, I am not able to get that performance improvement.

    I am testing by running the application both with and without the ThreadPool, on both machines.
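    In outline, the two modes being compared look like this (a minimal sketch, not the actual code; RunSubscription is a hypothetical stand-in for the real work item):

```csharp
using System;
using System.Threading;

class Program
{
    // Hypothetical stand-in for the real work item, which runs a stand-alone Exe.
    static void RunSubscription(object id)
    {
        Console.WriteLine("Running subscription {0}", id);
    }

    static void Main()
    {
        bool useThreadPool = true; // a config parameter in the real application

        if (useThreadPool)
        {
            // ThreadPool model: queue the work; the pool decides how many threads run.
            for (int i = 0; i < 4; i++)
                ThreadPool.QueueUserWorkItem(RunSubscription, i);
        }
        else
        {
            // Manual model: one dedicated thread per subscription.
            for (int i = 0; i < 4; i++)
            {
                int id = i; // copy to avoid capturing the loop variable
                var t = new Thread(() => RunSubscription(id));
                t.Start();
                t.Join();
            }
        }

        Thread.Sleep(500); // crude wait for queued items, for this sketch only
    }
}
```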

    What might be the reason?

    TIA

    Tuesday, August 21, 2012 5:17 PM

All replies

  • It depends a lot on the application in question.  If you're memory bound, the lack of RAM might be the problem.  The other machine may have slower memory, and definitely has a slower processor, so depending on what synchronization you're using, there may also be issues involved there.

    What is your "work" doing, how are you testing the ThreadPool, and what are you comparing it to?


    Reed Copsey, Jr. - http://reedcopsey.com
    If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful".

    Tuesday, August 21, 2012 5:38 PM
  • Hi Reed,

    Thanks for your reply.

    Each thread picks up a subscription from a DB table, creates an instance of a stand-alone Exe, and passes the subscription to it to run. The Exe saves its output as a file and exits. After that, in the ThreadPool model the thread returns to the pool, and in the no-ThreadPool model it picks up another subscription to run. Yes, the Exe does some pretty memory-intensive work.

    The comparison here is between running the application with and without the ThreadPool. Originally the application did not use a ThreadPool; I have added that code. A config parameter specifies whether to use the ThreadPool or not.

    The code of the Exe is unchanged. Also, most of the code in the thread procedure is the same (including synchronization), whether the ThreadPool is used or not.
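    A rough sketch of what each thread does per subscription (names like Agent.exe and the 60-second timeout are hypothetical placeholders, not values from the original code):

```csharp
using System;
using System.Diagnostics;

class Worker
{
    // Hypothetical sketch: spawn the stand-alone Exe for one subscription and
    // wait for it to finish (the real Exe writes its output to a file and exits).
    static void RunAgent(string subscriptionId)
    {
        var psi = new ProcessStartInfo
        {
            FileName = "Agent.exe",        // hypothetical Exe name
            Arguments = subscriptionId,
            UseShellExecute = false
        };
        using (var proc = Process.Start(psi))
        {
            // A timeout like this is what separates a "failed" Agent from a slow one.
            if (!proc.WaitForExit(60000))  // 60 s, hypothetical
            {
                proc.Kill();
            }
        }
    }
}
```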

    The comparison was done on the DEV machine, and the ThreadPool option proved better at getting the Exe output. But now, on the test server, the ThreadPool option is not showing any advantage at all. Why would that be?

    TIA

    Tuesday, August 21, 2012 7:38 PM
  • Is there other stuff running on the server?  If so, it may not have enough resources to see the advantages during your testing.

    Otherwise, the slower CPU (on the server) may cause the marginal benefits to shrink, so you're not seeing as big of an advantage.


    Reed Copsey, Jr. - http://reedcopsey.com
    If a post answers your question, please click "Mark As Answer" on that post and "Mark as Helpful".

    Tuesday, August 21, 2012 8:24 PM
  • The primary reason may be that your QA system invokes fewer concurrent threads than your dev system.

    What version of the .NET Framework are you using? The default number of threads in the thread pool for a process depends on several factors, such as the size of the virtual address space and the number of CPUs. You can increase it by using the ThreadPool.SetMaxThreads method.
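    For example, the pool limits can be inspected and raised like this (a minimal sketch using the documented ThreadPool methods; 100 is an arbitrary example value):

```csharp
using System;
using System.Threading;

class PoolLimits
{
    static void Main()
    {
        int workers, io;
        ThreadPool.GetMaxThreads(out workers, out io);
        Console.WriteLine("Max worker threads: {0}, max I/O threads: {1}", workers, io);

        ThreadPool.GetMinThreads(out workers, out io);
        Console.WriteLine("Min worker threads: {0}, min I/O threads: {1}", workers, io);

        // Raise the ceiling; returns false if the requested values are out of range.
        bool ok = ThreadPool.SetMaxThreads(100, 100);
        Console.WriteLine("SetMaxThreads succeeded: {0}", ok);
    }
}
```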

    How many cores or CPUs do you have in each environment?

    AFAIK, the ThreadPool spawns threads using a hill-climbing (HC) algorithm. This technique is well explained here: http://msdn.microsoft.com/en-us/magazine/ff960958.aspx

    If you are using .NET 4.0, you can use the Task Parallel Library instead of the ThreadPool, which benefits from a work-stealing algorithm.

    Are these .exe instances long-running? If yes, the rule is simple:

    • Short and fast: use a ThreadPool thread
    • Long and slow: use a manually created thread
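    For illustration, a minimal TPL sketch on .NET 4.0 (Task.Run is not available until 4.5, so Task.Factory.StartNew is used; the LongRunning hint addresses the long-and-slow case above):

```csharp
using System;
using System.Threading.Tasks;

class TplSketch
{
    static void Main()
    {
        // Tasks run on ThreadPool worker threads and benefit from work stealing.
        var tasks = new Task[4];
        for (int i = 0; i < tasks.Length; i++)
        {
            int id = i; // copy to avoid capturing the loop variable
            tasks[i] = Task.Factory.StartNew(() =>
                Console.WriteLine("Processing subscription {0}", id));
        }
        Task.WaitAll(tasks);

        // For long-running work, hint the scheduler so it does not tie up a
        // regular pool thread:
        var longJob = Task.Factory.StartNew(
            () => { /* spawn and wait on the Exe here */ },
            TaskCreationOptions.LongRunning);
        longJob.Wait();
    }
}
```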

    Lingaraj Mishra

    Wednesday, August 22, 2012 7:54 AM
  • Hi Rkwindows,

    Additionally, the OS is another factor: http://research.microsoft.com/en-us/news/features/070711-barrelfish.aspx

    Best regards,


    Mike Feng
    MSDN Community Support | Feedback to us
    Please remember to mark the replies as answers if they help and unmark them if they provide no help.

    Wednesday, August 22, 2012 2:04 PM
  • Lingaraj, I expect/want fewer threads to be spawned during high-stress periods, to improve the subscription success rate. That's why I chose to implement the ThreadPool model. But that doesn't seem to be working on the test server.

    Mike, the OS on the DEV machine is Windows 7 with SP1, and on the test server it's Windows Server 2008 R2 with SP1.

    Both have the same version of .NET 4.0. Is there anything else that I can look into?

    Thanks again.

    Thursday, August 23, 2012 7:19 PM
  • Would you mind sharing your benchmark results from both the DEV and test environments (both with the ThreadPool enabled)?

    The Intel Core i5-2430M comes with AMD64/EM64T, Dynamic Acceleration/Turbo Boost, and Hyper-Threading, which are not present in the Xeon E5405. Sample CPU benchmark scores are published at http://cpubenchmark.net/cpu_lookup.php?cpu=Intel+Xeon+E5405+%40+2.00GHz

    If the throughput of the test server is 15-20% lower than the dev server's, I believe that is acceptable given the hardware configuration.


    Lingaraj Mishra

    Monday, August 27, 2012 11:37 AM
  • Thanks, Lingaraj. Performance in my case is the count of successful Agents; Agents that fail due to timeouts decrease it. The ThreadPool is supposed to create only the threads that can actually run, thereby preventing timeouts. I see that happening on my machine, but on the Windows 2008 test-server machine it doesn't happen.

    The "performance" wouldn't go down with 15% slower CPU if the ThreadPool behaved as it does on my machine. It would take 15% longer to run the Agents, but the Agents wouldn't fail by timing out.

    Someone told me that different .NET Framework DLLs get used on a server than on a desktop machine. Is that true? Could it be that the server is using a different DLL for the System.Threading namespace code?

    Also, the server is actually a virtual machine with a shared CPU. Would that cause the difference in ThreadPool behavior?

    TIA

    Thursday, September 06, 2012 5:03 PM
  • Hi Rkwindows,

    As far as I know, System.Threading has no such difference.

    >>The "performance" wouldn't go down with 15% slower CPU if the ThreadPool behaved as it does on my machine. It would take 15% longer to run the Agents, but the Agents wouldn't fail by timing out.

    I don't agree with this: your main process only waits a short, fixed period, and that period is the same on both machines, so the slower machine times out while the faster machine completes in time.

    Best regards,


    Mike Feng
    MSDN Community Support | Feedback to us
    Please remember to mark the replies as answers if they help and unmark them if they provide no help.

    Friday, September 07, 2012 7:14 AM
  • Mike,

    I do understand that the 15% slowdown will affect each Agent started by each thread in the ThreadPool. But if the ThreadPool is sensitive to resource availability, it should create fewer threads, thus activating fewer Agents and hence causing fewer timeouts.
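    One way to make that behavior explicit, instead of relying on the pool's per-machine heuristics, is to cap concurrency with a semaphore (a sketch only; the cap of 4 is an assumed tuning knob, not a value from the original code):

```csharp
using System;
using System.Threading;

class Throttle
{
    // Cap concurrent Agents explicitly so the admission rate is the same on
    // every machine, independent of the ThreadPool's own heuristics.
    static readonly Semaphore Gate = new Semaphore(4, 4); // 4 = hypothetical cap

    static void RunAgent(object subscriptionId)
    {
        Gate.WaitOne();
        try
        {
            Console.WriteLine("Agent for subscription {0}", subscriptionId);
        }
        finally
        {
            Gate.Release();
        }
    }

    static void Main()
    {
        for (int i = 0; i < 10; i++)
            ThreadPool.QueueUserWorkItem(RunAgent, i);
        Thread.Sleep(1000); // crude wait, for this sketch only
    }
}
```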

    On my machine I see a 30-35% improvement in performance, so on the test server I should see at least a 15-20% improvement. But I am seeing that both options (with and without the ThreadPool) perform the same. How do we explain that?

    TIA

    Friday, September 07, 2012 2:17 PM
  • From a support perspective this is really beyond what we can do here in the forums. If you cannot determine your answer here or on your own, consider opening a support case with us. Visit this link to see the various support options that are available to better meet your needs:  http://support.microsoft.com/default.aspx?id=fh;en-us;offerprophone.

    Trevor Hancock (Microsoft)
    Please remember to "Mark As Answer" the replies that help.

    Tuesday, September 11, 2012 6:49 PM