ADO.NET, DataReaders and VB.NET Speed Issues

  • Question

  • Hi,

    I'm currently writing a standard 3-tier (GUI / business layer / data layer) WinForms application in Visual Studio 2008.

    I have a form which has to load around 20 different sets of unrelated data when it opens. My data layer returns a separate data reader to the business layer for each set, and each routine uses its own SQL connection (it opens the connection, gets a data reader from it, and then closes it).
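
    To give an idea, each of my data-layer routines looks roughly like this (simplified; the class, table, and column names are made up):

        Imports System.Data
        Imports System.Data.SqlClient

        Public Class CustomerData
            ' Typical routine: open a connection, run the query, hand the reader back.
            ' CommandBehavior.CloseConnection closes the underlying connection when
            ' the caller closes the reader, so each of the 20 loads gets its own
            ' open/close cycle.
            Public Shared Function GetCustomers(ByVal connStr As String) As SqlDataReader
                Dim conn As New SqlConnection(connStr)
                Dim cmd As New SqlCommand("SELECT CustomerID, Name FROM Customers", conn)
                conn.Open()
                Return cmd.ExecuteReader(CommandBehavior.CloseConnection)
            End Function
        End Class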

    The problem I have is the speed at which the data is retrieved. For a relatively small amount of data (around 100 rows combined) it's taking roughly 10 seconds to retrieve it, parse it into separate collections, and paint it to the screen.

    My form uses controls from Infragistics and ComponentOne and is quite heavy anyway. I'm using SQL Server 2000.

    My question is: is the fact that I'm opening and closing the database 20 times causing the speed issue, and if so, is a global connection object the way to go? I've read up on the web and the consensus seems to be that it's a bad thing.

    Any help would be greatly appreciated.

    Cheers

    Rob


    Thursday, February 28, 2008 9:53 AM

All replies

  • I suggest enabling connection pooling.  This gives you the equivalent of a global connection object without the inherent issues that arise when multiple queries try to execute at the same time.  Out of curiosity, are you running all these queries serially, or in parallel?  Threading the query execution, as sketched below, could improve performance a little bit more.
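
    For example, something along these lines (a rough sketch only; the connection string and query list are made up, and you'd want to marshal the parsed results back to the UI thread before touching any controls):

        Imports System.Collections.Generic
        Imports System.Data.SqlClient
        Imports System.Threading

        Module ParallelLoad
            ' Identical connection strings share a single pool; pooling is on by default.
            Private Const ConnStr As String = _
                "Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=True"

            Private pending As Integer
            Private ReadOnly allDone As New ManualResetEvent(False)

            ' Queue each query onto a ThreadPool thread (not re-entrant; sketch only).
            Public Sub LoadAll(ByVal queries As List(Of String))
                If queries.Count = 0 Then Return
                pending = queries.Count
                For Each sql As String In queries
                    ThreadPool.QueueUserWorkItem(AddressOf RunQuery, sql)
                Next
                allDone.WaitOne()   ' blocks the caller; avoid this on the UI thread
            End Sub

            Private Sub RunQuery(ByVal state As Object)
                Using conn As New SqlConnection(ConnStr)
                    Using cmd As New SqlCommand(CStr(state), conn)
                        conn.Open()   ' draws a connection from the pool
                        Using reader As SqlDataReader = cmd.ExecuteReader()
                            While reader.Read()
                                ' parse the row into its collection here
                            End While
                        End Using
                    End Using
                End Using
                If Interlocked.Decrement(pending) = 0 Then allDone.Set()
            End Sub
        End Module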

    Thanks,

    John
    Thursday, February 28, 2008 7:09 PM
  • You definitely want connection pooling, though it should actually be enabled by default. Just be sure that you're using the exact same connection string for every connection.

     

    Are all of these queries being issued around the same time? If so, you might be forcing the system to create multiple 'physical' connections to the database, rather than drawing them from the pool. That would explain the performance problem. The solution would be to issue a single batched SQL command on a single connection, rather than separate queries, though that might be challenging depending on your application's architecture.
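
    For example (table names invented), using NextResult to step through each result set returned by the batch:

        Imports System.Data.SqlClient

        Module BatchLoad
            ' One round trip, several result sets.
            Public Sub LoadBatched(ByVal connStr As String)
                Const Sql As String = _
                    "SELECT * FROM Customers; " & _
                    "SELECT * FROM Orders; " & _
                    "SELECT * FROM Products;"

                Using conn As New SqlConnection(connStr)
                    Using cmd As New SqlCommand(Sql, conn)
                        conn.Open()
                        Using reader As SqlDataReader = cmd.ExecuteReader()
                            Do
                                While reader.Read()
                                    ' populate the collection for the current result set
                                End While
                            Loop While reader.NextResult()   ' move to the next SELECT
                        End Using
                    End Using
                End Using
            End Sub
        End Module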

     

    If you're using SQL 2005 or later as your back-end, you might also be able to switch to using multiple active result sets (MARS) with a single connection.
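
    Something along these lines (again, names invented; the key part is the MultipleActiveResultSets keyword in the connection string):

        Imports System.Data.SqlClient

        Module MarsLoad
            ' Requires SQL Server 2005 or later and the MARS connection string keyword.
            Public Sub LoadWithMars()
                Dim connStr As String = _
                    "Data Source=MyServer;Initial Catalog=MyDb;" & _
                    "Integrated Security=True;MultipleActiveResultSets=True"

                Using conn As New SqlConnection(connStr)
                    conn.Open()
                    Dim cmd1 As New SqlCommand("SELECT * FROM Customers", conn)
                    Dim cmd2 As New SqlCommand("SELECT * FROM Orders", conn)

                    ' With MARS, both readers can be open on the same connection at once.
                    Using r1 As SqlDataReader = cmd1.ExecuteReader()
                        Using r2 As SqlDataReader = cmd2.ExecuteReader()
                            ' interleave reads from r1 and r2 as needed
                        End Using
                    End Using
                End Using
            End Sub
        End Module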

     

    - Dave

    Monday, March 3, 2008 7:49 PM
  • I nailed the performance issue down to the fact that this data was essentially being loaded across three layers: the database passing data up to the business logic layer, which passed it up to the GUI.

    I went down the path of making the business layer a logical (rather than physical) layer within the GUI layer. This had a massive positive impact, and I'm now getting the speed I require.

    Thank you for your comments though - gratefully received.
    Sunday, April 6, 2008 1:23 PM