PROBLEM: ASP.NET/C#/SQL Server 2000 database (6 million records)

  • Question

  • Hi Everybody,

     

    I designed a search page in ASP.Net/C# that allows the user to search on any combination of columns in a sql server database table. 

    The issue I am having is with speed (obviously). I put a limit of 50,000 on the rows that can be returned by querying for a row count before retrieving the data. That way, if the row count is over 50,000, the page displays a message telling the user to refine their search criteria before it queries the stored procedure for the actual rows. Even so, the search page still takes 30 seconds to render when 49,712 rows are returned.
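    A minimal sketch of that count-first guard, assuming hypothetical stored procedure names (`SearchCount`, `SearchResults`) and parameters; adjust to match the real schema:

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    const int MaxRows = 50000;

    using (SqlConnection conn = new SqlConnection(connectionString))
    {
        conn.Open();

        // First, a cheap COUNT(*) using the same WHERE clause as the search.
        SqlCommand countCmd = new SqlCommand("SearchCount", conn);
        countCmd.CommandType = CommandType.StoredProcedure;
        countCmd.Parameters.AddWithValue("@LastName", lastName);

        int rowCount = (int)countCmd.ExecuteScalar();
        if (rowCount > MaxRows)
        {
            messageLabel.Text = "Your search returned " + rowCount +
                " rows. Please refine your criteria.";
            return;
        }

        // Only now run the expensive query that returns the actual rows.
        SqlCommand dataCmd = new SqlCommand("SearchResults", conn);
        dataCmd.CommandType = CommandType.StoredProcedure;
        dataCmd.Parameters.AddWithValue("@LastName", lastName);
        // ... execute and bind the results to the page ...
    }
    ```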

     

    The users would like all of the results returned on one page (it is one huge web page), and they need to be able to sort the results.

     

    By running SQL Profiler I was able to determine that the query itself isn't taking much time at all. I know that part of the problem is that I am using a DataSet and Repeater when I should be using a SqlDataReader.
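    For what it's worth, a sketch of streaming rows with a SqlDataReader instead of filling a DataSet and binding a Repeater (the procedure name and column are assumptions):

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    using (SqlConnection conn = new SqlConnection(connectionString))
    using (SqlCommand cmd = new SqlCommand("SearchResults", conn))
    {
        cmd.CommandType = CommandType.StoredProcedure;
        conn.Open();

        // CommandBehavior.CloseConnection closes the connection when the
        // reader is disposed. The reader streams rows forward-only instead
        // of buffering the whole result set in a DataSet first.
        using (SqlDataReader reader =
            cmd.ExecuteReader(CommandBehavior.CloseConnection))
        {
            while (reader.Read())
            {
                // Write markup directly rather than databinding a Repeater.
                Response.Write("<tr><td>");
                Response.Write(Server.HtmlEncode(reader["LastName"].ToString()));
                Response.Write("</td></tr>");
            }
        }
    }
    ```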

     

    Another part of the problem is the server itself which I can't do anything about. 

    Also, the program has around 100 users accessing it all day every workday.

     

    If anybody has any suggestions, I would love to hear them. 

    I have never worked with a database this large. 

    Any help would be appreciated...

     

    Thanks!

    Thursday, August 30, 2007 8:42 PM

Answers

  • Another approach I have used is to "fake it". That is, fetch two or three forms' (or pages') worth of rows from the rowset and display the contents. While the user is staring at the result, you quietly fetch another couple of pages ahead of the last page. As they move forward, you continue to requery and fetch more rows ahead of them. This could be called the "lay track as you get there" approach, which is easy to implement until they click the "last row" button--in which case you reverse the logic and get the TOP N with a descending sort to fetch the last N rows. This is very easy to do with Windows Forms applications but can be done with ASP as well--my expertise lies with the former.
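    On SQL Server 2000 (which has no `ROW_NUMBER()`), the forward-paging and "last row" parts of this approach can be done with TOP and a key filter. Table and column names here are assumptions:

    ```sql
    -- Fetch the next page: rows after the last key the client has seen.
    SELECT TOP 50 *
    FROM SearchResults
    WHERE RecordId > @LastSeenId   -- key of the last row on the current page
    ORDER BY RecordId;

    -- The "last row" button: reverse the sort to grab the final page,
    -- then re-sort that small result ascending before displaying it.
    SELECT TOP 50 *
    FROM SearchResults
    ORDER BY RecordId DESC;
    ```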

     

    Yes, this might be a technical problem with the mechanism to build the DataTable or the HTML (which will take quite a bit of time), but if this was working at 100% efficiency, I still think there would be serious performance problems--especially as you scale.

     

    I have also worked with large corporate and government organizations that seem to be open for output only (they only know how to talk, not listen). For the most part it has taken personal courage and financial independence to get my point across. Sure, not all employees have the ability to tell the boss that it can't be done. Over the years, I found that I had to educate the management/architect/decision-maker to help him or her understand the options and consequences. This has taken months to years. All too often, it's not the volume of data that's important--it's the intelligence of the answer that matters. SQL Server and your application can deliver both in varying degrees while you balance performance and usability.

     

    I've also found that many organizations don't listen to their in-house developers or staff (no matter how brilliant), but they do listen (all too frequently) to outside consultants brought in to rubber-stamp their design. What they need, IMHO, is an outsider to evaluate the design on a dispassionate, objective basis and tell them that their design (in which they have a heavy personal or career investment) is 100% on target, sucks (that's a technical term), or somewhere in between. I and others are here to provide that second opinion if you want to call on us.

     

    hth

    • Marked as answer by VKenner Wednesday, December 21, 2011 2:19 PM
    Thursday, September 6, 2007 8:14 PM
    Moderator
  • I wanted to reply to everybody that responded to this thread.  It's been a while but I eventually convinced everybody that this was just due to the number of records and the rendering of the page.  They came to accept the load time for that application.  After that, they requested that I build another application very similar to the first but with different data.  This time, I persuaded them to page their results and all is well.  Thanks everybody.
    • Marked as answer by VKenner Wednesday, December 21, 2011 2:19 PM
    Friday, June 11, 2010 4:26 PM

All replies

  • Can you elaborate (or get the business to elaborate) on why the end user might want to view and sort 50k records?  At first blush, it does not seem reasonable that a user would do anything meaningful with 50k records on one monolithic page.

     

     

    Thursday, August 30, 2007 10:06 PM
  • I heartily concur. If you're passing rows to the client for processing, I would consider pre-processing rows on the server and sending an "intelligent" product to the user. Yes, I expect much of your time is spent sending the rowset to the IIS server which has to populate the DataTable (I assume you're using a Fill or Load). Yes, I would approach the problem by requiring a far smaller maximum number of rows to process.
    Friday, August 31, 2007 4:33 AM
    Moderator
  •  

    Well, I work for the government.  I can't say what information the search is retrieving.  I agree with the two of you 100% and have told the users of the program that it shouldn't display everything at once.  Yes, I understand the fact that it is not practical to return 50k rows as well. I agree with you.  However, this is the way they want it and is the lowest I could get them to go for a row cap.  Unfortunately, I really have no say in the matter.  I just build it.  I do not see any way to make this any faster since it is the building of the table that takes so long.

     

    Thanks for replying!  Any other suggestions are still welcome...

    Tuesday, September 4, 2007 1:20 PM
  • One thing you might do is look at how the HTML is rendering.  If this data is being rendered into a huge html table then that could be a major part of the problem.  I have seen IE have problems with attempting to render large tables.

     

    Given that, you might look at rendering the data with a series of divs, or perhaps a series of smaller tables.
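    For example, one way to break the output into smaller tables is to close and reopen the `<table>` every N rows while writing the markup; the chunk size here is arbitrary, and `reader` is assumed to be an open SqlDataReader:

    ```csharp
    const int ChunkSize = 500;  // arbitrary; tune for the browser
    int row = 0;

    Response.Write("<table>");
    while (reader.Read())
    {
        if (row > 0 && row % ChunkSize == 0)
        {
            // Close the current table and start a new one so the browser
            // can lay out and paint each chunk independently.
            Response.Write("</table><table>");
        }
        Response.Write("<tr><td>");
        Response.Write(Server.HtmlEncode(reader["LastName"].ToString()));
        Response.Write("</td></tr>");
        row++;
    }
    Response.Write("</table>");
    ```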

     

     

    Tuesday, September 4, 2007 3:55 PM