dataset scalability

  • Question


    I have been working a bit with ADO.NET in C# (Visual Studio 2005) and have a few questions to ask about best practices for scalability.

     

    I have created a SQL Server database to hold the data for a lead management system (a table each for customers, appointments, and employee comments, linked with a foreign key to the customer table).

     

    The Windows C# application lets the user search the database by last name, lead #, or phone number, and eventually do updates and inserts.

     

    Right now, a DataReader reads the data into a list box.

    The SelectedIndexChanged event handler loads the lead detail into a series of text boxes to display the information; however, the application needs to get that data from somewhere. Should it come from a DataSet, or from an individual SELECT query against the database?

     

    My concern is what happens when this program has to support 30,000 records with 200 simultaneous connections. What can I do to maximize performance? Do I just limit the number of results?

     

    Would a DataGrid be any more advantageous than a list box? (Update/insert functionality is not required in this context.)

     

    What happens if 200 people all start the app at 9:02 AM and each queries the database for 30,000 records (so that 200 copies of 30,000 records are held in memory across the clients)?

     

    I have code snippets, but this is more of a big-picture question rather than one about syntax. Any thoughts or experience would be very helpful.

     

    Thanks,

    Matt

    crash700@comcast.net

    Wednesday, October 3, 2007 2:37 AM

Answers

  • 200 users with 30,000 records is a small application and user load, so if you use clean queries or stored procedures you will not run into issues.  What I mean is: write separate queries for insert, update, delete, and read; use a DataReader for most read-only work and use a DataSet for write operations.  Hope this helps.
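    As a rough sketch of that split, something like the following (the connection string, table, and column names here are assumptions for illustration, not taken from the original post):

```csharp
// Sketch only: "Customers", its columns, and the connection string are
// hypothetical names, not from the original post.
using System;
using System.Data.SqlClient;

class LeadDataAccess
{
    const string ConnStr =
        "Data Source=.;Initial Catalog=Leads;Integrated Security=True";

    // Read path: a forward-only SqlDataReader keeps only one row in memory
    // at a time while the list box is being filled.
    static void FillListBox(string lastName, Action<string> addItem)
    {
        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand(
            "SELECT LeadId, LastName FROM Customers WHERE LastName LIKE @name",
            conn))
        {
            cmd.Parameters.AddWithValue("@name", lastName + "%");
            conn.Open();
            using (SqlDataReader rdr = cmd.ExecuteReader())
                while (rdr.Read())
                    addItem(rdr.GetInt32(0) + " - " + rdr.GetString(1));
        }
    }

    // Write path: a separate, dedicated parameterized command (or a DataSet
    // with a SqlDataAdapter) rather than reusing the read query.
    static void UpdatePhone(int leadId, string phone)
    {
        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand(
            "UPDATE Customers SET Phone = @phone WHERE LeadId = @id", conn))
        {
            cmd.Parameters.AddWithValue("@phone", phone);
            cmd.Parameters.AddWithValue("@id", leadId);
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```

    The point of the split is that the reader holds connections open only briefly and streams rows, while writes are small targeted statements instead of round-tripping whole result sets.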

     

    Thursday, October 4, 2007 3:20 PM
  • I think 30,000 records is way too many to send over the wire at once.  You should plan for scalability from the beginning.

    You can set up a paging scheme using stored procedures to return data based on letter, date, number of records, anything that will reduce each query to a manageable size.

    Here are some articles describing what others have done.

    http://www.asp101.com/articles/gal/effectivepaging/default.asp
    http://www.dotnetjunkies.com/Article/975BE770-E5DC-4610-870B-A82BDB9B8845.dcik
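    A minimal sketch of one such paging query, using SQL Server 2005's ROW_NUMBER() inline for brevity (in practice you would wrap it in a stored procedure as described above; the table, columns, and page size are assumed names for illustration):

```csharp
// Sketch only: "Customers" and its columns are hypothetical names.
// Each request pulls one page of rows instead of all 30,000.
using System.Data.SqlClient;

class PagedLeads
{
    const int PageSize = 50;

    // Builds a command that returns page `pageNumber` (1-based) of the
    // customer list, ordered by last name.
    static SqlCommand BuildPageCommand(SqlConnection conn, int pageNumber)
    {
        SqlCommand cmd = new SqlCommand(
            @"SELECT LeadId, LastName, Phone
              FROM (SELECT LeadId, LastName, Phone,
                           ROW_NUMBER() OVER (ORDER BY LastName) AS RowNum
                    FROM Customers) AS Numbered
              WHERE RowNum BETWEEN @first AND @last", conn);
        cmd.Parameters.AddWithValue("@first", (pageNumber - 1) * PageSize + 1);
        cmd.Parameters.AddWithValue("@last", pageNumber * PageSize);
        return cmd;
    }
}
```

    With a page size of 50, even 200 simultaneous users only move a few kilobytes per request instead of 200 copies of the full table.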
    Thursday, October 4, 2007 5:51 PM
