.NET n-tier performance

  • Question

  • User1359245325 posted

    Trying to find the right forum to get an answer for this question...

It is for a .NET application, with an ASP.NET web service and/or an ASP.NET web site, and/or a Silverlight application consuming a web service. It talks to a SQL Server 2005/2008 database.

I have a requirement for the following:

2,000 application users

approximately 500 concurrent users

15,000 website hits a month, and

50,000 queries via the website a month (data retrieval from the SQL database).

Using an n-tier deployment, what would be required? Will IIS 6 handle the hits, or will I need a load balancer and a second/third server?

    Thursday, March 11, 2010 1:39 AM

Answers

  • User-967169866 posted

Those requirements are very low.  I'd go so far as to say one database server and one web server could handle that without a problem.  If your application has an uptime requirement, running two web servers for redundancy would be a good idea.

15k hits a month is really low.  That's approximately 500 hits a day, and a single server could handle that many in a couple of minutes.  50k queries a month is also nothing to be concerned about.
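As a back-of-the-envelope check of the figures above (a sketch; these are simple averages, not peak rates):

```python
# Rough average request rates implied by the stated monthly traffic.
hits_per_month = 15_000
queries_per_month = 50_000
days_per_month = 30                                   # assumed for the estimate

hits_per_day = hits_per_month / days_per_month        # 500 hits/day
queries_per_day = queries_per_month / days_per_month  # ~1667 queries/day

seconds_per_day = 24 * 60 * 60
avg_hits_per_second = hits_per_day / seconds_per_day  # well under 1 request/s

print(f"{hits_per_day:.0f} hits/day, {avg_hits_per_second:.4f} hits/s average")
```

Even if all of that traffic arrived inside a single 8-hour working window, the average would still be a small fraction of one request per second, which is why a single IIS box is more than sufficient at this scale.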

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Thursday, March 11, 2010 5:16 AM
  • User-952121411 posted

When it comes to infrastructure concerns and metrics regarding web or SQL Server capabilities, I tend to look to the TechNet forums for guidance.  There are a lot of hardware guys on those forums who can give more accurate answers to these types of questions.  Check it out here:

Microsoft TechNet Forums:

http://social.technet.microsoft.com/Forums/en/categories/

You can get a lot of good information there for determining needs based on your numbers, processors, memory, etc., with regard to clustering SQL Server, web farms, load balancing, using a single server, and so on.

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Friday, March 26, 2010 4:23 PM

All replies

  • User1359245325 posted

Hmm, thank you for your answer.

An uptime guarantee is a must, and these servers host a lot of mission-critical "stuff".

Any idea what realistic limits would be on 1 web server + 1 DB server, as far as users, hits, concurrent users, queries, etc.?

    Thursday, March 11, 2010 6:17 PM
  • User-525215917 posted

    approximately 500 concurrent users

    15,000 website hits a month

Do you really mean that these 500 users come to your site at exactly the same time every day? Your numbers also imply that each user makes only one visit per day. Are these numbers accurate at all?

    Friday, March 26, 2010 11:05 PM
  • User1359245325 posted

Yes, they get there at 9 AM and access the application at the exact same millisecond. :)

8 AM: 10 users using the system; 9 AM: 100 users; 10 AM: 400 users; 2 PM: 500 users. It's a web application, and people are doing "things", i.e. concurrent users. It's an intranet app, not a publicly open system.

The numbers are accurately low. :)

PS. I'm just trying to respond to an RFP requirement. I copy-pasted their requirements and need to justify how my WCF n-tier solution will support them.
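A rough way to turn these concurrency figures into server load (a sketch; the 30-second think time is an assumed value, not one from this thread) is Little's law: requests per second ≈ concurrent users / average think time between requests.

```python
# Little's law sketch: N concurrent users each issuing one request every
# `think_time_s` seconds generate N / think_time_s requests per second.
concurrent_users = 500
think_time_s = 30.0   # assumed: seconds a user spends reading/typing per request

requests_per_second = concurrent_users / think_time_s
print(f"{requests_per_second:.1f} requests/second at peak")
```

Roughly 17 requests per second at peak is comfortably within what a single IIS and SQL Server pair can serve for typical intranet queries.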

    Saturday, March 27, 2010 12:27 AM
  • User-525215917 posted

What kind of data will they access? Is there a lot of common data shown to all users, or does each user see very different data?

    Saturday, March 27, 2010 5:02 AM
  • User1359245325 posted

It's a security application, i.e. for law enforcement:

record search, data entry, syncing data off devices, report generation, data automation (transmitting XML records). Record search will probably be limited to the top 100 results.

    Saturday, March 27, 2010 6:25 AM
  • User-525215917 posted

If there is output that changes slowly, you may consider preparing and caching it before users come online. On the service side I suggest you use some data caching mechanism; this way you keep the load on the server lower. I mainly use NHibernate as an O/R mapper, and if it must work under heavy fire I enable the second-level cache. Of course, in these scenarios you need SQL optimization too.

The other thing to consider is output caching. If there are user controls with static content, or ones presenting data that does not have to be up to date every second, you can cache their output. If users have no problem with data that is 5 or 10 or 15 minutes old, then just cache the output of those controls for an appropriate time range.

When you are developing your database, I suggest you monitor it and perform regression and performance tests to find all the major bottlenecks. If your database is highly optimized, you have far fewer problems to worry about.

These are my first suggestions.
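The data-caching idea above can be sketched language-agnostically as a time-to-live cache in front of the database. In a real ASP.NET app this role would be played by the ASP.NET cache or NHibernate's second-level cache; this standalone Python sketch only illustrates the mechanism:

```python
import time

class TtlCache:
    """Minimal time-to-live cache: serve a stored value until it expires,
    then fall through to the expensive loader and re-store the result."""

    def __init__(self, ttl_seconds):
        self.ttl = ttl_seconds
        self.store = {}      # key -> (value, expiry_timestamp)
        self.db_hits = 0     # counts how often we had to hit the "database"

    def get(self, key, loader):
        entry = self.store.get(key)
        now = time.monotonic()
        if entry is not None and entry[1] > now:
            return entry[0]                  # still fresh: no DB round trip
        self.db_hits += 1
        value = loader(key)                  # expensive call, e.g. a SQL query
        self.store[key] = (value, now + self.ttl)
        return value

cache = TtlCache(ttl_seconds=300)            # 5-minute staleness budget
for _ in range(1000):                        # 1000 requests for the same report
    cache.get("daily_report", loader=lambda k: "report-data")
print(cache.db_hits)                         # prints 1: only the first request hit the DB
```

This is why output caching pays off so well: if 500 users request the same slowly-changing page within its 5-minute cache window, the database sees one query instead of 500.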

    Saturday, March 27, 2010 6:48 AM
  • User1359245325 posted

I am looking for a technical answer. The first response I got was "15k hits a month is really low.  That's approximately 500 hits a day, and a single server could handle that many in a couple minutes.  50k queries a month is also nothing to be concerned about." I need more information in that context, as to "why" the server would be able to handle the load. 15k hits is low, but I need to back that claim up with white papers / case studies.

Since this is a response to an RFP, the claims in the answers must be justified and described. Use of an O/R mapper, caching, etc. can relate to how you intend to develop the system/software, but the question being asked has more to do with the basic n-tier setup, as in "why" I know or think that 15k hits a month is low and that 500 concurrent users/queries will not kill a SQL Server.

Regression tests, performance tests, etc. are great in a self-deployed real-world app. In a secure, private app, they want to know the details before they decide they want your system; testing against their actual data is not possible, and dummy data would need to be extracted from their in-production environment.


    Saturday, March 27, 2010 2:42 PM
  • User-525215917 posted

In this case, make performance tests with dummy data. There are many tools available that can simulate requests from many users at the same time. Use performance counters to monitor the web server, operating system, disk I/O, and database server. Write a simple prototype application to see how the different parts of your system perform. This helps you get more exact estimates, and, of course, you can refer to your tests.

If you need just one answer to "why", then sorry: there is no such answer. All the answers are given in the context of the technical features you are planning to use. There is also no exact answer to a question like "if I have 500 concurrent users, then is an n-tier app the best choice; please prove it to me in numbers?". Nobody knows the amount of data, your database structure, the logic of the business operations, etc. The best shot you have is to try things out and estimate the requirements based on your tests. This is what I think.

Of course, there are different white papers available, but without testing, those white papers are far too general to base a decision on. Okay, if you have a bunch of business, sales and marketing dudes asking for those papers, then it is easy: give them something. One way or another, they don't understand. :)
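The dummy-data load test described above can be sketched minimally as follows (an illustration only; `fake_request` is a hypothetical stand-in for a real HTTP call, and in practice you would point a dedicated load-testing tool at the prototype endpoint):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fake_request(i):
    """Stand-in for one HTTP request to the prototype application."""
    start = time.perf_counter()
    time.sleep(0.01)                 # pretend the server takes ~10 ms to respond
    return time.perf_counter() - start

concurrent_users = 50                # simulated simultaneous users
requests_total = 500

# Fire requests_total requests with concurrent_users in flight at once,
# collecting per-request latencies.
with ThreadPoolExecutor(max_workers=concurrent_users) as pool:
    latencies = sorted(pool.map(fake_request, range(requests_total)))

median = latencies[len(latencies) // 2]
p95 = latencies[int(len(latencies) * 0.95)]
print(f"median {median * 1000:.1f} ms, p95 {p95 * 1000:.1f} ms")
```

Alongside the latency percentiles, watching the performance counters mentioned above (CPU, disk I/O, SQL Server counters) while such a test runs is what turns "the server can handle it" into a number you can put in the RFP response.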

    Saturday, March 27, 2010 2:52 PM
  • User1359245325 posted

I like to call RFPs "fool requirements".

Most of the time, people sitting in back offices, who never will and never have used the system, get together, look at marketing information and say "let's take all of this and make it into an RFP"; and that is what I have in my hand.

So a reference to white papers would be ideal, or even a comparison showing that reading 20 8,000-character fields on SQL Server takes x seconds. I know the data is out there; I will look for it in a bit.

    Saturday, March 27, 2010 3:02 PM
  • User-525215917 posted

Here is the list of SQL Server 2008 white papers, and here are the white papers published under TechNet. Both of these links point to lists of white papers. Take a look; maybe there is something you can use.

    Saturday, March 27, 2010 3:08 PM
  • User1359245325 posted

I browsed those earlier. I found http://www.microsoft.com/casestudies/Case_Study_Detail.aspx?CaseStudyID=4000001487

There is a whole bunch of case studies there; they might hold my copy-paste answers.

    Saturday, March 27, 2010 3:12 PM
  • User-525215917 posted

Case studies can be used as examples showing that yes, this technology or methodology worked for someone in their context. White papers may be stronger arguments because they come directly from the vendor. ;)

    Saturday, March 27, 2010 3:37 PM