Friday, May 29, 2009 7:30 AM
Hi all SQL gurus,
We provide educational training to employees of all ages via an intranet application. We currently have one web server (Windows Server 2003) with one SQL Server 2005 instance, and the application itself is classic ASP. The database already receives data from different sources, and there are further complexities in the system. We found that SQL Server is the bottleneck all the time. We recently optimized our application and database, but it is still not satisfactory to the client, as users are increasing day by day and the data grows larger on a daily basis.
We are moving our application to .NET (ASP.NET) with more features. We plan to use a hardware device so that when a user hits the website, it redirects the request among three web servers, like NLB (Network Load Balancing). Each web server will have its own SQL Server, and all three SQL Servers will share data among themselves via peer-to-peer replication (like the Microsoft architecture). I also have to shorten our OLTP (online operational) data by keeping one month in the operational database and moving the rest to another SQL Server on a daily basis, like archiving. We will have reporting options on our website for two different types of users, employees and administration. Each employee should be able to see all of his/her reports by querying both operational and archived data. Administration should be able to see all employees' data from the beginning.
I would need help with the following:
1- First, let me know if this architecture has any drawbacks. If you think it can be further improved, please share.
2- If there is a better solution we can adopt, now is the time to consider it; otherwise we will be too late.
3- When data is archived, how do I write queries that run against both the OLTP and archive databases? On the reporting side, users can enter a date criterion to fetch records for the last N days, e.g. the last 10 days or the last 100 days.
4- Is archiving a good solution at all? I have implemented a stored procedure to move data from the OLTP database to the archive database, but it takes too much time. I was looking at http://msdn.microsoft.com/en-us/library/ms190923(SQL.90).aspx, and it says at the very beginning that
"To bulk-transfer data from one Microsoft SQL Server database to another, data from the source database must first be bulk-exported into a file. The file is then bulk-imported into the destination database"
If someone can give me wise and good direction on my solution/proposal, I will be thankful.
Please respond as soon as you can.
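On question 3 above (querying the OLTP and archive databases together), one common pattern is a UNION ALL view that hides where the rows physically live, so the "last N days" report query stays the same either way. Again a sketch only, with illustrative names:

```sql
-- TrainingDB holds the last month; TrainingArchive holds everything older.
-- Table and column names here are assumptions for illustration.
CREATE VIEW dbo.AllTrainingResults
AS
SELECT EmployeeID, ResultDate, Score
FROM   TrainingDB.dbo.TrainingResults        -- current OLTP data
UNION ALL
SELECT EmployeeID, ResultDate, Score
FROM   TrainingArchive.dbo.TrainingResults;  -- archived data
GO

-- Report query for "last N days", e.g. the last 10 days:
SELECT EmployeeID, ResultDate, Score
FROM   dbo.AllTrainingResults
WHERE  ResultDate >= DATEADD(day, -10, GETDATE());
```

If the archive sits on a separate server, the second branch would use a four-part linked-server name instead; and if each table carries a CHECK constraint on its date range, SQL Server can treat this as a partitioned view and skip the branch that cannot match the date filter.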
Sunday, May 31, 2009 8:13 AM
I suspect your question does not have a simple answer that can come from the information you supplied. A couple of things I would suggest you look at:
· Use SQL Profiler to find out which queries are consuming the most resources. See if you can tune the tables, clustered indexes, and secondary indexes to enable these queries to execute with fewer resources. Deciding which indexes are clustered is one of the most important performance decisions you can make.
· Consider using Analysis Services to create a cube, against which your report queries should execute with far less CPU and IO. This is such an obvious technology to use that I would ask why you aren't using it already.
· Look at your largest tables. Is it possible to make them skinnier? That is, could you normalise the data so that large columns are held in satellite tables, and could you use more concise data types without loss of data?
· To a lesser extent, consider disabling parallel query (query decomposition) so that any one query will only use one thread. This will tend to mean that some queries take longer, but there should tend to be fewer total resources consumed.
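As a starting point for the first bullet, the SQL Server 2005 DMVs give a quick, server-side view of the most expensive statements without leaving a trace running. This is a generic sketch, not tied to any particular schema:

```sql
-- Top 10 statements by total CPU since their plans entered the cache.
-- total_worker_time is in microseconds; divide by 1000 for milliseconds.
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.total_logical_reads,
       qs.execution_count,
       SUBSTRING(st.text,
                 qs.statement_start_offset / 2 + 1,
                 (CASE WHEN qs.statement_end_offset = -1
                       THEN DATALENGTH(st.text)
                       ELSE qs.statement_end_offset
                  END - qs.statement_start_offset) / 2 + 1) AS statement_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```

Ordering by total_logical_reads instead of total_worker_time gives the IO-heavy view of the same data, which is often where missing indexes show up first.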
Hope that helps. If you want to see a reasonably sized, close-to-real-time DW with ad hoc querying, see my demonstrations at http://EasternMining.com.au/Sites/Demonstrations The Perfmon dashboard (with ad hoc queries) has about 300 records inserted every second and is available for standard and ad hoc querying. Having said that, the web logs dashboard is much easier for people to understand, and it is also close to real time.
Wednesday, June 03, 2009 7:41 AM
Hi Richard,
Thanks for your reply. If you need any further information that I missed here, feel free to ask. We have a target of 100,000 concurrent online users. We have already done a lot of work with SQL Profiler and other best-practice techniques in our SQL and application code to improve performance. After tuning, it only supports a maximum of 10,000 users at a time, which is far from our actual target of 100,000. It is actually an enterprise application that will train all users online at the same time; obviously, users will also need to see their online status/reports/results via reporting.
Our basic requirement is high availability with maximum performance. We are thinking about clustering SQL Server for high availability. How do you think we can achieve this target, and what design is required (hardware, software, and design strategy)?
Wednesday, June 03, 2009 9:52 AM
Hi Shamshad,
If you have performed a benchmark and have Perfmon/Profiler results, then you should be in a good position to know what the bottleneck is and what can be done to relieve it. What is the bottleneck limiting you to 10,000 users? Is it CPU, IO, memory, locking, network...? Each of these bottlenecks has hardware and architectural solutions (or, more accurately, relief).
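One quick, low-cost way to answer that question from the SQL Server side is the wait statistics DMV. A sketch (the excluded wait types are just a few common idle waits, not an exhaustive list):

```sql
-- Top waits since the server started (or since the stats were cleared):
-- a first cut at "what is this instance actually waiting on?"
-- High PAGEIOLATCH_* suggests IO, LCK_M_* suggests locking,
-- and a high signal_wait_time_ms share suggests CPU pressure.
SELECT TOP (10)
       wait_type,
       waiting_tasks_count,
       wait_time_ms,
       signal_wait_time_ms
FROM   sys.dm_os_wait_stats
WHERE  wait_type NOT IN ('SLEEP_TASK', 'LAZYWRITER_SLEEP',
                         'SQLTRACE_BUFFER_FLUSH', 'BROKER_TASK_STOP')
ORDER BY wait_time_ms DESC;
```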
Thursday, June 04, 2009 9:35 AM
Hi Richard,
All of these are bottlenecks: CPU, I/O, and network. To fix these issues, we want to share the user load among different servers, i.e. load balancing (web requests and database responses all need to be 100% up and running). I already described my thoughts/plan above and wanted to know the pros and cons of that architecture, if any. Please go through the questions and issues and guide me with your suggestions.
Saturday, July 04, 2009 6:53 AM
Hi Shamshad,
I find it hard to believe that, running on one classic ASP server with what you consider an optimized SQL Server, your issues are with SQL Server. First, I've never seen classic ASP handle more than, say, 50 concurrent users on one machine, much less 10,000. Classic ASP runs in a single-threaded apartment, which will always choke before an optimized SQL Server does. Also, I've never seen all three (CPU, IO, and network) be simultaneous bottlenecks on SQL Server.
Before you create a maintenance nightmare, you may want to take a closer look at ASP, and also post some real-world statistics about your SQL Server. Look at the Perfmon counters for ASP (especially "requests queued"). Also, what is the CPU usage of SQL Server? What is the IO? If both are high, you're probably missing some indexes. And how many batch requests/sec?