Website that works a lot with the DB - the best way

  • Question

  • User-1395793064 posted

    Hi,

    I'm nearing the end of writing a big website that works heavily with the database.
    On every request, a page runs 5-12 different queries.

    What I did was build an execute class on top of the MySqlConnector API: it takes a string containing the SQL plus the parameters, executes the query, and returns a DataTable. On top of that I have a class that holds all the functions that call into the execute class. For example, that class has getAllProducts, getOldestProducts, etc., and each one contains its own SQL statement.
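    The helper the poster describes might look roughly like this (a minimal sketch; the class name, method name, and connection string are illustrative, and it assumes the MySql.Data.MySqlClient types from MySQL Connector/Net):

    ```csharp
    using System.Data;
    using MySql.Data.MySqlClient;

    public static class DbExecutor
    {
        // Illustrative placeholder; the real string would come from config.
        private const string ConnString = "server=localhost;database=shop;uid=app;pwd=secret";

        // Opens a connection, runs one parameterized query, and returns the
        // result as a DataTable. The using blocks close the connection and
        // command even if the query throws.
        public static DataTable Execute(string sql, params MySqlParameter[] parameters)
        {
            using (var conn = new MySqlConnection(ConnString))
            using (var cmd = new MySqlCommand(sql, conn))
            {
                cmd.Parameters.AddRange(parameters);
                var table = new DataTable();
                using (var adapter = new MySqlDataAdapter(cmd))
                {
                    adapter.Fill(table); // opens/closes the connection itself
                }
                return table;
            }
        }
    }
    ```

    Passing parameters as MySqlParameter objects rather than concatenating them into the SQL string also protects against SQL injection.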

    Recently I watched the LINQ video series (How Do I with LINQ, on this site) and it looked good.
    As I see it, my approach is very good because it just opens a connection, executes the SQL, gets the result, and closes the connection.

    The question is: performance-wise, what is the best way to build this site? Is it the approach I have used all along? Is it LINQ? Or is working with DataSets/typed DataSets the preferred way?

    Thanks.

    Saturday, May 22, 2010 11:14 AM

Answers

  • User-2004844803 posted

    Hi,

    This is typically the kind of question you should raise at the beginning of a big project, not at the end. I don't think it matters that much whether you use LINQ or DataTables/DataSets. If you make that many calls for each page, you need to reduce both the creation of pages and the connections to your DB. Still, there is much you can do at this point, but it depends on your data and its state.

    First, look at how much of the data you can cache, and for how long. Caching it for, say, 10 minutes (or using a SQL dependency) can do a lot for you.
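    A sketch of that idea using the ASP.NET application cache (the GetAllProducts name and the 10-minute window are illustrative, and LoadAllProductsFromDb stands in for the poster's real query method):

    ```csharp
    using System;
    using System.Data;
    using System.Web;

    public static class ProductCache
    {
        public static DataTable GetAllProducts()
        {
            // Try the application-level cache first; every hit skips the DB.
            var cached = (DataTable)HttpRuntime.Cache["AllProducts"];
            if (cached != null)
                return cached;

            DataTable fresh = LoadAllProductsFromDb(); // hypothetical DB call
            // Cache for 10 minutes with absolute expiration.
            HttpRuntime.Cache.Insert("AllProducts", fresh, null,
                DateTime.UtcNow.AddMinutes(10),
                System.Web.Caching.Cache.NoSlidingExpiration);
            return fresh;
        }

        private static DataTable LoadAllProductsFromDb()
        {
            // Placeholder for the real query.
            return new DataTable();
        }
    }
    ```

    Passing a SqlCacheDependency as the third argument instead of null would evict the entry when the underlying table changes.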

    Second, use the output cache to cache your pages once they are created. If you can't cache the entire page, cache parts of it. Remember that you can always cache user controls.
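    Output caching is enabled declaratively; for example (a sketch, with an illustrative 60-second duration):

    ```aspx
    <%-- Cache the rendered page for 60 seconds, one copy per query string. --%>
    <%@ OutputCache Duration="60" VaryByParam="*" %>

    <%-- Placed in a user control's .ascx file instead, the same directive
         caches just that fragment of the page. --%>
    ```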

    If you have many web servers sharing the load, another suggestion is to use a shared in-memory cache (like memcached). That gives you a way to share cached data between servers and applications, and it might also reduce your connections.

    And last, consider using a domain model. It might be a little tricky to introduce one this late in your work, I know, but consider it anyway. That way you do all your work against an in-memory model and persist/load your data only when you really need to.

    Hope this helps.

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Saturday, May 22, 2010 3:45 PM
  • User1972336035 posted

    If you are using .NET Framework 2.0 or above, you can take advantage of async calls to the methods that query data on your page. These methods can run in parallel or in sequence within the page life cycle, which can significantly improve your page performance without much change to your existing logic. Just check which methods can be executed in parallel and which need to be sequenced, and implement the logic accordingly.

    Take out all the SQL query methods that take a long time to execute and run them asynchronously.

    Here is an example. My page has two DataGrids, both of which can load in parallel, and I will show how to use the async API.

    Step 0: In the web.config file, create a connection string that will handle the async calls to the DB separately. Notice that I have added Async=true to the connection string:

    <add name="IKL" connectionString="Data Source=;Integrated Security=False;Initial Catalog=;Connection Timeout=240;User ID=;password=;Async=true" providerName="System.Data.SqlClient"/>

    Step 1: In the .aspx page, add Async="true" to the @Page directive. Also add the two DataGrids, grd_Product and grd_SubProduct:

    <td valign="top"><asp:DataGrid ID="grd_Product" runat="server"></asp:DataGrid></td> 

    <td width="20">---&nbsp;</td>

    <td valign="top"><asp:DataGrid ID="grd_SubProduct" runat="server"></asp:DataGrid></td>

    Step 2: In the Page_Load event, call the AddOnPreRenderCompleteAsync method. The complete code is below; it's very easy.

    using System;
    using System.Configuration;
    using System.Data.SqlClient;
    using System.Web;

    public partial class _Default : System.Web.UI.Page
    {
        // Give each async query its own connection, command, and reader so the
        // two operations cannot overwrite each other's state while running in
        // parallel (sharing one command field would make the second
        // EndExecuteReader call use the wrong command).
        private SqlConnection _conn1;
        private SqlConnection _conn2;
        private SqlCommand _cmd1;
        private SqlCommand _cmd2;
        private SqlDataReader _rd1;
        private SqlDataReader _rd2;

        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                AddOnPreRenderCompleteAsync(new BeginEventHandler(BeginAsyncOperation1), new EndEventHandler(EndAsyncOperation1));
                AddOnPreRenderCompleteAsync(new BeginEventHandler(BeginAsyncOperation2), new EndEventHandler(EndAsyncOperation2));
            }
        }

        IAsyncResult BeginAsyncOperation1(object sender, EventArgs e, AsyncCallback cb, object state)
        {
            string sConn = ConfigurationManager.ConnectionStrings["IKL"].ConnectionString;
            _conn1 = new SqlConnection(sConn);
            _conn1.Open();

            // The artificial delay demonstrates that the second query is not blocked.
            _cmd1 = new SqlCommand("Waitfor Delay '00:00:03';select * from AREA", _conn1);
            return _cmd1.BeginExecuteReader(cb, state);
        }

        void EndAsyncOperation1(IAsyncResult ar)
        {
            _rd1 = _cmd1.EndExecuteReader(ar);
        }

        IAsyncResult BeginAsyncOperation2(object sender, EventArgs e, AsyncCallback cb, object state)
        {
            string sConn = ConfigurationManager.ConnectionStrings["IKL"].ConnectionString;
            _conn2 = new SqlConnection(sConn);
            _conn2.Open();

            _cmd2 = new SqlCommand("select * from All_LOC", _conn2);
            return _cmd2.BeginExecuteReader(cb, state);
        }

        void EndAsyncOperation2(IAsyncResult ar)
        {
            _rd2 = _cmd2.EndExecuteReader(ar);
        }

        protected void Page_PreRenderComplete(object sender, EventArgs e)
        {
            grd_Product.DataSource = _rd1;
            grd_Product.DataBind();

            grd_SubProduct.DataSource = _rd2;
            grd_SubProduct.DataBind();
        }

        public override void Dispose()
        {
            // Close the readers and connections once the page is done with them.
            if (_rd1 != null) _rd1.Close();
            if (_rd2 != null) _rd2.Close();
            if (_conn1 != null) _conn1.Close();
            if (_conn2 != null) _conn2.Close();
            base.Dispose();
        }
    }

    • Marked as answer by Anonymous Thursday, October 7, 2021 12:00 AM
    Sunday, May 23, 2010 12:08 AM

All replies

  • User-952121411 posted

    In every call a page runs 5-12 different queries.
     

    For example in the class I have getAllProducts, getOldestProducts etc...

    This late in the game, you want to find less obtrusive ways of improving performance that don't mean overhauling the entire project (unless you are required to do that). You will want to home in on the queries that return data reusable between pages, controls, or even at the application level between clients. Take GetAllProducts(), for example: in your application, do the returned products change often, or is this mostly static data? If the latter, you can cache the data either at the application level or per user via Session storage. The following links may give you some ideas for your current setup:

    ASP.NET Caching: Techniques and Best Practices:

    http://msdn.microsoft.com/en-us/library/aa478965.aspx

    ASP.NET State Management Recommendations:

    http://msdn.microsoft.com/en-us/library/z1hkazw7.aspx

    ASP.NET Session State Overview:

    http://msdn.microsoft.com/en-us/library/ms178581.aspx

    Monday, May 24, 2010 2:46 PM
  • User-1874166561 posted

    You should also use the SQL Profiler tool and look at the SQL execution plans in SQL Server Management Studio to verify that your queries run as optimally as possible.

    You mentioned that you are passing a SQL string that gets executed. You should check out sp_executesql with explicit parameters because, as far as I know, SQL Server will cache the execution plan for such queries upon first execution. So if your app submits the same queries over and over again with differing parameters, they will execute almost or just as fast as a stored procedure, avoiding the overhead of parsing and building execution plans.
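    A parameterized call via sp_executesql looks like this (a sketch; the table and column names are made up):

    ```sql
    -- The plan for this statement is compiled once and reused for every
    -- call, regardless of the @oldest value passed in.
    EXEC sp_executesql
        N'SELECT ProductId, Name FROM Products WHERE CreatedOn < @oldest',
        N'@oldest datetime',
        @oldest = '2010-01-01';
    ```

    Concatenating the date into the SQL string instead would produce a distinct statement text per value, and each one would be compiled separately.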

    Then make sure that your database is properly indexed. I have come across numerous apps that performed poorly because the app logic submitted a few hundred or thousand queries per page and each query took several milliseconds. Obviously, those apps and DBs were created by developers who did not know much about databases ;-)

    Simply providing the proper indexes to avoid table and index scans already gave a decent performance boost. The next step was to optimize how the app queries (e.g., for tree hierarchies, query all nodes on one level rather than issuing individual queries for the children of each node). Your 5-12 queries don't sound like that much. Unless they are long-running and your app needs to be extremely fast, you'll likely be OK.
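    For example, a query that filters and sorts on the same columns benefits from a matching index (hypothetical table and column names):

    ```sql
    -- Without an index on these columns,
    --   SELECT ... FROM Products WHERE CategoryId = @id ORDER BY CreatedOn
    -- forces a full scan plus a sort; with it, SQL Server can seek directly
    -- and read the rows already in order.
    CREATE INDEX IX_Products_CategoryId_CreatedOn
        ON Products (CategoryId, CreatedOn);
    ```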

    The app-level method by which you perform your queries is USUALLY irrelevant for performance. The delay is in the overhead of executing a query at all, and in how much time the query itself consumes.

    Christoph

    Tuesday, May 25, 2010 3:43 PM