Best approach for this scenario

  • Question

  • Hi to all,

May I know what the best approach is for this?

    I have a complex CTE query with many inner joins and left joins... Our requirement is to do paging, so we use Row_Number() to page through the data, displaying 20 rows at a time. We also have to display the total row count, which is around 50,000 plus.

    What I did is use the same query (without Row_Number), but with count(*) instead, to get the total number of rows, which is passed to a variable. Below that count query is the CTE query that I use to display the data.

    May I know what the best approach for this is, performance-wise?

    <Count query>

    <CTE query with paging>
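
    For illustration, the general shape of the two queries is something like this (the table, column, and variable names are made-up placeholders, not our actual query):

      DECLARE @total_rows  int,
              @some_filter int = 1,   -- placeholder filter value
              @start       int = 21;  -- first row of the requested page

      -- Count query: same joins and filters, no paging; the total
      -- row count is captured in a variable.
      SELECT @total_rows = COUNT(*)
      FROM   dbo.base_table t
      JOIN   dbo.child_table c ON c.base_id = t.base_id
      WHERE  t.some_filter = @some_filter;

      -- Paging query: same joins and filters again, with row_number()
      -- used to pick out one page of 20 rows.
      ;WITH numbered AS
         (SELECT t.col1, c.col2,
                 row_number() OVER (ORDER BY t.col1) AS row_no
          FROM   dbo.base_table t
          JOIN   dbo.child_table c ON c.base_id = t.base_id
          WHERE  t.some_filter = @some_filter)
      SELECT col1, col2
      FROM   numbered
      WHERE  row_no BETWEEN @start AND @start + 19
      ORDER  BY row_no;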


    3 stars and a sun

    Saturday, May 11, 2013 11:31 AM

All replies

  • One option is to insert all the data into a session table, and then feed the paging from that table. That way, serving a new page is quick, as you can do:

     SELECT ...
     FROM   tbl
     WHERE  session_id = @session_id
       AND  rowno BETWEEN @pagestart AND @pagestart + @pagesize - 1

    But you will need to implement a way to age out the results from old searches.
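
    A minimal sketch of such a cleanup, assuming the session table also carries a created_date column defaulted to GETDATE() (that column is my assumption, and the two-hour cutoff is just an example):

      -- Periodically remove saved results older than two hours,
      -- e.g. from an Agent job. created_date is an assumed column.
      DELETE tbl
      WHERE  created_date < DATEADD(HOUR, -2, GETDATE())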


    Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
    Saturday, May 11, 2013 11:43 AM
  • If we query all the data without the paging, it takes around 5-8 seconds, so inserting into a temp table is not an option for us.


    3 stars and a sun

    Saturday, May 11, 2013 11:56 AM
  • There is a difference between inserting the data into a table and returning the data to a client; the latter includes network overhead etc. You should try this before you dismiss my suggestion.

    But of course, adding 50000 rows to a session table incurs some overhead, particularly when the user only looks at the first couple of pages.

    Anyway, you asked for advice on a query I don't see, on tables I don't see, etc. You cannot expect very exact advice in this case.
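
    If you want to measure it, something along these lines shows the insert cost in isolation (tbl1, ordercol1, col1 and col2 are placeholders for your real tables and columns):

      SET STATISTICS TIME ON

      -- Time only the insert into the session table,
      -- without returning any rows to the client.
      INSERT tbl (session_id, rowno, col1, col2)
         SELECT @session_id,
                row_number() OVER (ORDER BY ordercol1),
                col1, col2
         FROM   tbl1

      SET STATISTICS TIME OFF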


    Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
    Saturday, May 11, 2013 1:14 PM
  • Hi,

    Could you give me an example? Maybe I just misunderstood you. Thanks.


    3 stars and a sun

    Sunday, May 12, 2013 1:50 AM
  • CREATE TABLE saved_searches
        (session_key uniqueidentifier NOT NULL,
         row_no      int              NOT NULL,
         col1        ...
         col2        ...
         CONSTRAINT pk_saved_searches PRIMARY KEY (session_key, row_no))

    go
    INSERT saved_searches (session_key, row_no, col1, col2, ....)
       SELECT @session_key,
              row_number() OVER (ORDER BY ordercol1, ordercol2, ...),
              col1, col2, ...
       FROM   tbl1
       ...

    Now the search has been executed and the result has been saved. To retrieve the data you do:

      SELECT col1, col2, ...
      FROM   saved_searches
      WHERE  session_key = @session_key
        AND  row_no BETWEEN @start AND @start + @page_size - 1
      ORDER  BY row_no
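
    And since row_no is a dense sequence starting at 1, the total row count you said you need to display can also be read cheaply from the saved result, for instance:

      SELECT @total_rows = MAX(row_no)
      FROM   saved_searches
      WHERE  session_key = @session_key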

    Moving to a new page will be quick, but the initial load could of course take some time. That is difficult to tell from where I sit.

    Another option to consider is to read bigger batches into the client directly, say 1000 rows at a time, and then serve the user 20 at a time. In that case, you only need a database roundtrip if the user moves further into the result set.
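
    A sketch of such a batched fetch, again with a numbered CTE (the names are placeholders); the client caches the batch and pages through it locally:

      ;WITH numbered AS
         (SELECT col1, col2,
                 row_number() OVER (ORDER BY ordercol1) AS row_no
          FROM   tbl1)
      SELECT col1, col2, row_no
      FROM   numbered
      WHERE  row_no BETWEEN @batch_start AND @batch_start + 999
      ORDER  BY row_no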

    Note that since you asked a generic question, you can only get generic answers. The approach you currently have may be the best one for your actual case.

    By the way, I personally hate sites that only give me 20 items at a time, so that I have to page back and forth.


    Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
    • Proposed as answer by Kalman Toth Sunday, May 19, 2013 5:08 PM
    • Marked as answer by Allen Li - MSFT Tuesday, May 21, 2013 7:49 AM
    Sunday, May 12, 2013 9:57 AM