How to reduce a result set to every nth row

  • Question

  • User1755521097 posted

    Hi, for a certain query I can get up to a couple of million rows of time series data. These rows need to be displayed on a screen with a limited amount of pixels, so it makes sense to thin them out, or in other words to select only every nth row of the original set. Using SQL I'd do something like this (source: Stack Overflow):

    SELECT t.id, t.key
    FROM
    (
        SELECT id, key, ROW_NUMBER() OVER (ORDER BY key) AS rownum
        FROM datatable
    ) AS t
    WHERE t.rownum % n = 0   
    ORDER BY t.key

    The problem is, I would like to do this with LINQ but I just can't figure out how. There is no ROW_NUMBER method, and using the index overload in the Where clause gives me an error:

    Query.Where((s, i) => i % n == 0);

    Edit: more precisely

    Query.Select((s, i) => new { index = i, data = s } ).Where(s => s.index % n == 0).Select(s => s.data)

    The only solution that has worked for me so far is to materialize the whole result first (ToList) and filter it afterwards. This, however, slows down my server significantly.
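
    A workaround I'm considering is to keep the ROW_NUMBER filter in raw SQL and only map the thinned-out result back into objects. Below is a minimal sketch, assuming EF6's Database.SqlQuery; the DataPoint class and the Id/Key/Value column names are placeholders for the real schema:

    // Placeholder class matching the columns returned by the raw query.
    public class DataPoint
    {
        public int Id { get; set; }
        public DateTime Key { get; set; }
        public double Value { get; set; }
    }

    int n = 100; // keep roughly one row per available pixel

    // db is the DbContext. The ROW_NUMBER filter runs on the database server,
    // so only every nth row is transferred and materialized.
    var thinned = db.Database.SqlQuery<DataPoint>(
        @"SELECT t.Id, t.[Key], t.Value
          FROM (
              SELECT Id, [Key], Value,
                     ROW_NUMBER() OVER (ORDER BY [Key]) AS rownum
              FROM datatable
          ) AS t
          WHERE t.rownum % @n = 0
          ORDER BY t.[Key]",
        new System.Data.SqlClient.SqlParameter("@n", n))
        .ToList();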

    Tuesday, February 6, 2018 9:30 AM

All replies

  • User1400794712 posted

    Hi rstch,

    LINQ to Entities doesn't support the Where((s, i) => ...) and Select((s, i) => ...) overloads, so the Where clause causes an error. LINQ to Entities has many limitations; we'd better call ToList() to switch to LINQ to Objects, or add a column in the database that stores the row number.

    LINQ to Objects, not LINQ to Entities:

    var list = db.ModelName.ToList().Where((j, i) => i % 30 == 0);
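
    If materializing the whole list at once uses too much memory, AsEnumerable() is a slightly lighter variant: it switches to LINQ to Objects and streams rows as they are read instead of buffering them into a list first. Every row still has to travel from the database, though. A sketch, assuming ModelName has a Key column to order by (the column name is a placeholder):

    // AsEnumerable() keeps the query streaming; the indexed Where overload
    // then runs in memory on the client side.
    var list = db.ModelName
                 .OrderBy(m => m.Key)
                 .AsEnumerable()
                 .Where((row, i) => i % 30 == 0)
                 .ToList();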

    Best Regards,

    Daisy
    Thursday, February 8, 2018 8:18 AM
  • User1755521097 posted

    Hi Daisy

    Thanks for the answer. The way you describe it is how it is implemented now. As I wrote, it slows down my server very much, so it is not a reasonable solution.

    Cheers

    Thursday, February 8, 2018 8:28 AM
  • User-166373564 posted

    Hi, 

    As I wrote, it slows down my server very much, so it is not a reasonable solution.

    If the table has many rows with a large amount of data, using ROW_NUMBER() to get every Nth row will indeed be slow.
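
    If the table has a dense, gap-free integer key, another option is to filter on the key itself instead of a computed row number; the modulo is translated to SQL by LINQ to Entities, so the filtering stays on the server. A rough sketch, assuming an Id identity column without gaps (only an approximation once rows have been deleted); the DataTable set and Key column names are placeholders:

    int n = 30;
    // Id % n picks roughly every nth row; the Where is translated to SQL,
    // so only the selected rows are returned by the database.
    var thinned = db.DataTable
                    .Where(s => s.Id % n == 0)
                    .OrderBy(s => s.Key)
                    .ToList();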

    Regards,

    Angie  

    Thursday, March 1, 2018 8:17 AM