Handling a Large Chunk of Data in SQL Session State

  • Question

  • Hi,

    I have a business entity in an ASP.NET web app that holds a large chunk of data retrieved from a SQL data source. The entity is bound to a GridView, and I have to do manual filtering and sorting of the data. In this scenario, can I keep the whole chunk of data (say 5,000 records) in SQL session state and do the manipulations in the code-behind without calling the data tier? Also, each user's session lasts eight hours, and up to 100 users may be using the application.

    Which is the best way to handle this scenario: keep all the data in SQL session state and do the manipulations there, or call the data tier to load the data each time and do the manipulations then? Which would be the better solution in terms of performance and best practice?
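
    To make the question concrete, the session-state option I have in mind would look roughly like the sketch below. The page class, LoadSalesData and the filter/sort strings are just placeholders, and it assumes web.config is set up with <sessionState mode="SQLServer" timeout="480" ... />.

    ```csharp
    using System;
    using System.Data;
    using System.Web.UI;

    // Hypothetical page; GridView1 is declared in the matching .aspx markup.
    public partial class SalesPage : Page
    {
        protected void Page_Load(object sender, EventArgs e)
        {
            if (!IsPostBack)
            {
                // One call to the data tier; the full result (~5,000 rows) is parked in session state.
                Session["SalesData"] = LoadSalesData();
                BindGrid(null, null);
            }
        }

        // Re-bind on every sort/filter postback from the copy in session,
        // without going back to the data tier.
        private void BindGrid(string filter, string sort)
        {
            var data = (DataTable)Session["SalesData"];
            var view = new DataView(data)
            {
                RowFilter = filter ?? string.Empty,   // e.g. "Region = 'North'"
                Sort = sort ?? string.Empty           // e.g. "Amount DESC"
            };
            GridView1.DataSource = view;
            GridView1.DataBind();
        }

        private DataTable LoadSalesData()
        {
            // Placeholder for the real data-tier call.
            throw new NotImplementedException();
        }
    }
    ```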

    Let me know if more information is required.

    Thanks in Advance.

    Monday, January 24, 2011 3:53 AM

All replies

  • If those records are any sort of significant size at all you might have a problem. ASP.NET isn't a great choice for this sort of reporting scenario with largish datasets and significant numbers of concurrent users.

    If you put too much data into session it can cause really bad problems. The worker process recycling when memory runs out is the most obvious one, and it's really bad: everyone loses their session. On the way there you can also suffer from weird loss-of-state problems. You don't want to go there, mate.

    If the data at least overlaps between users then you could store it in a keyed cache. So, e.g., if 30 of the concurrent users are all going to be looking at Team A's sales data, you can stick it all in one shared TeamASales object in cache. Remember though that when you then take some subset of data out of there for your view, that subset takes up another chunk of your server's memory.
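
    Rough sketch of what I mean, not tested - the key name, loader and expiry are illustrative only:

    ```csharp
    using System;
    using System.Data;
    using System.Web;
    using System.Web.Caching;

    public static class SalesCache
    {
        // One shared copy per team in the ASP.NET cache, instead of one copy per user session.
        public static DataTable GetTeamSales(string teamId)
        {
            string key = "Sales_" + teamId;                  // e.g. "Sales_TeamA"
            var data = HttpRuntime.Cache[key] as DataTable;
            if (data == null)
            {
                data = LoadTeamSalesFromDataTier(teamId);    // placeholder for the real data-tier call
                HttpRuntime.Cache.Insert(
                    key,
                    data,
                    null,                                    // no cache dependency
                    Cache.NoAbsoluteExpiration,
                    TimeSpan.FromMinutes(30));               // sliding expiry - drop it if nobody is using it
            }
            return data;
        }

        private static DataTable LoadTeamSalesFromDataTier(string teamId)
        {
            throw new NotImplementedException();             // placeholder
        }
    }
    ```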

    You could stick this out of process and into another server's memory to reduce load.

    Maybe you want to consider something running on the client. A Windows app - WinForms or WPF maybe, or Silverlight, or Access... Excel even. Users could pull all the data across to the client and then play with it however they like.

    Monday, January 24, 2011 10:39 AM
  • Thanks Andy for your quick response. The data would be around 2000 records * 20 columns. So, as you suggest, shall I call the data tier each time the user does any sorting/filtering of the data, instead of keeping it in session and doing the manipulations there?
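
    In other words, something roughly like this on every sort/filter postback - the query, column and connection-string names are only examples:

    ```csharp
    using System;
    using System.Configuration;
    using System.Data;
    using System.Data.SqlClient;

    public static class SalesData
    {
        // Hit the data tier on each request instead of holding 2000 x 20 cells in session.
        private static readonly string[] AllowedSortColumns = { "Region", "Amount", "SaleDate" };

        public static DataTable GetSales(string region, string sortColumn)
        {
            // ORDER BY cannot be parameterised, so whitelist the sort column.
            if (Array.IndexOf(AllowedSortColumns, sortColumn) < 0)
                sortColumn = "SaleDate";

            string sql = "SELECT Region, Amount, SaleDate FROM dbo.Sales " +
                         "WHERE (@region IS NULL OR Region = @region) " +
                         "ORDER BY " + sortColumn;

            var table = new DataTable();
            string connStr = ConfigurationManager.ConnectionStrings["SalesDb"].ConnectionString;
            using (var conn = new SqlConnection(connStr))
            using (var cmd = new SqlCommand(sql, conn))
            using (var adapter = new SqlDataAdapter(cmd))
            {
                cmd.Parameters.AddWithValue("@region", (object)region ?? DBNull.Value);
                adapter.Fill(table);   // Fill opens and closes the connection itself
            }
            return table;
        }
    }
    ```
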
    Monday, January 24, 2011 5:48 PM
  • If the other alternatives won't work for you, then yes.

    2000 records is a lot of data for someone to be browsing through.

    Can you not roll the base data up into a summary somehow and at least reduce part of the load?
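
    For example, a GROUP BY rollup at the database end might do it, so the page only ever binds a handful of summary rows rather than 2000 detail rows - the table and column names below are made up:

    ```csharp
    public static class SalesSummary
    {
        // One row per region per month instead of one row per sale.
        public const string SummarySql =
            "SELECT Region, " +
            "       DATEADD(month, DATEDIFF(month, 0, SaleDate), 0) AS SaleMonth, " +
            "       COUNT(*)    AS SaleCount, " +
            "       SUM(Amount) AS TotalAmount " +
            "FROM dbo.Sales " +
            "GROUP BY Region, DATEADD(month, DATEDIFF(month, 0, SaleDate), 0) " +
            "ORDER BY Region, SaleMonth;";
    }
    ```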

    Wednesday, January 26, 2011 8:08 AM