Get Size of DataReader Payload and reserve memory to avoid out-of-memory exception

  • Question

  • Hi,

    I am facing an issue where a database call can return millions of rows, resulting in an out-of-memory exception in the client app.

    1. What is the best way to determine how many megabytes are required to hold a DataReader's payload?
    2. What is the best way to ensure there is enough memory available for the payload, reserving it if required before reading through the DataReader?

    We typically receive the DataReader, iterate through the rows to build our object collection dynamically, and then bind this collection to our display grid.


    Even though the existence of God cannot be determined through reason, a person should wager as though God exists, because living life accordingly has everything to gain, and nothing to lose - Blaise Pascal
    Monday, May 23, 2011 9:53 PM

Answers

  • I'm afraid the out-of-memory issue is not caused by the DataReader: a DataReader only references an underlying stream buffered on the client, as the documentation says:

    You can use the ADO.NET DataReader to retrieve a read-only, forward-only stream of data from a database. Results are returned as the query executes, and are stored in the network buffer on the client until you request them using the Read method of the DataReader.

    Instead, the root cause is the object collection used to store the rows read from the DataReader: millions of rows make that collection very large, which is why you receive the out-of-memory exception.

    Do not store all the rows in a collection at once; as the data grows, you will eventually find that even 2 GB of memory is not enough.

    Choose paging instead, and put your effort into improving the performance and user experience of the paging.
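    A minimal C# sketch of the streaming pattern described above; the connection string, query, and column types here are hypothetical placeholders:

```csharp
// Sketch only: connection string, table, and column names are hypothetical.
using System;
using System.Data.SqlClient;

class StreamingExample
{
    static void Main()
    {
        using (var conn = new SqlConnection("Data Source=.;Initial Catalog=Sales;Integrated Security=true"))
        using (var cmd = new SqlCommand("SELECT OrderId, Total FROM Orders", conn))
        {
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                // Rows are pulled from the client-side network buffer one at
                // a time; nothing is accumulated, so memory use stays flat
                // no matter how many rows the query returns.
                while (reader.Read())
                {
                    ProcessRow(reader.GetInt32(0), reader.GetDecimal(1));
                }
            }
        }
    }

    static void ProcessRow(int orderId, decimal total)
    {
        // Handle one row at a time instead of adding it to a collection.
    }
}
```

    The out-of-memory exception only appears when every row is copied into a growing collection, as in the questioner's scenario.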


    Eric Yang [MSFT]
    MSDN Community Support | Feedback to us
    Get or Request Code Sample from Microsoft
    Please remember to mark the replies as answers if they help and unmark them if they provide no help.

    • Marked as answer by Elbino Monday, June 6, 2011 4:10 PM
    Wednesday, May 25, 2011 3:45 AM

All replies

  • The virtual memory space of a 32-bit process is 4 GB, of which only about 2 GB is available to the process itself. Migrate to 64-bit if your application needs to deal with a huge amount of data.

     

    On the other hand, do you really need to retrieve all of the records (millions of them) from the database at once? Why not show the data to the user page by page, so that each query fetches only one page of records?
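    A hedged sketch of server-side paging using the ROW_NUMBER pattern (appropriate for the SQL Server versions of this era; OFFSET/FETCH arrived later, in SQL Server 2012). The table and column names are hypothetical:

```csharp
// Sketch of server-side paging; table/column names are hypothetical.
using System.Data.SqlClient;

class OrderPager
{
    const string PagedQuery = @"
        SELECT OrderId, Total FROM (
            SELECT OrderId, Total,
                   ROW_NUMBER() OVER (ORDER BY OrderId) AS RowNum
            FROM Orders
        ) AS Numbered
        WHERE RowNum BETWEEN @First AND @Last;";

    public static void LoadPage(string connectionString, int pageIndex, int pageSize)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand(PagedQuery, conn))
        {
            // Only one page of rows crosses the wire per call.
            cmd.Parameters.AddWithValue("@First", pageIndex * pageSize + 1);
            cmd.Parameters.AddWithValue("@Last", (pageIndex + 1) * pageSize);
            conn.Open();
            using (var reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Bind just this page of rows to the grid.
                }
            }
        }
    }
}
```

    An index on the ORDER BY column (here OrderId) keeps each page query fast regardless of how deep the user pages.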


    Eric Yang [MSFT]

    Tuesday, May 24, 2011 5:57 AM
  • Thanks, Eric.

    You know how it is; migration is always a huge undertaking.

    We tried a page-by-page implementation, but our users complained that scrolling, filtering, sorting, and so on took too long, because all of those operations had to be performed on the server.

    So we started looking into determining the size of the result set and making sure the client has enough memory to hold it; if not, we would simply inform the user that page-by-page was the only option.
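    One rough way to make that up-front decision is to query the row count first and multiply by an estimated average row size. This is a hypothetical sketch, not the poster's actual code; the connection string, table name, and the assumed bytes-per-row figure are all placeholders that would need to be measured for a real application:

```csharp
// Hypothetical sketch: estimate the payload before loading everything.
// The 200-byte average row size and 512 MB budget are assumptions;
// measure your own in-memory row size for a realistic estimate.
using System;
using System.Data.SqlClient;

class PayloadEstimator
{
    const long AvgBytesPerRow = 200;          // assumed average in-memory row size
    const long MaxPayloadBytes = 512L << 20;  // assumed 512 MB client budget

    public static bool FitsInMemory(string connectionString)
    {
        using (var conn = new SqlConnection(connectionString))
        using (var cmd = new SqlCommand("SELECT COUNT(*) FROM Orders", conn))
        {
            conn.Open();
            long rowCount = Convert.ToInt64(cmd.ExecuteScalar());
            long estimatedBytes = rowCount * AvgBytesPerRow;
            Console.WriteLine("Estimated payload: {0:N0} bytes", estimatedBytes);
            return estimatedBytes <= MaxPayloadBytes;
        }
    }
}
```

    The caller can then load everything when FitsInMemory returns true and fall back to page-by-page otherwise.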


    Tuesday, May 24, 2011 6:53 PM
  • Thanks a million.
    Monday, June 6, 2011 4:09 PM