Timeout error

  • Question

  • Thanks in advance.
    Actually I was trying to load an XML feed from a remote location. That XML document contains a lot of data.
    I got an unhandled exception: "Timeout expired". The code I used is as follows:

    DataSet ds = new DataSet();

    I want to know if there is any way to increase the timeout to overcome this problem, or whether there is an alternative approach.
    Please provide appropriate details if anyone knows a solution.
    Friday, May 30, 2008 12:13 PM

All replies

  • In my opinion this points to a flawed application design. What would be the reason to load huge sets of data remotely? Most likely you do it in order to process all the records, and in that case the application should do the processing on the server side instead of transferring everything to the client. If your application displays data from the loaded DataSet, you should load the data dynamically, when clients actually need to see a specific subset, rather than loading all the records at once. You might increase the timeout of the web server, but that will not solve the actual problem: if your DataSet grows, you will need to increase the timeout again. I would focus on changing the design so that data is loaded "on demand".


    Saturday, May 31, 2008 1:32 AM
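    For reference, the timeout the reply above mentions can also be raised on the client side when fetching the XML yourself. This is only a sketch, not the poster's code: the URL is a placeholder, and the 5-minute value is an arbitrary example; `HttpWebRequest.Timeout` defaults to 100 seconds.

    ```csharp
    using System.Data;
    using System.IO;
    using System.Net;

    class RemoteXmlLoader
    {
        // Sketch: load remote XML into a DataSet with a longer request timeout.
        public static DataSet Load(string url)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
            request.Timeout = 300000;          // default is 100000 ms (100 s)
            request.ReadWriteTimeout = 300000; // timeout for reading the response stream

            DataSet ds = new DataSet();
            using (WebResponse response = request.GetResponse())
            using (Stream stream = response.GetResponseStream())
            {
                ds.ReadXml(stream); // infer schema and rows from the remote XML
            }
            return ds;
        }
    }
    ```

    As the reply notes, this only postpones the failure as the data grows; the design changes below address the underlying problem.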
  • Thanks in advance.

    Actually I have been trying to load the remote XML and then store it on the local system. From the local system I will be accessing the XML file (sample.xml).


    DataSet ds = new DataSet();


    If there is any other good way, please tell me.


    Saturday, May 31, 2008 5:07 AM
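    A minimal sketch of that download-then-read-locally approach, assuming a WebClient download is acceptable; the URL and local path here are placeholders, not the poster's actual values:

    ```csharp
    using System.Data;
    using System.Net;

    class FeedCache
    {
        // Download the remote XML once, then read the cached local copy.
        public static DataSet DownloadAndLoad(string url, string localPath)
        {
            using (WebClient client = new WebClient())
            {
                client.DownloadFile(url, localPath); // the one remote round trip
            }
            DataSet ds = new DataSet();
            ds.ReadXml(localPath); // subsequent reads hit only the local file
            return ds;
        }
    }
    ```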
  • If you don't worry about the readability of the file, I'd recommend BinaryFormatter to you.


    You could visit the sample to see how to use it: it serializes and deserializes a Hashtable, and a DataSet works the same way. It would be much faster than XML.


    Wednesday, June 11, 2008 5:58 AM
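    A sketch of the BinaryFormatter idea applied to a DataSet (the file path is a placeholder). Setting `RemotingFormat` to `Binary` makes the DataSet serialize its rows in binary form rather than as embedded XML; note that later .NET versions disable BinaryFormatter for security reasons, so this fits the .NET Framework of this era only:

    ```csharp
    using System.Data;
    using System.IO;
    using System.Runtime.Serialization.Formatters.Binary;

    class DataSetCache
    {
        public static void Save(DataSet ds, string path)
        {
            ds.RemotingFormat = SerializationFormat.Binary; // binary row payload, not XML
            BinaryFormatter formatter = new BinaryFormatter();
            using (FileStream fs = new FileStream(path, FileMode.Create))
            {
                formatter.Serialize(fs, ds);
            }
        }

        public static DataSet Load(string path)
        {
            BinaryFormatter formatter = new BinaryFormatter();
            using (FileStream fs = new FileStream(path, FileMode.Open))
            {
                return (DataSet)formatter.Deserialize(fs);
            }
        }
    }
    ```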

  • You can compress the XML file before downloading it, if you have control of the server side. There are several ways to compress the file: you can use the compression streams (GZipStream or DeflateStream) under System.IO.Compression, or third-party software like 7-Zip.


    Otherwise, you may want to use WebClient to download the file in a manageable way.

    Wednesday, June 11, 2008 1:32 PM
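    A sketch of that server-side compression with GZipStream; the file names are placeholders, and a manual buffer loop is used because `Stream.CopyTo` only arrived in .NET 4:

    ```csharp
    using System.IO;
    using System.IO.Compression;

    class XmlCompressor
    {
        // Compress inputPath into a .gz file at outputPath.
        public static void Compress(string inputPath, string outputPath)
        {
            using (FileStream input = File.OpenRead(inputPath))
            using (FileStream output = File.Create(outputPath))
            using (GZipStream gzip = new GZipStream(output, CompressionMode.Compress))
            {
                byte[] buffer = new byte[8192];
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                {
                    gzip.Write(buffer, 0, read); // compressed bytes flow into output
                }
            }
        }
    }
    ```

    XML compresses very well because of its repetitive tags, so this can shrink the transfer considerably.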
  • Actually I think the timeout occurs because the time spent processing the file is too long, not the time spent downloading it. So serializing and then compressing would make it worse. To process the file faster, you could use BinaryFormatter.


    Thursday, June 12, 2008 5:48 AM
  • The good way would be to query data in chunks. For example, if you expect to get 1,000,000 rows, you could query the data in chunks of 10,000 or fewer. Everything depends on your scenario on the client side where the application is running. If the application does not require all the rows at the same time, then you could load only the rows it requires initially and then request the next set of rows for further processing, and so on.


    Tuesday, June 17, 2008 9:54 AM
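    If the data comes from SQL Server, the chunked loading described above can be sketched with the `SqlDataAdapter.Fill` overload that takes a start record and a maximum record count; the connection string, table, and page size here are placeholders. One caveat: this overload still streams the skipped rows from the server and discards them, so for very large tables true server-side paging (e.g. with ROW_NUMBER) is preferable.

    ```csharp
    using System.Data;
    using System.Data.SqlClient;

    class ChunkedLoader
    {
        public static void ProcessInChunks(string connectionString)
        {
            const int pageSize = 10000;
            int start = 0, fetched;
            using (SqlDataAdapter adapter = new SqlDataAdapter(
                "SELECT * FROM Orders ORDER BY OrderID", connectionString))
            {
                do
                {
                    DataSet ds = new DataSet();
                    fetched = adapter.Fill(ds, start, pageSize, "Orders");
                    // ... process ds.Tables["Orders"] here, one page at a time ...
                    start += fetched;
                } while (fetched == pageSize); // a short page means we reached the end
            }
        }
    }
    ```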