Thursday, January 11, 2007 1:32 AM
This is also posted in the .NET Framework Data Access and Storage forum.
Well, I'm pulling my hair out trying to figure out the following issue.
This example performs the same way with SqlConnection/OdbcConnection/OleDbConnection etc.
I'm connecting to a SQL Server 2000 server. My development is done remotely, so I have a PPTP VPN connection between my development box and the DB server.
I'm running C# code in ASP.NET (.aspx) pages.
This code is being run against a very large table (I tried various other large tables with the same result).
I've isolated the problem to the DataReader.Close() call. It just takes a VERY long time to return. The more data there is in the result set of the SELECT statement, the longer it takes. If I add TOP 10000 to the SELECT statement, it takes 5-6 seconds to close, and it gets progressively worse from there.
While Close() is hung, I checked my VPN throughput (DB access is the only data flowing), and I can see constant download activity until Close() returns. What is going on? Why would Close() force a fetch of the whole result set? Shouldn't Close() be able to release the result set on the SQL server by just dropping it?
Connection = new SqlConnection("HIDDEN");
Connection.Open();
Command = new SqlCommand("SELECT * FROM AlarmLog", Connection);
//The execute is very quick (< 1 second)
Reader = Command.ExecuteReader();
//The Close takes a VERY long time
Reader.Close();
Thursday, January 11, 2007 1:56 AM
The notes on the Close method for SqlDataReader state:
The Close method fills in the values for output parameters, return values and RecordsAffected, increasing the time that it takes to close a SqlDataReader that was used to process a large or complex query. When the return values and the number of records affected by a query are not significant, the time that it takes to close the SqlDataReader can be reduced by calling the Cancel method of the associated SqlCommand object before calling the Close method.
Try calling Cancel before closing to see if that helps.
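Applied to the code in the original post, the pattern looks like this. This is a minimal sketch; the connection string, the AlarmLog table, and the 50-row cutoff are taken from or assumed for this thread:

```csharp
using System.Data.SqlClient;

// Sketch: cancel the command before closing the reader so Close()
// does not drain the remaining rows over the (slow) VPN link.
using (SqlConnection conn = new SqlConnection("HIDDEN"))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand("SELECT * FROM AlarmLog", conn);
    SqlDataReader reader = cmd.ExecuteReader();

    // Read only as many rows as you actually need (50 here, as an example).
    for (int i = 0; i < 50 && reader.Read(); i++)
    {
        // process reader[0], reader[1], ...
    }

    // Cancel tells the server to discard the rest of the result set,
    // so Close() returns without fetching the remaining rows.
    cmd.Cancel();
    reader.Close();
}
```

The trade-off, as the documentation quote above notes, is that after Cancel the output parameters, return value, and RecordsAffected are not reliable.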
Thursday, January 11, 2007 2:17 PM
Yeah, it helps; thanks for the tip. What baffles me, though, is that this behavior also seems to affect calls like:
OdbcDataAdapter Adapter = new OdbcDataAdapter(szSQL,Connection);
DataSet ds = new DataSet();
int iResult = Adapter.Fill(ds,0,50,"DEFAULT");
The Fill method suffers from the same delay even when used to populate only 50 rows of the DataSet. I suspect that internally the method uses a DataReader and closes it without calling Cancel.
How can one possibly use this method (via the adapter) to display data in a DataGrid when the result set comes from a query against a large table?
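One workaround, if you want to keep the Cancel trick while still ending up with a DataTable, is to skip the adapter and fill the table from a reader by hand. This is a sketch only; `szSQL`, the "HIDDEN" connection string, and the 50-row page size are carried over from the snippets in this thread:

```csharp
using System.Data;
using System.Data.SqlClient;

// Sketch: manually fill a DataTable from the first 50 rows of a reader,
// then Cancel so that Close/Dispose does not drain the full result set.
DataTable table = new DataTable("DEFAULT");
using (SqlConnection conn = new SqlConnection("HIDDEN"))
{
    conn.Open();
    SqlCommand cmd = new SqlCommand(szSQL, conn);
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        // Build the table's columns from the reader's schema.
        for (int c = 0; c < reader.FieldCount; c++)
            table.Columns.Add(reader.GetName(c), reader.GetFieldType(c));

        // Copy only the first 50 rows.
        object[] values = new object[reader.FieldCount];
        for (int i = 0; i < 50 && reader.Read(); i++)
        {
            reader.GetValues(values);
            table.Rows.Add(values);
        }

        cmd.Cancel(); // discard the rest before the reader is closed
    }
}
```

This avoids the hidden Close() inside Adapter.Fill, at the cost of losing the adapter's schema-mapping conveniences.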
Friday, January 12, 2007 11:43 PM
Moving to the .NET Framework Data Access and Storage forum.
Sunday, January 14, 2007 10:30 PM (Moderator)
The trick is to ask for the rows you need and no more. Easier said than done, but asking for the phone book when all you need is a local dentist listing is an all-too-common design problem. Fetching only what you need lets the query engine work to its potential (it can use indexes efficiently instead of performing a table scan), reduces network traffic, reduces load on the server cache (which otherwise flushes perfectly good data to fetch rows the user does not need), and reduces the amount of data managed by Fill, Load, or Read. See www.betav.com for more information on my new book, which covers this in detail.
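As one way to apply that advice to the AlarmLog scenario above: put the row limit into the query itself, so only one page of rows ever leaves the server. This is a sketch under assumptions; the `AlarmID` key column and the `lastSeenId` paging variable are hypothetical names, not from the original post (SQL Server 2000 has no OFFSET, so TOP with a key predicate is the usual approach):

```csharp
using System.Data;
using System.Data.SqlClient;

// Keyset-paging sketch: fetch only the next 50 rows instead of SELECT *.
// Assumes AlarmLog has an indexed AlarmID key; adjust names to your schema.
string szSQL =
    "SELECT TOP 50 * FROM AlarmLog " +
    "WHERE AlarmID > @LastID ORDER BY AlarmID";

using (SqlConnection conn = new SqlConnection("HIDDEN"))
using (SqlDataAdapter adapter = new SqlDataAdapter(szSQL, conn))
{
    adapter.SelectCommand.Parameters.AddWithValue("@LastID", lastSeenId);
    DataSet ds = new DataSet();
    adapter.Fill(ds, "DEFAULT"); // only 50 rows cross the wire
}
```

Because the server never materializes more than 50 rows for this query, the Fill (and the Close hidden inside it) has nothing extra to drain, and the index on the key column keeps the query itself cheap.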