dataset slow

  • Question

  • I have an application that is concatenating to a string (of course I'm using the StringBuilder class and appending), and it is REALLY slow.

    The query to the Oracle database returns a very large result, so I go to the database to get some of the records, which I append and write to the file. Then I go back to the database, get some more of the records, and append and write those to the file, too.

    So, each time through the loop I have maybe 50,000 rows in my dataset. I append the rows from the dataset together and write them to a file. The query is quick, and so is writing to the hard drive. The machine slows down during the concatenation.
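
    Roughly, each pass through the loop does something like this (a simplified sketch - the dataset, column, and file names here are made up):

    StringBuilder sb = new StringBuilder();
    foreach (DataRow row in dataSet.Tables[0].Rows)
    {
        sb.Append(row[0].ToString());      // pile the whole chunk up in memory
        sb.Append(Environment.NewLine);
    }
    File.AppendAllText(@"C:\Test.txt", sb.ToString()); // then write it out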

    I checked for thrashing and I've upped my swap files, but the server really doesn't have a hardware problem.

    What's up with the dataset? Or is there a limit on the size of a string in the StringBuilder class? Do I need to switch to a byte array just to write this to a file? Should I try an array of strings, although I can't see how that would help? This has to go to a flat file to work with other server systems.

    Any suggestions would help. 

    Thanks! 

    Friday, December 22, 2006 1:32 AM

All replies

  • Well, you are dealing with a large number of records, so yes, this is probably to be expected. You should only grab the data you require at any one time rather than large chunks of data. Can you also show us some code?
    Friday, December 22, 2006 1:41 AM
  • What are you trying to do by concatenating the string? Are you trying to put everything into one large string to email/print/view?

    I'm not sure what the specific limitations of StringBuilder are, but I've never experienced the issue you are referring to. I have had issues with pulling large blocks of data from Oracle, however.

     

     cttnpckn wrote:

    I have an application that is concatenating to a string (of course I'm using the StringBuilder class and appending), and it is REALLY slow.

    Friday, December 22, 2006 2:51 AM
  • Sounds like you can gain some performance by using a StreamWriter and a DataReader directly instead of a DataSet. That way you don't have to get the table in chunks, because the reader streams the rows to you one at a time. Something like this:

    DbCommand command = connection.CreateCommand(); // create your Oracle command (connection is an open DbConnection)
    command.CommandText = "SELECT ..."; // the query for the complete table
    using (DbDataReader reader = command.ExecuteReader())
    using (StreamWriter sw = new StreamWriter(@"C:\Test.txt"))
    {
        while (reader.Read())
        {
            sw.WriteLine(reader.GetString(0)); // the 0 stands for the first column
        }
    }
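
    The nice part of this approach is that no big string ever builds up in memory - each row goes straight from the reader to the file. Setting up the command might look something like this (just a sketch, assuming the System.Data.OracleClient provider and made-up names - adjust for your provider):

    using (OracleConnection connection = new OracleConnection(connectionString))
    {
        connection.Open();
        DbCommand command = connection.CreateCommand();
        command.CommandText = "SELECT my_column FROM my_table"; // hypothetical query
        // ... then read and write as above
    }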

     

    Friday, December 22, 2006 9:35 AM
  • Yup - the DataReader helped. I checked and no, I wasn't using it - but I should have been.

    Nope, I had been clearing out the string each time through.

    I also set up an interval, so I only get so many results at a time - I can set how many I want now. That helps a lot.

    It just must be that when that string gets really big, everything slows down.

    So, doing lots of smaller iterations solved it.
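
    In case it helps anyone else, here is roughly what I ended up with (a simplified sketch - the table, column, and connection-string values are made up, and it assumes the System.Data.OracleClient provider and Oracle's ROWNUM for the intervals):

    using System.Data.OracleClient;
    using System.IO;

    string connectionString = "..."; // your Oracle connection string
    const int BatchSize = 50000;     // rows per interval - tunable
    int first = 0;
    bool more = true;

    using (OracleConnection connection = new OracleConnection(connectionString))
    using (StreamWriter sw = new StreamWriter(@"C:\Test.txt"))
    {
        connection.Open();
        while (more)
        {
            OracleCommand command = connection.CreateCommand();
            command.CommandText =
                "SELECT my_column FROM " +
                "(SELECT my_column, ROWNUM rn FROM my_table) " +
                "WHERE rn > " + first + " AND rn <= " + (first + BatchSize);

            more = false;
            using (OracleDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    sw.WriteLine(reader.GetString(0)); // write each row as it arrives
                    more = true;                       // saw a row, so try another batch
                }
            }
            first += BatchSize;
        }
    }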

    Tuesday, December 26, 2006 7:27 PM