DataSet Update

  • Question

  • I have an application which uses DataSet.WriteXml to export data and DataSet.ReadXml to import data. During the import process I need to change certain primary keys as part of the application logic.

    With over 500K records it writes to XML and reads from XML successfully. But once I change the primary key, it stalls for a while and then throws an OutOfMemoryException. I believe the reason is that it has to do a lot of cascade updates. I tried BeginEdit and EndEdit, but it still fails when I call EndEdit.

    As I understand it, a DataSet also keeps previous versions of its data in memory. Is there any way to optimize DataSet update operations so that they consume minimal memory?
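
    Roughly, the pattern in question looks like this. This is only a minimal sketch; the table and column names ("Orders", "OrderId") are hypothetical stand-ins, not the actual application code:

        using System.Data;

        var ds = new DataSet("Export");
        // ... tables, relations and rows are populated elsewhere ...

        // Export: write the schema too, so relations and constraints
        // survive the round trip.
        ds.WriteXml("export.xml", XmlWriteMode.WriteSchema);

        // Import: read the file back into a fresh DataSet.
        var imported = new DataSet();
        imported.ReadXml("export.xml", XmlReadMode.ReadSchema);

        // Changing a primary key value afterwards can fan out into
        // cascade updates in every child table that references it.
        imported.Tables["Orders"].Rows[0]["OrderId"] = 999999;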

    CJ

    Monday, February 25, 2013 12:39 AM

All replies

  • CJ,

    If you use DataSet.WriteXml and DataSet.ReadXml, then the cascade setting only determines how the string that WriteXml produces is represented.

    The dataset in memory is a collection of items (objects); besides that, the dataset contains many references (the representation often shows what look like circular references, which don't really exist but appear that way because the objects reference each other).

    Be aware that on disk it is a string, so as soon as you start using it as a string it will take a lot of memory; as an object graph it is impossible to say in advance how much memory it will need. Also be aware that in a 32-bit application the maximum memory addressable by your program is 2 GB.
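
    To make the cascade point concrete, here is a small sketch (not from the thread). The DataRelation below creates a ForeignKeyConstraint whose UpdateRule defaults to Rule.Cascade, so a single parent key change rewrites every child row that references it, and the DataSet versions each changed row in memory:

        using System;
        using System.Data;

        var parent = new DataTable("Parent");
        parent.Columns.Add("Id", typeof(int));
        parent.PrimaryKey = new[] { parent.Columns["Id"] };

        var child = new DataTable("Child");
        child.Columns.Add("ParentId", typeof(int));

        var ds = new DataSet();
        ds.Tables.Add(parent);
        ds.Tables.Add(child);

        // Adding the relation also adds a ForeignKeyConstraint with
        // UpdateRule = Rule.Cascade by default.
        ds.Relations.Add("FK_Parent_Child", parent.Columns["Id"], child.Columns["ParentId"]);

        parent.Rows.Add(1);
        for (int i = 0; i < 3; i++) child.Rows.Add(1);

        parent.Rows[0]["Id"] = 2;                      // cascades into all three child rows
        Console.WriteLine(child.Rows[0]["ParentId"]);  // prints 2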


    Success
    Cor

    Tuesday, February 26, 2013 8:34 AM
  • Thanks Cor, 

    I am aware of that nightmare. This is what previous developers have left for me, and I am trying to make as many enhancements as possible without changing the whole implementation. Do you know of any way to reduce memory consumption when updating a primary key that has many references?


    CJ

    Wednesday, February 27, 2013 2:49 AM
  • How (and why?) are you changing the PK?

    ~~Bonnie Berent DeWitt [C# MVP]

    geek-goddess-bonnie.blogspot.com

    Wednesday, February 27, 2013 3:01 PM
  • Hey,

    I am not sure whether you still have the problem or not, because the thread is not marked as resolved.

    Have you tried setting EnforceConstraints to false before you update the PKs, as in the sketch below?

    Maybe you can handle the problem that way?
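
    In code, the suggestion would look roughly like this; a sketch only, where the "Orders" table, the "OrderId" column, and the remap delegate are hypothetical stand-ins for the real application's logic. Note that while EnforceConstraints is false the cascade rules do not fire either, so any child tables would have to be remapped explicitly before re-enabling:

        using System;
        using System.Data;

        static void RemapPrimaryKeys(DataSet ds, Func<int, int> remap)
        {
            // Suspend constraint checking (and cascading) while rewriting keys.
            ds.EnforceConstraints = false;
            try
            {
                foreach (DataRow row in ds.Tables["Orders"].Rows)
                {
                    row["OrderId"] = remap((int)row["OrderId"]); // hypothetical remapping
                }

                // If the change history is not needed after an import, this drops
                // the "original" row versions the DataSet keeps for every
                // modified row, freeing memory.
                ds.AcceptChanges();
            }
            finally
            {
                ds.EnforceConstraints = true; // re-validates all constraints in one pass
            }
        }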

    Hope this will help!

    Tuesday, May 28, 2013 10:35 AM
  • "Do you know any way to reduce memory consumption when updating a primary key that has many references?"

    Not if you read the dataset from a file on disk; there is no mechanism to read or write a partial XML dataset.

    However, I strongly advise you to move to a real database like SQL Server Compact or SQL Server Express, which costs the same as a string on disk (nothing).

    The downside is that your dataset does not seem well organized (having to reorganize primary keys is not normal), so using a real SQL database server, with tools to organize the data, will probably work out better.

    Success
    Cor

    Sunday, June 2, 2013 11:22 PM
  • You *do* realize that this thread is more than 3 months old! The OP has probably solved the problem by now. ;0)

    ~~Bonnie Berent DeWitt [C# MVP]

    geek-goddess-bonnie.blogspot.com

    Sunday, June 2, 2013 11:41 PM
    Now I do, but I didn't then.

    Thanks.

    :-)


    Success
    Cor

    Tuesday, June 4, 2013 4:07 PM