Friday, June 24, 2005 5:27 PM
I have a multi-process application that passes significant amounts of data between processes using serialization via a BinaryFormatter. Examining memory usage with profiling tools such as the CLR Profiler, it appears that the ObjectManager.RegisterFixup() method is calling ObjectHolder.AddDependency(), which is instantiating a large number of arrays (char, byte, etc.) that are never garbage collected; this eventually brings our application to a standstill as memory use grows.
As the application runs, objects passed in earlier batches become irrelevant and all references to them in our application are removed; my intent is that those objects be garbage collected to make room for the newly deserialized ones. Newly deserialized objects hold no references to previously deserialized objects (each batch of deserializations contains no references to other batches), so the ObjectManager has no need to keep information about previously deserialized objects, which means its list of ObjectHolders could safely be cleared before the next batch begins deserializing. However, I haven't been able to find a way to clear obsolete ObjectHolders out of the ObjectManager so that the memory can be garbage collected; does one exist? If not, it sounds like the Serialization framework isn't up to the task I'm asking of it, and I'll need to write my own serialization and deserialization by hand without using the Serialization library.
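The scenario described above can be sketched as follows. This is a hypothetical illustration, not code from the thread: MoreBatches, NextBatchStream, and Process are placeholders for the application's own batch loop, and the retention behavior noted in the comments is the behavior reported in this thread, not documented BinaryFormatter semantics.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class BatchReader
{
    // Sketch of the reported problem: one formatter reused for every batch.
    void ReadAllBatches()
    {
        BinaryFormatter formatter = new BinaryFormatter(); // created once

        while (MoreBatches())
        {
            using (Stream stream = NextBatchStream())
            {
                object batch = formatter.Deserialize(stream);
                Process(batch);
                // 'batch' becomes unreachable from application code after
                // this point, but if the formatter's internal ObjectManager
                // still holds fixup records referencing it, the GC cannot
                // reclaim it, and memory grows with each batch.
            }
        }
    }

    // Hypothetical placeholders for the application's own I/O and processing.
    bool MoreBatches() { return false; }
    Stream NextBatchStream() { return Stream.Null; }
    void Process(object batch) { }
}
```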
Thanks for any help you can give...
Thursday, July 14, 2005 12:04 AM (Moderator)
If you could send a small repro that would be great.
Thursday, July 14, 2005 12:23 PM
A reproduction isn't going to happen; my only access to a development environment is on a classified network.
However, the question is a theoretical one and therefore requires no reproduction: ObjectManager instantiates objects during deserialization and maintains references to them so that later deserialized objects that refer to an earlier object can be handed a reference to it. By holding references to all of these objects, the ObjectManager keeps each one reachable, which means that for as long as it holds a reference, the object won't be garbage collected. (If my understanding thus far is flawed, please point out where I'm looking at this wrong.)
In looking through some ObjectManager code I found on the Web (which may not match the current code in the .NET Framework, but it's the best guess I've been able to find), there doesn't appear to be any provision for clearing object references out of the ObjectManager's fixup list. Unless there's something I've overlooked or don't know about (again, please tell me if that's the case), even when my application is through with an object and holds no references to it, the ObjectManager still has a reference and will therefore prevent the garbage collector from reclaiming it. When multiple gigabytes of objects are being deserialized, this would mean the application's memory consumption continually increases until the system runs out of memory, which is the behavior I'm seeing (though the behavior could be from something entirely unrelated; investigating the cause of the growth is what got me started on this in the first place).
Am I missing something?
Friday, July 15, 2005 2:13 PM
Delete and reinstantiate the BinaryFormatter? Having implemented a formatter and an ObjectManager myself, it seems the only way to release the references (and reset the object ID generator as well) is to discard the formatter and create a new one. It may be that the model was designed to be capable of deserializing multiple interdependent graphs.
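The workaround suggested above can be sketched as follows; again, MoreBatches, NextBatchStream, and Process are hypothetical placeholders, and this is only an illustration of the discard-and-recreate pattern, under the assumption (from this thread) that no ObjectManager state survives the formatter instance.

```csharp
using System;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

class BatchReader
{
    // Sketch of the workaround: a fresh formatter per batch, so no
    // ObjectManager state (and no object references) carries over
    // from one batch to the next.
    void ReadAllBatches()
    {
        while (MoreBatches())
        {
            BinaryFormatter formatter = new BinaryFormatter(); // fresh each batch

            using (Stream stream = NextBatchStream())
            {
                Process(formatter.Deserialize(stream));
            }
            // 'formatter' becomes unreachable at the end of this iteration,
            // taking its internal ObjectManager and fixup records with it,
            // so earlier batches can be garbage collected.
        }
    }

    // Hypothetical placeholders for the application's own I/O and processing.
    bool MoreBatches() { return false; }
    Stream NextBatchStream() { return Stream.Null; }
    void Process(object batch) { }
}
```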
Friday, July 15, 2005 8:27 PM (Moderator)
That is certainly a workaround. I am looking into it.
Saturday, July 16, 2005 8:38 AM (Moderator)
Creating a new instance of the formatter is the way to do it.