DLINQ DataContext Memory Leak

  • Question

  • I am working with my fellow teammates on a distributed application, and we chose DLINQ as our Data Access Layer. DLINQ is very, very cool. But I have noticed slight memory increases when using multiple DataContexts.

    Let me give you an example :

    Assume we have a service called FooService, which has a method called GetBunny();

    Each time FooService.GetBunny() is called, a new DataContext is created, and this DataContext is passed down through the layers to the DAL. For example:

    public class Service : ServiceBase
    {
        public Bunny GetBunny()
        {
            IBunnyManager bunnyManager = CreateBunnyManager(this.DataContext);

            return bunnyManager.GetBunny();
        }
    }

    As you can see above, we have decided to put the infrastructure code that creates the DataContexts in the abstract base class.
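    Roughly, the base class looks like this (a simplified sketch, not our exact code; MyDataContext stands in for our generated context):

        public abstract class ServiceBase : IDisposable
        {
            private MyDataContext dataContext;

            // One DataContext per service instance, created lazily.
            protected MyDataContext DataContext
            {
                get
                {
                    if (this.dataContext == null)
                        this.dataContext = new MyDataContext();
                    return this.dataContext;
                }
            }

            // Disposed together with the WCF service instance.
            public void Dispose()
            {
                if (this.dataContext != null)
                    this.dataContext.Dispose();
            }
        }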

    This creates slight memory leaks; I mean, using a new DataContext for each request does. Yes, we do call Dispose on the DataContext when we dispose the WCF service (yes, it is a WCF service).
    So, my question is: what are the best practices for using multiple DataContexts in a service-oriented application?

    Thanks :)
    Wednesday, April 11, 2007 7:14 AM

All replies

  • When I hear about memory leaks in .NET, the first thing I think about is references that keep instances from being garbage collected. I haven't noticed the memory leaks you're mentioning, but then again, I haven't deployed LINQ to SQL in a large-scale project (yet).

     

    Unless you're loading entities from the DataContext with deferred loading disabled, each entity will keep a live reference to the DataContext. We need more code snippets to determine whether or not you've got dangling references.
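    For illustration, here is a minimal sketch of loading entities without keeping them attached (assuming the ObjectTrackingEnabled switch on DataContext, which may not exist in the CTP discussed here; MyDataContext and Bunnies are placeholder names):

        using (MyDataContext context = new MyDataContext())
        {
            // Read-only access: with tracking off, deferred loading is
            // disabled too, so loaded entities hold no live reference
            // back to the context.
            context.ObjectTrackingEnabled = false;

            List<Bunny> bunnies = (from b in context.Bunnies
                                   where b.Weight > 2
                                   select b).ToList();
        } // safe to dispose: the entities outlive the context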

     

    Another answer to your question about best practices is that it seems like LINQ to SQL, in its current version (as available in the March 2007 CTP), is not intended as the middle tier you're trying to build. In an SOA architecture, you definitely don't want your entities to remain connected to the DataContext, as it imposes the risk of memory leaks, as seen from your results.

     

    Several posts in this forum are discussing the "disconnect" problem and we're hoping that the LINQ team will respond to the questions.

    Wednesday, April 11, 2007 8:26 AM
  • Profiling the application, I have noticed that LINQ caches the metadata that describes the database. This metadata is never released. What does LINQ do with that metadata, and why doesn't it release it? I guess the LINQ team made the assumption that the database structure will not change, and that if it changes, the programmers will regenerate their business entities. But then again, I have the strange feeling that the DataContext is not being completely collected by the GC.
    I will investigate the issue in more detail today :)

    Thanks :)
     
    Wednesday, April 11, 2007 9:02 AM
  • From what I can gather from previous comments by the LINQ team, the DataContext does not reuse the model metadata across DataContext instances (though the final version is supposed to exhibit this behaviour, for obvious reasons). It makes sense to cache the model metadata, as it's not likely that the metadata will change in the lifetime of the DataContext (or even the lifetime of the application domain; correct me if I'm wrong).

     

    However, perhaps the LINQ to SQL framework could provide a DataContext.FlushMetaData() or DataContext.Flush() method that allows programmatic (thread-safe) flushing of the internal caches.

    Wednesday, April 11, 2007 9:25 AM
  • Ok, I am using the March CTP and the following test code:

        static void Main(string[] args)
        {
            // First use forces LINQ to load and cache its mapping metadata.
            using (DatabaseFactory.CreateDatabase())
            {
                SciTech.NetMemProfiler.MemAssertion.BeginAssertions();
                SciTech.NetMemProfiler.MemProfiler.FullSnapShot("Begin");

                using (MyDataContext context = DatabaseFactory.CreateDatabase())
                {
                    IDataProvider provider = ObjectBuilder.BuildUp<IDataProvider>(context);

                    int someData = provider.GetSomeData();
                }

                GC.Collect();
                Thread.Sleep(1000);
                GC.Collect();

                SciTech.NetMemProfiler.MemProfiler.FullSnapShot("End");
                SciTech.NetMemProfiler.MemAssertion.EndAssertions();
            }
        }

    Ok, I am using the outer "using (DatabaseFactory.CreateDatabase())" in order to force LINQ to cache some of its metadata, because that is what it does ;) the first time a DataContext is used to read the database.
    So, I am using .NET Memory Profiler 3.0, and it seems like System.Data.Linq.SqlClient.ObjectReaderBuilder+ObjectReader<TObject> (as the type is named in Reflector) is not being freed and is thus consuming 50 000 bytes of memory :/

    IDataProvider runs the following query:

        DateTime today = DateTime.Now;

        IQueryable<int> id =
            from someTable in dataContext.SomeTable
            where someTable.StartTime > today
            select someTable.ID;

    Ok, I am getting really confused now... please, someone from the LINQ team, reply :)
    Thanks :)

    Wednesday, April 11, 2007 4:24 PM
  • The DataContext does not maintain any static/global data, so eventually the GC will free up anything it does create. The mapping information can be cached by storing it in a static/global variable. SqlMetal will generate custom DataContexts that do cache your mapping information, so the same mapping is used for every instance. It's not required that you use mapping this way: the DataContext can be constructed with a reference to any specific MappingSource you choose, so you can throw these away and recreate them at any time.
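    For example, a minimal sketch of the throwaway variant (connectionString and the Bunny entity are placeholders):

        using System.Data.Linq;
        using System.Data.Linq.Mapping;

        // This context owns its own MappingSource, so the mapping data
        // lives and dies with the instance instead of a static cache.
        MappingSource mapping = new AttributeMappingSource();

        using (DataContext context = new DataContext(connectionString, mapping))
        {
            Table<Bunny> bunnies = context.GetTable<Bunny>();
            // ... query as usual; nothing stays cached in a static afterwards
        }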

    Thursday, April 12, 2007 4:19 AM
  • Yes, as far as I can see the DataContext does not contain any static data, but then again... memory is not freed. Yes, I know what SqlMetal does, but... I have hard, cold evidence of a memory leak ;)
    Thursday, April 12, 2007 8:02 AM
  • It could just be the runtime keeping hold of big blocks of memory it allocated.   If you run your test 10 times, does the leak get 10 times bigger?  50 times?
    Thursday, April 12, 2007 5:11 PM
  • If I run this LINQ query 100 times:

        IQueryable<SomeTable> result =
            from SomeTable t in context.SomeTable
            where (some condition)
            select t;

    the memory does not go up 100 times :)

    But if I run this query 100 times:

        IQueryable<SomeOtherTable> result =
            from SomeOtherTable t in context.SomeOtherTable
            where (some condition)
            select t;

    the memory goes up by exactly as much as it went up when I ran the first query, so:

    1st query (1 or 100 executions): around 90 000 bytes
    2nd query (1 or 100 executions): around 90 000 bytes

    Yes, I can see that LINQ will eventually free up the data some time later (and in memory there are a lot of SqlColumn<T> instances where T is SomeOtherTable or SomeTable), but 63 to 71 bytes always remain from each DataContext creation, that being in the running environment. Unfortunately, when running the unit tests there is a memory leak; when running the server the memory gets freed up, but those 63 to 71 bytes per DataContext creation remain. So our concern is that if we have, let's say, 5 000 requests to the application server, that means 5 000 x 63 bytes (because each request creates its own DataContext). Ok, that's NOT that much, but if the application server is heavily loaded, LINQ might allocate A LOT of memory. :)
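    To check whether the growth actually scales with the number of executions, a rough harness like this can help (GC.GetTotalMemory is only an approximation; MyDataContext, SomeTable and the filter are placeholders):

        long before = GC.GetTotalMemory(true);   // force a full collect first

        for (int i = 0; i < 100; i++)
        {
            using (MyDataContext context = new MyDataContext())
            {
                // Same query shape every iteration, so the cached reader
                // should be reused rather than re-allocated.
                int count = (from t in context.SomeTable
                             where t.ID > 0
                             select t).Count();
            }
        }

        long after = GC.GetTotalMemory(true);
        Console.WriteLine("Delta: {0} bytes", after - before);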

    Btw looking forward to PLINQ.
     
    Friday, April 13, 2007 7:48 AM
  • Hi,

    I use LINQ in a Windows service to do a repetitive job. I am experiencing a big memory leak problem: my service grows from 19 MB of memory used up to 900 MB over just a weekend of continuous running.

     

    After testing different solutions, I found that the memory leak occurs in LINQ when using the "Refresh" method. I used it because my service always needs fresh data, up to date with the database, and we know that LINQ returns only in-memory data if it has already read it.

     

    So it seems that the Refresh method has a problem.
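    One possible workaround sketch, assuming a generated MyDataContext and a Jobs table (placeholder names): create a short-lived context per polling cycle, so every read is fresh without calling Refresh on a long-lived context.

        void DoWork()
        {
            // A fresh context per cycle always reads current database state,
            // with no need for Refresh(RefreshMode.OverwriteCurrentValues, ...).
            using (MyDataContext context = new MyDataContext())
            {
                var jobs = (from j in context.Jobs
                            where !j.Done
                            select j).ToList();

                ProcessJobs(jobs);   // placeholder for the actual work
            }   // everything read this cycle becomes collectible here
        }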

     

    Sylvain

    Monday, June 23, 2008 4:17 PM
  • I think LINQ to SQL caches metadata because the creation of each new DataContext is very heavy: each time, it would have to load all the metadata. I think that is why it caches the metadata, to optimize that process and speed up new DataContext creation. As for leaks, I have never heard of .NET memory leaks.
    Tuesday, June 24, 2008 9:32 AM
  • Same here... 2 applications, and they both have memory leaks. The only difference between them and other applications is the fact that they both use LINQ. It's not being used across threads... in fact, any threading procedure always uses its own DataContext... but the leak gets huge after a few days. Still haven't been able to quite trace the problem... but here's a question:

     

    Suppose you make a class which hosts the DataContext... then you pass this class as a parameter to a few functions which each do their own thing and submit their own changes... and of course, after you pass it through all the required functions, you dispose of the DataContext.

    Is this a good approach? The class never exposes or stores LINQ IQueryable objects... but the leak is still there.
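    Something like this minimal sketch of the pattern described (all names here are placeholders for illustration):

        public class UnitOfWork : IDisposable
        {
            private readonly MyDataContext context = new MyDataContext();

            public MyDataContext Context
            {
                get { return this.context; }
            }

            public void Dispose()
            {
                this.context.Dispose();
            }
        }

        // Usage: pass the wrapper through the functions, then dispose it.
        using (UnitOfWork work = new UnitOfWork())
        {
            UpdateCustomers(work);   // each function queries and calls
            UpdateOrders(work);      // work.Context.SubmitChanges() itself
        }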

    I am hoping it's related to some other bug that was introduced in both applications, but still...

    Wednesday, November 26, 2008 10:10 AM
  • DataContexts that are generated via the designer (and the command-line tool) keep a copy of the mapping source in a global variable, so this data will stay loaded for the lifetime of the app domain. Using additional instances of DataContext won't increase this memory. The mapping source may keep a different variation of the loaded mapping per sub-type of DataContext that uses the mapping, though. So if you have multiple types of DataContexts and share the same global mapping, you may get additional memory usage. Likewise, the Refresh method uses an additional DataContext instance under the covers (just a plain DataContext) with the same mapping, so using Refresh could increase the memory usage of the mapping.

     

    However, unless you are repeatedly introducing new types of DataContexts or new types of mappings you won't see a constant increase in memory usage caused by this.

     

    If you would like to control the lifetime of the loaded mapping information, you can forgo using the global variable generated by the designer and instead keep your mapping information in your own allocated MappingSource object and pass it to the DataContext's constructor.
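    A minimal sketch of that approach (the holder class, connectionString, and the MyDataContext constructor overload are illustrative assumptions):

        using System.Data.Linq.Mapping;

        public static class Mapping
        {
            // You own this reference: keep it for as long as you want the
            // mapping cached, or replace it to let the old cache be collected.
            public static MappingSource Source = new AttributeMappingSource();
        }

        // Pass the shared mapping to every context you create:
        using (MyDataContext context =
                   new MyDataContext(connectionString, Mapping.Source))
        {
            // ... queries reuse the already-loaded mapping information
        }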


    Wednesday, November 26, 2008 5:02 PM