DbContext Query Performance BAD!

  • General discussion

  • I recently moved my entity model from an ObjectContext using 4.1 to a DbContext using 5.0. I'm starting to regret doing that because I'm noticing some very poor performance on queries using the DbContext vs the ObjectContext. Here's the test scenario:

    Both contexts use the same database with about 600 tables. LazyLoading and ProxyCreation is turned off for both (not shown in code example). Both have pre-generated views.

    The test first makes 1 call to load up the metadata workspace. Then in a for loop that gets executed 100 times, I new up a context and make one call that takes the first 10. (I'm creating the context inside the for loop because this simulates being used in a WCF service, which would create the context every time)

    for (int i = 0; i < 100; i++)
    {
        using (MyEntities db = new MyEntities())
        {
            var a = db.MyObject.Take(10).ToList();
        }
    }

    When I run this with the ObjectContext it takes about 4.5 seconds. When I run it using the DbContext it takes about 17 seconds. I profiled this using Red Gate's performance profiler. For the DbContext, the major culprit seems to be a method called UpdateEntitySetMappings. This is called on every query and appears to retrieve the MetadataWorkspace and cycle through every item in the OSpace. AsNoTracking did not help.

    • Changed type festivis Monday, March 25, 2013 2:22 AM i want to delete it
    Saturday, March 23, 2013 2:40 PM

All replies

  • On 3/23/2013 10:40 AM, festivis wrote:

    I recently moved my entity model from an ObjectContext using 4.1 to a DbContext using 5.0. I'm starting to regret doing that because I'm noticing some very poor performance on queries using the DbContext vs the ObjectContext. Here's the test scenario:

    Both contexts use the same database with about 600 tables. LazyLoading and ProxyCreation is turned off for both (not shown in code example). Both have pre-generated views.
    The test first makes 1 call to load up the metadata workspace. Then in a for loop that gets executed 100 times, I new up a context and make one call that takes the first 10. (I'm creating the context inside the for loop because this simulates being used in a WCF service, which would create the context every time)

    for (int i = 0; i < 100; i++)
    {
        using (MyEntities db = new MyEntities())
        {
            var a = db.MyObject.Take(10).ToList();
        }
    }

    When I run this with the ObjectContext it takes about 4.5 seconds. When I run it using the DbContext it takes about 17 seconds. I profiled this using Red Gate's performance profiler. For the DbContext, the major culprit seems to be a method called UpdateEntitySetMappings. This is called on every query and appears to retrieve the MetadataWorkspace and cycle through every item in the OSpace. AsNoTracking did not help.

    The database connection should only be opened one time and closed one time, with explicit open and close statements outside of the loop. You are wasting time by using the "using" statement to open and close the database connection in the loop 100 times.

    http://msdn.microsoft.com/en-us/library/bb896325.aspx

    <copied>

    The Entity Framework exposes the EntityConnection through the Connection property. This enables you to manage the connection and transactions or to supply your own EntityConnection. This is useful when you want to hold open a connection within a short-lived object context to improve performance or to explicitly control transactions. The same provider connection used by the Entity Framework can be shared with other parts of an application. The following example shows how to explicitly open a connection:

    <end>

    Below is code showing how to open and close the connection explicitly.

    open connection

    loop

    close connection
     And every time you call that LINQ query, it has to be built from scratch. Maybe you should consider using a compiled LINQ query.

    <http://www.codeproject.com/Articles/38174/How-to-improve-your-LINQ-query-performance-by-5-X>
     public class Article : IArticle
        {
            private const string pcdb = "name=PublishingCompanyEntities";

            public DTOArticle GetArticleTop1()
            {
                var dto = new DTOArticle();

                using (var conn = new EntityConnection(pcdb))
                using (var db = new PublishingCompanyEntities(conn))
                {
                    try
                    {
                        // Open the connection explicitly so it stays open for
                        // the life of the context instead of being opened and
                        // closed around every command.
                        conn.Open();

                        var article = (from a in db.Articles select a).Take(1).First();

                        dto.ArticleID = article.ArticleID;
                        dto.AuthorID = article.AuthorID;
                        dto.Body = article.Body;
                        dto.Title = article.Title;
                    }
                    finally
                    {
                        conn.Close();
                    }
                }

                return dto;
            }
        }

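    The compiled-query suggestion above could be sketched as follows. This is only a sketch under assumptions: it reuses the `PublishingCompanyEntities` and `Article` names from the example above, and it assumes the context derives from ObjectContext, since `CompiledQuery.Compile` requires an ObjectContext-based context (it does not accept a DbContext in EF 5).

    ```csharp
    // Sketch only: CompiledQuery.Compile caches the translated query plan, so
    // the LINQ-to-Entities translation is not rebuilt on every call.
    // Assumes PublishingCompanyEntities derives from ObjectContext.
    static readonly Func<PublishingCompanyEntities, IQueryable<Article>> Top10Articles =
        CompiledQuery.Compile((PublishingCompanyEntities db) =>
            db.Articles.Take(10));

    // Usage: the first call compiles and caches the plan; later calls reuse it,
    // even across new context instances.
    using (var db = new PublishingCompanyEntities())
    {
        var articles = Top10Articles(db).ToList();
    }
    ```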
    Saturday, March 23, 2013 8:57 PM
  • Apparently you completely missed this statement:

     (I'm creating the context inside the for loop because this simulates being used in a WCF service, which would create the context every time)

    Sunday, March 24, 2013 11:57 AM
  • On 3/24/2013 7:57 AM, festivis wrote:

    Apparently you completely missed this statement:

      (I'm creating the context inside the for loop because this simulates being used in a WCF service, which would create the context every time)

    Based on what? The connection to the service is closed upon return to the client. I don't see how a loop that runs 100 times simulates opening and closing a database connection in a WCF service, when the database connection is closed and the connection to the service is closed upon return to the client. Sorry, I don't see the point of the loop.
     Let's say you were using ADO.NET, in-line T-SQL, or a sproc behind the WCF service: opening connections in a loop would simulate what, exactly? You say you are using compiled views, which would negate some of the slowness of working with EF, as EF doesn't have to compile the model every time a connection is made against the database.

    Sunday, March 24, 2013 12:53 PM
  • Dude,

    #1 The connection has absolutely nothing to do with the problem at hand, the fact that UpdateEntityMappings gets called on every query and is expensive.

    #2 The loop somewhat simulates making 100 WCF calls. A WCF call will create a new context every time. What part of that don't you get? You would not create a static EntityConnection in a WCF call.

    Again, please read the problem again because it has NOTHING to do with connections.

    Sunday, March 24, 2013 3:01 PM
  • On 3/24/2013 11:01 AM, festivis wrote:

    Dude,

    #1 The connection has absolutely nothing to do with the problem at hand, the fact that UpdateEntityMappings gets called on every query and is expensive.

    No, you are wrong here. You say that you are using compiled views, so the mapping of the model is not done every time; the model has been mapped and compiled already, ready for usage. Where you are losing time is with the LINQ query itself, because the LINQ query must be constructed every time. If the WCF service were self-hosted, the compiled query would stay in memory. If the compiled query were used in a WCF Web service, it would stay in static memory on the Web server until the ASP.NET worker process was recycled, by doing an iisreset, for example.


    #2 The loop somewhat simulates making 100 WCF calls. A WCF call will create a new context every time. What part of that don't you get? You would not create a static EntityConnection in a WCF call.

    What I don't understand is that what you are doing makes no sense. I have written more than my fair share of WCF services using EF or NHibernate on the backend, sitting behind the WCF service, and I have maintained such services with ORM(s) behind them in enterprise-level solutions. I have never seen or heard of anyone making some kind of "loop" test and calling it "simulating 100 WCF service calls".


    Again, please read the problem again because it has NOTHING to do with connections.

    The point was that everything in SOA is disconnected with a short life span, and what you are doing makes no sense. If anything, you would have made a little WCF client/service program with EF behind the service, using a test harness, and called that service 100 times from the client, which would have more than halfway simulated using WCF with EF behind it in an integration test. Now that, I would understand.

    I am not trying to get out of line with you, but this kind of speed-testing scenario for WCF with an ORM behind it makes no sense. You are not accomplishing anything with what you are doing, is my point, IMHO.

    Sunday, March 24, 2013 7:26 PM
  • Please don't respond any further. You are wrong. You are not helping. Having an EntityConnection outside the loop makes no difference at all (with the problem I'm seeing); I tried it. I'm making 100 calls because 1 isn't a very good sample. Making 100 calls magnifies the difference between the two. Making 1 call and saying the time difference is 0.2 secs doesn't seem like a big deal; over the course of a load of 100 calls, the difference is much more noticeable.

    Telling me to create the EntityConnection outside the loop does NOTHING, because I can't possibly use it that way in a WCF scenario. I'm very well aware that the query is cached per app domain, and that has no bearing here because the test app is essentially in its own app domain, just as a WCF service is. The whole point of the test is to point out that the same query is slower with a DbContext vs an ObjectContext. You are getting out of line with me, and now I really need to start a new thread because I'm sure people are not going to want to read this.
    Sunday, March 24, 2013 9:26 PM
  • I'm noticing that my DbContext queries are slower than with an ObjectContext, and it has to do with the creation\initialization of a DbSet vs an ObjectSet.

    Here's the test scenario :

    Both contexts use the same database with about 600 tables. LazyLoading and ProxyCreation is turned off for both. Both have pre-generated views.

    When I make a call with the ObjectContext, it takes on average 42ms to create the ObjectSet. When I make a call with the DbContext, it takes about 140ms to create the internal DbSet.

    The next call on the same instance of a DbContext or ObjectContext is quick and about the same, obviously because the ObjectSet and DbSet have already been created and initialized. However, we are using EF in a WCF scenario and the ObjectContext\DbContext is created on every call, so this ObjectSet\DbSet creation\initialization will happen on every call. A tenth of a second may not seem like a big deal, but over hundreds and thousands of calls, all of which are now a tenth of a second slower, this adds up.

    Both ObjectSet and DbSet do some entity set mapping lookups from the MetadataWorkspace. What I've noticed is that the DbSet does it for ALL the types in the workspace, while the ObjectSet does not. I'm guessing (I haven't tried it) that with a model with fewer tables the performance difference is smaller.
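    The first-call cost described above can be made visible with a small timing harness. This is only a sketch; `MyEntities` and `MyObject` are the names from the earlier posts, and the measured numbers will vary by machine and model size.

    ```csharp
    // Sketch: time the first query on a fresh context (which pays the set
    // creation/initialization cost) against a second query on the same instance.
    var sw = new System.Diagnostics.Stopwatch();

    using (var db = new MyEntities())
    {
        sw.Start();
        db.MyObject.Take(10).ToList();   // first call: includes DbSet initialization
        sw.Stop();
        Console.WriteLine("First query:  {0} ms", sw.ElapsedMilliseconds);

        sw.Restart();
        db.MyObject.Take(10).ToList();   // second call: set already initialized
        sw.Stop();
        Console.WriteLine("Second query: {0} ms", sw.ElapsedMilliseconds);
    }
    ```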

    Is this a known issue?

    • Merged by Alexander Sun Tuesday, March 26, 2013 7:41 AM Duplicate
    Sunday, March 24, 2013 10:52 PM
  • On 3/24/2013 5:26 PM, festivis wrote:

    Please don't respond any further. You are wrong.

    I hope you do get help, because it seems that you need a lot of it.

    Monday, March 25, 2013 12:41 AM