Access Violation in .NET 4 Runtime in gc_heap::garbage_collect with no unmanaged modules

    Question

  • Hi all,

    We are experiencing an access violation in the .NET 4.0 runtime, in a piece of code that uses NHibernate heavily. None of our own code interops with native code directly, nor does the 3rd party code we are using (to the best of our knowledge). We cannot identify the memory pattern that appears to be corrupting the GC heap, and we don't know whether the problem is in the GC itself or in some other piece of code that is corrupting it.

    I've included below the stack trace of the offending thread, and the list of loaded modules.

    We need some help debugging this issue. What should we do next?

    Thanks,
    Adam Smith
    Director, Technology
    Hubbard One
    Thomson Reuters

     

    0:000> !EEStack
    ---------------------------------------------
    Thread   0
    Current frame: clr!WKS::gc_heap::find_first_object+0x64
    ChildEBP RetAddr  Caller, Callee
    0012bf30 791fa5b8 clr!WKS::gc_heap::mark_through_cards_for_segments+0x116, calling clr!WKS::gc_heap::find_card
    0012bf38 791fa618 clr!WKS::gc_heap::mark_through_cards_for_segments+0x563, calling clr!WKS::gc_heap::find_first_object
    0012bfc4 791faaa1 clr!WKS::gc_heap::relocate_phase+0x5b, calling clr!WKS::gc_heap::mark_through_cards_for_segments
    0012c000 791f7a52 clr!WKS::gc_heap::plan_phase+0x851, calling clr!WKS::gc_heap::relocate_phase
    0012c038 791a1be6 clr!HndScanHandlesForGC+0x125, calling clr!_EH_epilog3
    0012c03c 791a2424 clr!Ref_ScanDependentHandlesForClearing+0x61, calling clr!HndScanHandlesForGC
    0012c044 791a2291 clr!SyncBlockCache::GCWeakPtrScan+0x61
    0012c0dc 791f73d1 clr!WKS::gc_heap::gc1+0x140, calling clr!WKS::gc_heap::plan_phase
    0012c100 791f7972 clr!WKS::gc_heap::garbage_collect+0x3ae, calling clr!WKS::gc_heap::gc1
    0012c118 791f6edd clr!WKS::gc_heap::soh_try_fit+0x16b, calling clr!WKS::gc_heap::a_fit_segment_end_p
    0012c184 791f771f clr!WKS::GCHeap::GarbageCollectGeneration+0x17b, calling clr!WKS::gc_heap::garbage_collect
    0012c1a8 791f6b68 clr!WKS::gc_heap::try_allocate_more_space+0x23a, calling clr!WKS::gc_heap::allocate_small
    0012c1b0 791f878a clr!WKS::gc_heap::try_allocate_more_space+0x162, calling clr!WKS::GCHeap::GarbageCollectGeneration
    0012c1d4 791f6b82 clr!WKS::gc_heap::allocate_more_space+0x13, calling clr!WKS::gc_heap::try_allocate_more_space
    0012c1e8 791f6e65 clr!WKS::GCHeap::Alloc+0x3d, calling clr!WKS::gc_heap::allocate_more_space
    0012c208 7919c953 clr!Alloc+0x8d
    0012c224 7916089d clr!SlowAllocateString+0x42, calling clr!Alloc
    0012c258 7919c876 clr!HelperMethodFrame::LazyInit+0x17, calling  (JitHelp: CORINFO_HELP_GET_THREAD)
    0012c264 79160973 clr!FramedAllocateString+0xc9, calling clr!SlowAllocateString
    0012c2ac 791608f6 clr!FramedAllocateString+0x18, calling clr!LazyMachStateCaptureState
    0012c2e0 0703bec0 (MethodDesc 06ba8e80 +0xf0 NHibernate.Engine.Cascade.DeleteOrphans(System.String, NHibernate.Collection.IPersistentCollection)), calling clr!JIT_IsInstanceOfInterface
    0012c300 79b3781c (MethodDesc 798fbdcc +0x7c System.String.Concat(System.String, System.String)), calling 00972350
    0012c318 06bcadf0 (MethodDesc 06ba8e74 +0x2f0 NHibernate.Engine.Cascade.CascadeCollectionElements(System.Object, NHibernate.Type.CollectionType, NHibernate.Engine.CascadeStyle, NHibernate.Type.IType, System.Object, Boolean)), calling (MethodDesc 798fbdcc +0 System.String.Concat(System.String, System.String))
    0012c34c 06bcaab9 (MethodDesc 06ba8e5c +0x99 NHibernate.Engine.Cascade.CascadeCollection(System.Object, NHibernate.Engine.CascadeStyle, System.Object, NHibernate.Type.CollectionType)), calling (MethodDesc 06ba8e74 +0 NHibernate.Engine.Cascade.CascadeCollectionElements(System.Object, NHibernate.Type.CollectionType, NHibernate.Engine.CascadeStyle, NHibernate.Type.IType, System.Object, Boolean))
    0012c380 06bcaa07 (MethodDesc 06ba8e50 +0x77 NHibernate.Engine.Cascade.CascadeAssociation(System.Object, NHibernate.Type.IType, NHibernate.Engine.CascadeStyle, System.Object, Boolean)), calling (MethodDesc 06ba8e5c +0 NHibernate.Engine.Cascade.CascadeCollection(System.Object, NHibernate.Engine.CascadeStyle, System.Object, NHibernate.Type.CollectionType))
    0012c3a0 06bc7f7d (MethodDesc 06ba8e2c +0x4d NHibernate.Engine.Cascade.CascadeProperty(System.Object, NHibernate.Type.IType, NHibernate.Engine.CascadeStyle, System.Object, Boolean)), calling (MethodDesc 06ba8e50 +0 NHibernate.Engine.Cascade.CascadeAssociation(System.Object, NHibernate.Type.IType, NHibernate.Engine.CascadeStyle, System.Object, Boolean))
    0012c3c4 06bc7c99 (MethodDesc 06ba8e20 +0x159 NHibernate.Engine.Cascade.CascadeOn(NHibernate.Persister.Entity.IEntityPersister, System.Object, System.Object)), calling (MethodDesc 06ba8e2c +0 NHibernate.Engine.Cascade.CascadeProperty(System.Object, NHibernate.Type.IType, NHibernate.Engine.CascadeStyle, System.Object, Boolean))
    0012c40c 06bcb8a4 (MethodDesc 04b699a0 +0x64 NHibernate.Event.Default.AbstractFlushingEventListener.CascadeOnFlush(NHibernate.Event.IEventSource, NHibernate.Persister.Entity.IEntityPersister, System.Object, System.Object)), calling (MethodDesc 06ba8e20 +0 NHibernate.Engine.Cascade.CascadeOn(NHibernate.Persister.Entity.IEntityPersister, System.Object, System.Object))
    0012c440 06bcb793 (MethodDesc 04b69994 +0xf3 NHibernate.Event.Default.AbstractFlushingEventListener.PrepareEntityFlushes(NHibernate.Event.IEventSource)), calling (MethodDesc 04b699a0 +0 NHibernate.Event.Default.AbstractFlushingEventListener.CascadeOnFlush(NHibernate.Event.IEventSource, NHibernate.Persister.Entity.IEntityPersister, System.Object, System.Object))
    0012c47c 06bcb388 (MethodDesc 04b69968 +0x88 NHibernate.Event.Default.AbstractFlushingEventListener.FlushEverythingToExecutions(NHibernate.Event.FlushEvent)), calling (MethodDesc 04b69994 +0 NHibernate.Event.Default.AbstractFlushingEventListener.PrepareEntityFlushes(NHibernate.Event.IEventSource))
    0012c4b0 06bcb24d (MethodDesc 04b69b74 +0x2d NHibernate.Event.Default.DefaultFlushEventListener.OnFlush(NHibernate.Event.FlushEvent))
    0012c4c4 06bcb195 (MethodDesc 02af3330 +0xd5 NHibernate.Impl.SessionImpl.Flush()), calling 057f5e72

                Base TimeStamp                     Module
              400000 48ebcdd3 Oct 07 17:00:03 2008 X:\ContactNet\bin\NAnt.exe
            7c800000 49900d60 Feb 09 06:02:56 2009 C:\WINDOWS\system32\ntdll.dll
            79000000 4af3af84 Nov 06 00:09:24 2009 C:\WINDOWS\system32\mscoree.dll
            77e40000 49c51f0a Mar 21 13:08:26 2009 C:\WINDOWS\system32\KERNEL32.dll
            7d1e0000 4a61f120 Jul 18 11:58:24 2009 C:\WINDOWS\system32\ADVAPI32.dll
            77c50000 49f5889a Apr 27 06:27:38 2009 C:\WINDOWS\system32\RPCRT4.dll
            76f50000 4a3742b4 Jun 16 02:59:00 2009 C:\WINDOWS\system32\Secur32.dll
            603b0000 4ba1d8a9 Mar 18 03:39:21 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\mscoreei.dll
            77da0000 45d70ac0 Feb 17 09:01:36 2007 C:\WINDOWS\system32\SHLWAPI.dll
            77c00000 4900637a Oct 23 07:43:54 2008 C:\WINDOWS\system32\GDI32.dll
            77380000 45e7c676 Mar 02 01:38:46 2007 C:\WINDOWS\system32\USER32.dll
            77ba0000 45d70b06 Feb 17 09:02:46 2007 C:\WINDOWS\system32\msvcrt.dll
            76290000 45d70a5f Feb 17 08:59:59 2007 C:\WINDOWS\system32\IMM32.DLL
            79140000 4ba1d9ef Mar 18 03:44:47 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\clr.dll
            79060000 4ba1dbf2 Mar 18 03:53:22 2010 C:\WINDOWS\system32\MSVCR100_CLR0400.dll
            79880000 4ba1da6f Mar 18 03:46:55 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\mscorlib\246f1a5abb686b9dcdf22d3505b08cea\mscorlib.ni.dll
            77670000 45d70aa5 Feb 17 09:01:09 2007 C:\WINDOWS\system32\ole32.dll
            4b3c0000 45d70ab2 Feb 17 09:01:22 2007 C:\WINDOWS\system32\MSCTF.dll
            60340000 4ba1d8aa Mar 18 03:39:22 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\culture.dll
            60930000 4ba1d8ae Mar 18 03:39:26 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\nlssorting.dll
            79810000 4ba1da36 Mar 18 03:45:58 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\clrjit.dll
            68000000 45d69786 Feb 17 00:49:58 2007 C:\WINDOWS\system32\rsaenh.dll
            76b70000 45d70ab5 Feb 17 09:01:25 2007 C:\WINDOWS\system32\PSAPI.DLL
             32e0000 45d69418 Feb 17 00:35:20 2007 C:\WINDOWS\system32\xpsp2res.dll
            77b90000 45d70ac8 Feb 17 09:01:44 2007 C:\WINDOWS\system32\VERSION.dll
            7a820000 4ba1dff4 Mar 18 04:10:28 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System\964da027ebca3b263a05cadb8eaa20a3\System.ni.dll
            60c90000 4ba1e04b Mar 18 04:11:55 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Configuration\ac18c2dcd06bd2a0589bac94ccae5716\System.Configuration.ni.dll
            69720000 4ba1dfec Mar 18 04:10:20 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Xml\e997d0200c25f7db6bd32313d50b729d\System.Xml.ni.dll
            6f350000 4a4eaae7 Jul 03 21:05:43 2009 C:\WINDOWS\system32\urlmon.dll
            77d00000 4760e409 Dec 13 02:49:29 2007 C:\WINDOWS\system32\OLEAUT32.dll
            40a90000 4a4eaae8 Jul 03 21:05:44 2009 C:\WINDOWS\system32\iertutil.dll
            77420000 45d70a05 Feb 17 08:58:29 2007 C:\WINDOWS\WinSxS\x86_Microsoft.Windows.Common-Controls_6595b64144ccf1df_6.0.3790.3959_x-ww_D8713E55\comctl32.dll
            7c8d0000 485819a8 Jun 17 16:08:08 2008 C:\WINDOWS\system32\SHELL32.dll
            44f20000 4ba1e740 Mar 18 04:41:36 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\Microsoft.Build.Fra#\11ef4be6ee227fce3725d6df534297a4\Microsoft.Build.Framework.ni.dll
            60e50000 4ba1dfee Mar 18 04:10:22 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Core\713647b987b140a17e3c4ffe4c721f85\System.Core.ni.dll
            67160000 4ba1e0c5 Mar 18 04:13:57 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Web\a70842538614699d690561ef5f43598b\System.Web.ni.dll
            5e0d0000 4ba2183b Mar 18 08:10:35 2010 C:\WINDOWS\Microsoft.NET\Framework\v4.0.30319\diasymreader.dll
            60650000 4ba1db8f Mar 18 03:51:43 2010 C:\WINDOWS\Microsoft.Net\assembly\GAC_32\ISymWrapper\v4.0_4.0.0.0__b03f5f7f11d50a3a\ISymWrapper.dll
            61750000 4ba1e064 Mar 18 04:12:20 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Data\92cccedc7cda413ff6fc6492cb256b58\System.Data.ni.dll
             4e00000 4ba1e064 Mar 18 04:12:20 2010 C:\WINDOWS\Microsoft.Net\assembly\GAC_32\System.Data\v4.0_4.0.0.0__b77a5c561934e089\System.Data.dll
            71c00000 45d70ae9 Feb 17 09:02:17 2007 C:\WINDOWS\system32\WS2_32.dll
            71bf0000 45d70aea Feb 17 09:02:18 2007 C:\WINDOWS\system32\WS2HELP.dll
            761b0000 45d70a80 Feb 17 09:00:32 2007 C:\WINDOWS\system32\CRYPT32.dll
            76190000 45d70aac Feb 17 09:01:16 2007 C:\WINDOWS\system32\MSASN1.dll
            766d0000 45d70abc Feb 17 09:01:32 2007 C:\WINDOWS\system32\shfolder.dll
            666d0000 4ba1e09d Mar 18 04:13:17 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Transactions\dd9dbf82e44454689976a49a9e4ddb6d\System.Transactions.ni.dll
            66680000 4ba1e09d Mar 18 04:13:17 2010 C:\WINDOWS\Microsoft.Net\assembly\GAC_32\System.Transactions\v4.0_4.0.0.0__b77a5c561934e089\System.Transactions.dll
            65d70000 4ba1df90 Mar 18 04:08:48 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.EnterpriseSe#\8b6e9d6171aad3561263ce2cd05c57df\System.EnterpriseServices.ni.dll
            10020000 4ba1db80 Mar 18 03:51:28 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.EnterpriseSe#\8b6e9d6171aad3561263ce2cd05c57df\System.EnterpriseServices.Wrapper.dll
            10000000 4ba1db80 Mar 18 03:51:28 2010 C:\WINDOWS\Microsoft.Net\assembly\GAC_32\System.EnterpriseServices\v4.0_4.0.0.0__b03f5f7f11d50a3a\System.EnterpriseServices.Wrapper.dll
            71f60000 3e8024bc Mar 25 05:43:24 2003 C:\WINDOWS\system32\security.dll
            76750000 4a3742b4 Jun 16 02:59:00 2009 C:\WINDOWS\system32\schannel.dll
            76920000 45d70ac8 Feb 17 09:01:44 2007 C:\WINDOWS\system32\USERENV.dll
            71c40000 48f7bdc3 Oct 16 18:18:43 2008 C:\WINDOWS\system32\NETAPI32.dll
            48060000 434f63fa Oct 14 03:53:30 2005 C:\Program Files\Microsoft SQL Server\90\Shared\instapi.dll
            78130000 4889d619 Jul 25 09:33:13 2008 C:\WINDOWS\WinSxS\x86_Microsoft.VC80.CRT_1fc8b3b9a1e18e3b_8.0.50727.3053_x-ww_B80FA8CA\MSVCR80.dll
            68100000 45d6978b Feb 17 00:50:03 2007 C:\WINDOWS\system32\dssenh.dll
            66230000 4ba1dfda Mar 18 04:10:02 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Numerics\b07f0d26a34ad53fc369248f289d1126\System.Numerics.ni.dll
            66510000 4ba1df78 Mar 18 04:08:24 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Security\09a97525ae5583cc2685e2c39a3078bd\System.Security.ni.dll
            7b1d0000 4ba1e086 Mar 18 04:12:54 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Drawing\dd57bc19f5807c6dbe8f88d4a23277f6\System.Drawing.ni.dll
            68df0000 4ba1e1a5 Mar 18 04:17:41 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Web.Services\149f2dcb9c9706e592d1980a945850c2\System.Web.Services.ni.dll
            51860000 4ba1f498 Mar 18 05:38:32 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.ServiceModel\250b525aa8c17327216e102569c0d766\System.ServiceModel.ni.dll
            66610000 4ba1e143 Mar 18 04:16:03 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.ServiceProce#\6e7f1bdc845816dfc797f8002b76b5e8\System.ServiceProcess.ni.dll
            52d30000 4ba1f585 Mar 18 05:42:29 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.ServiceModel#\51c60db370e050d9cdcac17060aaac53\System.ServiceModel.Web.ni.dll
            51130000 4ba1f437 Mar 18 05:36:55 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Runtime.Seri#\e9f8a45b1063d6c6a62718c88a5623d1\System.Runtime.Serialization.ni.dll
            63d00000 4ba1e082 Mar 18 04:12:50 2010 C:\WINDOWS\assembly\NativeImages_v4.0.30319_32\System.Data.OracleC#\db33744fb49e77c7233adb50f07fe62a\System.Data.OracleClient.ni.dll
            63c80000 4ba1e082 Mar 18 04:12:50 2010 C:\WINDOWS\Microsoft.Net\assembly\GAC_32\System.Data.OracleClient\v4.0_4.0.0.0__b77a5c561934e089\System.Data.OracleClient.dll
    Wednesday, September 22, 2010 1:53 PM

All replies

  • Hi:

    This isn't really an issue Microsoft can assist with on the public forums. You can create a support incident on this and Microsoft Developer Support can assist using tools and techniques to further diagnose the memory corruption.

     

    The support that can be provided through the forums is currently limited.  Your question falls into the paid support category which requires a more in-depth level of support.  Please visit the below link to see the various paid support options that are available to better meet your needs. http://support.microsoft.com/default.aspx?id=fh;en-us;offerprophone

    Wendell

    Thursday, September 30, 2010 7:59 PM
  • Can you repro it? How long does it take to repro it?

    You could take some memory dumps when this happens to find out if other stacks are at awkward places. You can do !VerifyHeap -v to find out which objects are corrupted.

    Then you can use !ListNearObj to find out which objects come before and after, to perhaps identify the offending memory overwriter.
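
    For example, a typical session might look roughly like this (the address is just a placeholder for whatever !VerifyHeap reports; !DumpLog only applies once the stress log described below is enabled):

    0:000> .loadby sos clr
    0:000> !VerifyHeap -v
    0:000> !ListNearObj <address of a corrupted object reported by !VerifyHeap>
    0:000> !DumpLog c:\temp\StressLog.txt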

    If no pattern shines through, you can enable the GC stress log, which can be examined with various CLR debugging commands.

    e.g.

    DumpLog [-addr <addressOfStressLog>] [<Filename>]

    Writes the contents of an in-memory stress log to the specified file. If you do not specify a name, this command creates a file called StressLog.txt in the current directory.

    The in-memory stress log helps you diagnose stress failures without using locks or I/O. To enable the stress log, set the following registry keys under HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework:

    (DWORD) StressLog = 1

    (DWORD) LogFacility = 0xffffffff

    (DWORD) StressLogSize = 65536

    The optional -addr option lets you specify a stress log other than the default log.
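
    For example, a minimal C# sketch that sets the values listed above (purely illustrative; it needs administrative rights, and you can of course set the values with regedit instead):

    using Microsoft.Win32;

    class EnableStressLog
    {
        static void Main()
        {
            // Key path as listed above; writing to HKLM requires administrative rights.
            const string key = @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\.NETFramework";

            Registry.SetValue(key, "StressLog", 1, RegistryValueKind.DWord);
            Registry.SetValue(key, "LogFacility", unchecked((int)0xffffffff), RegistryValueKind.DWord);
            Registry.SetValue(key, "StressLogSize", 65536, RegistryValueKind.DWord);
        }
    }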

     

    You can use 1, 2 or 3 as StressLog values to trigger GCs after every object allocation, with full call stack recording, which can help if you can repro the problem. Otherwise you need to become more creative ;-).

    Yours,

      Alois Kraus

     

    Friday, October 01, 2010 10:09 AM
  • Hi,

     

    We are seeing the same problem, with much the same environment:

     

    Windows Server 2008 R2 Standard.  Version reported as: Microsoft Windows [Version 6.1.7600]

    .NET Oracle drivers (client):  11.2.0.1.0-64bit.  Oracle.DataAccess.DLL file version:  4.112.2.0

    The crash occurs at the same line in managed code each time:

    _appSession.NHibernateSession.Flush();

    The native code stack trace is:

    RetAddr           : Args to Child                                                           : Call Site

    000007fe`f9584084 : 00000000`005ccc20 00000000`28d8c390 00000000`30404680 00000000`00000034 : clr!WKS::gc_heap::find_first_object+0xea

    000007fe`f9585a17 : 000007fe`f9585a70 00000000`00000001 00000000`00000001 000007fe`00000000 : clr!WKS::gc_heap::mark_through_cards_for_segments+0x1fd

    000007fe`f95832ae : 00000000`304046d0 00000000`28d8c560 00000000`30804318 00000000`00000002 : clr!WKS::gc_heap::relocate_phase+0x63

    000007fe`f95809da : 00000000`00000000 00000000`00000001 000007fe`00000001 000007fe`00000000 : clr!WKS::gc_heap::plan_phase+0x819

    000007fe`f95812d5 : 000013ea`d4f94b75 00000000`28d8c6e9 00000000`00000000 00000000`00000001 : clr!WKS::gc_heap::gc1+0xbb

    000007fe`f9580f8e : 00000000`005d3510 000007fe`00000000 00000000`00000000 00000000`00000000 : clr!WKS::gc_heap::garbage_collect+0x276

    000007fe`f957f77e : 00000000`00000028 00000000`00000000 000007fe`f9400000 000007fe`eeea866d : clr!WKS::GCHeap::GarbageCollectGeneration+0x14e

    000007fe`f957f4ae : 00000000`28d8c940 00000000`00000000 00000000`1bb12c80 00000000`00000028 : clr!WKS::gc_heap::try_allocate_more_space+0x25f

     


     

    Monday, January 31, 2011 12:21 PM
  • I've got a similar issue. My call stack looks like this. For me, it's a Windows Service, and I get this stack from a memory dump file (mdmp). 

    	clr.dll!WKS::gc_heap::mark_through_cards_for_segments() + 0x237 bytes	
     	[Managed to Native Transition]	
     	NHibernate.dll!NHibernate.Util.IdentityMap.EntryList.get() + 0xf2 bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractFlushingEventListener.FlushCollections(NHibernate.Event.IEventSource session) + 0x24b bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractFlushingEventListener.FlushEverythingToExecutions(NHibernate.Event.FlushEvent event) + 0x11e bytes	
     	NHibernate.dll!NHibernate.Event.Default.DefaultFlushEventListener.OnFlush(NHibernate.Event.FlushEvent event) + 0x53 bytes	
     	NHibernate.dll!NHibernate.Impl.SessionImpl.Flush() + 0x1fd bytes	
    	...
    	[ my code ending up calling session.Flush();]

     

    Another time, I got a slightly different call stack:

     	clr.dll!WKS::gc_heap::find_first_object() + 0x87 bytes	
     	[Managed to Native Transition]	
     	mscorlib.dll!System.Text.StringBuilder.ToString() + 0x1e bytes	
     	NHibernate.dll!NHibernate.Impl.MessageHelper.InfoString(NHibernate.Persister.Collection.ICollectionPersister persister, object id, NHibernate.Engine.ISessionFactoryImplementor factory) + 0xbd bytes	
     	NHibernate.dll!NHibernate.Engine.Collections.ProcessReachableCollection(NHibernate.Collection.IPersistentCollection collection = {NHibernate.Collection.Generic.PersistentGenericBag<ClearLife.ClariNet.Entities.AMBest.Rating>}, NHibernate.Type.CollectionType type, object entity, NHibernate.Engine.ISessionImplementor session = {NHibernate.Impl.SessionImpl}) + 0x108 bytes	
     	NHibernate.dll!NHibernate.Event.Default.FlushVisitor.ProcessCollection(object collection, NHibernate.Type.CollectionType type) + 0x61 bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractVisitor.ProcessValue(object value, NHibernate.Type.IType type) + 0x3e bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractVisitor.ProcessValue(int i, object[] values, NHibernate.Type.IType[] types) + 0x22 bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractVisitor.ProcessEntityPropertyValues(object[] values, NHibernate.Type.IType[] types = {NHibernate.Type.IType[7]}) + 0x36 bytes	
     	NHibernate.dll!NHibernate.Event.Default.DefaultFlushEntityEventListener.OnFlushEntity(NHibernate.Event.FlushEntityEvent event) + 0x107 bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractFlushingEventListener.FlushEntities(NHibernate.Event.FlushEvent event) + 0x127 bytes	
     	NHibernate.dll!NHibernate.Event.Default.AbstractFlushingEventListener.FlushEverythingToExecutions(NHibernate.Event.FlushEvent event) + 0xab bytes	
     	NHibernate.dll!NHibernate.Event.Default.DefaultFlushEventListener.OnFlush(NHibernate.Event.FlushEvent event) + 0x2d bytes	
     	NHibernate.dll!NHibernate.Impl.SessionImpl.Flush() + 0x125 bytes	
    


    My Event Log has Errors, one from .Net Runtime:

     

     

    Application: XXX.Service.exe
    
    Framework Version: v4.0.30319
    
    Description: The process was terminated due to an internal error in the .NET Runtime at IP 000007FEDEB4BD07 (000007FEDE9D0000) with exit code 80131506.

    And another from Application Error, and a System one "The XXX service terminated unexpectedly.  It has done this 1 time(s).".

     

    I followed instructions on debugging the managed heap, but got lost in there, so stopped.

     

    Wednesday, June 29, 2011 3:05 PM
  • I got the same issue:

    CLR!WKS::GC_HEAP::FIND_FIRST_OBJECT+BB

    In MINIDUMP_FirstChance_av_AccessViolation_Service.exe__090c_2011-08-10_06-13-56-747_1450.dmp the assembly instruction at clr!WKS::gc_heap::find_first_object+bb in C:\Windows\Microsoft.NET\Framework\v4.0.30319\clr.dll from Microsoft Corporation has caused an access violation exception (0xC0000005) when trying to read from memory location 0x00000030 on thread 11

    couldn't find any clue so far....

    Wednesday, August 10, 2011 9:41 AM
  • Hello,

    I have the same problem, same stack at the top re. GC. Did you find a way to find the cause?

    Monday, September 26, 2011 11:13 AM
  • Hi folks,

    This is a known issue which will be fixed in the next version of the .NET Framework. As a workaround you can disable background GC.

    Regards,
    Lee


    Monday, September 26, 2011 9:05 PM
  • Hi folks,

    This is a known issue which will be fixed in the next version of the .NET Framework. As a workaround you can disable background GC.

    Regards,
    Lee


    Do you have any details on where it's stated that this is a known issue? I have not been able to find any details on such an issue, and we are having the same problems.
    Wednesday, October 05, 2011 8:33 PM
  • Do you mean this is fixed in 4.5 or 5?
    Thursday, December 22, 2011 6:30 PM
    Lee Coward - can you please confirm which release this has been / will be fixed under? Is there a KB number for the issue?

    Are there any side effects of disabling the background (I think this is called concurrent) GC? Would it be better to just use Server GC?

    There are a few related issues to this - it would be good to see the KB / fix reference.

    http://stackoverflow.com/questions/8668231/windbg-hunting-exceptions-that-have-caused-a-net-service-to-crash

    Tuesday, February 28, 2012 3:38 AM
  • My company raised a paid support issue with Microsoft and they are (apparently) preparing a KB article about this. It should be fixed in the next version of the .NET runtime. We encountered this issue around Christmas 2010.

    The workaround we were advised to use was to change the GC flags (sorry, don't have the code as I'm writing this at home).

    To be honest, we would have liked to have seen more from Microsoft on this but the workaround did work.


    Monday, March 05, 2012 8:37 PM
  • The knowledge base article was released today, with details of how to deploy the workaround. The link is:

    http://support.microsoft.com/kb/2679415

    Tuesday, March 06, 2012 11:45 AM
  • Hi Adam,

    As suggested by Anonymous93kd, change the GC flag in your config file as:

    <configuration>
       <runtime>
           <gcConcurrent enabled="false"/>
       </runtime>
    </configuration>
    Hopefully it should work.

    Regards, http://shwetamannjain.blogspot.com

    Thursday, March 22, 2012 10:25 AM
  • I found that .NET 4.5 still has this issue; my clr.dll version is 4.0.30319.17929. Has anyone got a hotfix / patch yet?

    Updated:

    The workaround provided by Shweta Jain is not working for me at all. After applying the workaround, the error still occurs after a few tries.


    Anson


    • Edited by Anson Woo Tuesday, January 22, 2013 8:16 AM The workaround not working at all
    Monday, January 21, 2013 8:57 AM
  • We have so far been unsuccessful in tracking down the cause of the managed heap corruption that we have been chasing for the last 9 months.  However, I have noticed on the extremely rare occasion that, while the debugger is attached, it will break on an access violation in IOCompletionCallback.

    Considering that I recently tracked down an issue with SplashScreen passing a delegate to a native window proc that had the potential to be GC'd while the process is shutting down (and subsequently causing an access violation if the window proc is invoked within that narrow window), I am beginning to believe that an IO completion is being registered whose delegate is not properly rooted, leading to a potential callback into a GC'd delegate, which could manifest as an access violation or, worse, memory corruption if it actually executes code.
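
    If it helps anyone else, the failure mode I'm describing has roughly this shape (the native entry point here is hypothetical; it stands for anything that stores a managed callback and invokes it later, such as a window proc or an IO completion registration):

    using System;
    using System.Runtime.InteropServices;

    class CallbackRootingSketch
    {
        private delegate void CompletionCallback(IntPtr context);

        // Hypothetical native API that stores the callback and calls it back later.
        [DllImport("somenative.dll")]
        private static extern void RegisterCompletionCallback(CompletionCallback callback);

        // BUGGY: nothing on the managed side roots the delegate, so after this method
        // returns the GC may collect it; a later native callback then calls into a
        // collected delegate stub, which can show up as an access violation or corruption.
        static void Buggy()
        {
            RegisterCompletionCallback(ctx => Console.WriteLine("completed"));
        }

        // SAFER: keep the delegate reachable for as long as native code may call it.
        private static readonly CompletionCallback s_rooted = ctx => Console.WriteLine("completed");

        static void Safer()
        {
            RegisterCompletionCallback(s_rooted);
        }
    }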

    Thursday, April 18, 2013 5:18 PM
  • So far the issue does not seem to be fixed by a runtime update. We updated the .NET runtime to the newest 4.5.1 version, but the .NET application still crashes with an Access Violation exception. We tried to reproduce on different environments/servers to verify whether some environmental settings are causing this. The application only crashes if:

    - GC server mode is enabled (GCConcurrent is disabled)

    - running on a VM with static memory assignment (it never crashes on "real" servers)

    - the application consumes more than 16 GB of memory (our service loads a large amount of data, which causes memory consumption of up to 40 GB) - it never crashes if the service takes less than 10 GB

    - the crash occurs under very high load - meaning the GC needs to run to clean up

    If we switch to workstation GC the crash doesn't happen, but this isn't a solution because of the very poor performance of workstation GC for our workload.

    We enabled heap verification and ran with the debugger attached, but the application still crashes with an AccessViolation at the following stack:

    RetAddr           : Args to Child                                                           : Call Site
    00000000`775d41bb : 00000000`00000000 00000000`00000000 00000000`80131506 00000000`80131506 : ntdll!ZwTerminateProcess+0xa
    000007fe`f9e7ccee : 00000000`00000000 00000000`00000000 00000000`027fefd0 00000000`00000001 : ntdll!RtlExitUserProcess+0x9b
    000007fe`f9e7ce64 : 00000000`00000000 00000000`00000000 ffffffff`00000000 00000000`00000000 : mscoreei!RuntimeDesc::ShutdownAllActiveRuntimes+0x294
    000007fe`f96bb379 : 00000000`0035db28 000007fe`f9b6e760 00000000`80131506 00000000`00000000 : mscoreei!CLRRuntimeHostInternalImpl::ShutdownAllRuntimesThenExit+0x14
    000007fe`f960df9d : 00000000`80131506 00000000`0035daf8 00000000`00000040 00000000`00000001 : clr!EEPolicy::ExitProcessViaShim+0x69
    000007fe`f983e7e1 : 00000000`80131506 000007fe`f94add56 00000000`80131506 00000000`00000009 : clr!SafeExitProcess+0x9d
    000007fe`f9ab825e : 00000000`00000000 00000000`00000000 00000000`0035f0b0 00000000`776e7358 : clr!EEPolicy::HandleFatalError+0x12a
    000007fe`f9663c40 : 00000000`00000000 00000000`776e7358 00000000`0035f0b0 00000000`00000000 : clr! ?? ::FNODOBFM::`string'+0x5060c
    000007fe`f9663c07 : 00000000`00000000 00000000`00000000 00000000`00000000 ffffffff`fffffffe : clr!CLRVectoredExceptionHandlerPhase2+0x2d
    000007fe`f9663b45 : 00000000`00000000 00000000`00000000 00000000`776e7358 00000000`00000000 : clr!CLRVectoredExceptionHandler+0x94
    00000000`775ca5db : 00000000`02cc6ff0 00000000`776e7350 00000000`776e7350 00000000`00000000 : clr!CLRVectoredExceptionHandlerShim+0x85
    00000000`775c8e62 : 00000000`0322a1a0 00000000`00000000 00000000`00000000 00000000`02cc6fe0 : ntdll!vsprintf_s+0x137
    00000000`77601248 : 00000000`00000000 00000000`00000000 00000000`00000000 00000000`00000000 : ntdll!RtlUnwindEx+0x852
    000007fe`f96c9f73 : 00000000`0035f480 00000000`00000002 00000000`00000001 00000000`00000002 : ntdll!KiUserExceptionDispatcher+0x2e
    000007fe`f969a517 : 00000002`7fff0000 00000000`00000000 00000003`2e73be48 00000002`7fff1000 : clr!SVR::gc_heap::compact_phase+0x1a7
    000007fe`f9699e70 : 00000000`00000001 00000000`00000002 00000000`00000002 00000000`00000002 : clr!SVR::gc_heap::plan_phase+0x9a4
    000007fe`f969a14d : 00000000`0322cb38 00000000`0035f6c0 00000000`0322cb38 00000000`00000000 : clr!SVR::gc_heap::gc1+0xa6
    000007fe`f969982c : 00000000`0322a1a0 00000000`0322a1a0 00000000`00000000 00000000`00000060 : clr!SVR::gc_heap::garbage_collect+0x357
    000007fe`f9605987 : 00000000`0322a1a0 00000000`00001e00 00000000`00000000 000007fe`fd45acc2 : clr!SVR::gc_heap::gc_thread_function+0x6c
    00000000`774a59ed : 00000000`00000000 00000000`00000000 00000000`00000000 00000000`00000000 : clr!SVR::gc_heap::gc_thread_stub+0x7a
    00000000`775dc541 : 00000000`00000000 00000000`00000000 00000000`00000000 00000000`00000000 : kernel32!BaseThreadInitThunk+0xd
    00000000`00000000 : 00000000`00000000 00000000`00000000 00000000`00000000 00000000`00000000 : ntdll!RtlUserThreadStart+0x21

    Wednesday, May 07, 2014 8:10 AM
  • Hi,

    From the callstack and symptoms you describe, this is most likely managed heap corruption. It is almost certainly NOT the same issue for which this thread started.

    Managed heap corruptions can be caused by a variety of things - native code in your app/library corrupting memory, invalid PInvoke in your app/library corrupting memory, a HW problem corrupting memory, or a bug in the .NET Framework or OS. The list is in order of likelihood based on our experience (the chance of a bug in .NET/OS is the lowest).

    Tracking down issues like this may be very difficult. If you have a reliable repro and can collect heap dumps, you have a slightly better chance. First rule out HW issues and make sure it is a problem on more than 1 machine, ideally with a different setup (to avoid driver bugs like here - see S. McGuire's reply on November 10, 2008).

    Second, I would suggest verifying that your problem is really managed heap corruption (open the crash dump in windbg and run !VerifyHeap). If you see failures, you can look at nearby objects with "!lno <address>". Look for patterns across dumps. (Rather tedious exercise.)

    If that doesn't help, the investigation becomes even more difficult, time consuming and with low chance of success.

    There is also some good news: just yesterday we publicly released .NET 4.5.2. We have added optional instrumentation and logging to it to simplify tracking down the most common subclass of managed heap corruption issues (the case when native code holds on to a managed object passed via PInvoke that is not pinned). We are now finishing final tuning of a tool that processes the logs and provides heuristics about the most likely cause of the corruption.
    WARNING: The logging unfortunately still has some limitations and does not catch all classes of managed heap corruption (like unsafe C# code corrupting the stack or heap, or manual marshalling of objects via IntPtr, etc.).
    If you can confirm that you have managed heap corruption (!VerifyHeap in windbg), you can upgrade to 4.5.2, and if you are willing to use a not-yet-fully-finished tool, feel free to send me email at karel.zikmund@you-know-where.com and we can see whether the new logging may help you track down your problem (be warned: no guarantees it will help for sure).

    -Karel Zikmund
    Developer on CLR team

    Wednesday, May 07, 2014 5:40 PM
  • Hi,

    Thanks for your reply. Sorry for not quite matching this thread's issue; if needed we can move this reply to its own thread.

    We have already tried to reproduce on several servers (VMs and real servers) and with different setups:

    - Server 2008 or Server 2012

    - 16 GB and 64 GB

    - static or dynamic memory on the VM

    - with or without a pagefile

    It can be reproduced on all VMs with a memory consumption greater than 16 GB, but never on real servers. Maybe it is related to the combination of VM and GC.

    We also enabled heap verification (gflags full) and created crash dumps. !VerifyHeap reports a set of invalid entries/failures, but on quite different object types (meaning different code stacks), so we cannot see a direct cause in the code. !ClrStack returns that it cannot walk the unmanaged thread.

    Currently we are trying to reduce the service code step by step to check whether we can find a code issue. But we don't use unmanaged code or API calls, except perhaps in the 3rd party libraries; apart from the framework itself there are only protobuf and the mongo .NET driver. We will try upgrading to 4.5.2; maybe the new features will help us.
    Wednesday, May 07, 2014 6:23 PM
  • Hi,

    we did some additional tests and may have found the issue, but we cannot explain what the exact reason is. With heap verification enabled we noticed that the heap becomes corrupt during startup of our service (which loads a very large amount of data in different formats, e.g. text, protobuf, json etc.). Every time, the AccessViolation was caused by a GC thread itself during memcopy or the compact phase.

    So we did some deeper investigation and found that the Parallel.For loop we use in our startup routine may cause the issue. It starts a very high number of threads, which consume a large amount of memory; it seems the server GC cannot handle this. If we use Task and TaskFactory instead and limit the number of tasks to the number of available cores, the heap corruption doesn't happen (we retried several times). With a very high number of tasks we still get heap verification failures, and the same happens if we use Parallel.For with ParallelOptions limited to 4 threads. Only with 4 tasks (the number of available cores) does it not happen.
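
    Roughly the shape of the change we made (the helper names here are illustrative, not our actual code):

    using System;
    using System.Linq;
    using System.Threading.Tasks;

    class StartupLoadSketch
    {
        // Original shape: Parallel.For over all items, degree of parallelism left to the scheduler.
        static void LoadWithParallelFor(string[] files)
        {
            Parallel.For(0, files.Length, i => LoadFile(files[i]));
        }

        // What worked for us: one task per core, each task walking its own slice of the input.
        static void LoadWithCoreBoundTasks(string[] files)
        {
            int workers = Environment.ProcessorCount;
            Task[] tasks = Enumerable.Range(0, workers)
                .Select(w => Task.Factory.StartNew(() =>
                {
                    for (int i = w; i < files.Length; i += workers)
                        LoadFile(files[i]);
                }, TaskCreationOptions.LongRunning))
                .ToArray();
            Task.WaitAll(tasks);
        }

        static void LoadFile(string path) { /* deserialize text / protobuf / json ... */ }
    }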

    It seems that the server GC has an issue with multiple threads using a very large amount of heap (> 4 GB).

    Friday, May 09, 2014 2:14 PM
  • Hi,

    Your replies don't change my original thoughts and prioritization of issue likelihood. Here's why: Managed heap corruptions can happen when an object is not pinned, yet native code holds on to it and a GC happens in the meantime. The window during which native code holds onto the managed object reference can be very small, and that's why the corruption can be very rare to reproduce (it depends on whether a GC happens in that time or not).
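
    To illustrate the shape of that failure mode with a minimal sketch (the native function here is hypothetical; it stands for any native code that keeps a pointer into the managed heap and touches the memory later):

    using System;
    using System.Runtime.InteropServices;

    class PinningSketch
    {
        // Hypothetical native API that stores the buffer pointer and writes to it later.
        [DllImport("somenative.dll")]
        private static extern void BeginNativeWrite(IntPtr buffer, int length);

        // Pin the array so the GC cannot move it while native code holds the pointer.
        // Without the pin, a compacting GC in that window leaves native code writing
        // over whatever object now lives at the old address - classic heap corruption.
        static GCHandle StartWrite(byte[] buffer)
        {
            GCHandle handle = GCHandle.Alloc(buffer, GCHandleType.Pinned);
            BeginNativeWrite(handle.AddrOfPinnedObject(), buffer.Length);
            return handle; // the caller must call handle.Free() once native code is done with it
        }
    }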

    Application startup tends to be more stable and reproducible. Given that your Parallel.For allocates a lot, it creates a certain allocation pattern which affects when GC happens (once you hit some allocation threshold). If you replace it with a different implementation (like Task) which has a different allocation pattern, the timing of GC will change and it may skip the short window when the buggy code holds on to the managed object - so the corruption is 'avoided'.

    Server GC is used on many, many, many servers under variety of loads and variety of GC heap sizes (ranging to 10s of GBs). Those servers are behind many Microsoft services (Halo servers, SharePoint online, Exchange, Bing to name just a few from top of my mind), many Enterprise servers (banks, Fortune500 companies) and many other servers. All those customers tend to look into every crash in their servers and if they are caused by CLR bugs (or there is reasonable suspicion), they get routed to my team (through Customer support channel). The frequency of such CLR bugs is very small. The frequency of managed heap corruptions that are caused by user code is however much higher. That said, there is a non-zero chance that you are the unlucky one to hit a unique CLR bug that no one else hit so far ... but without deeper investigation it's impossible to say.

    Thanks,
    -Karel

    Friday, May 09, 2014 7:51 PM
  • Hello,

    we are still investigating the AccessViolation which causes our C# service to crash on virtual machines with a high amount of memory. Using some tools we identified some unsafe code blocks in 3rd party libraries.

    Using peverify:

    - for protobuf-net

    peverify E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll /verbose
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll : ProtoBuf.ProtoReader::ReadDouble][mdToken=0x6000297][offset 0x00000
    020][found address of Long] Expected numeric type on the stack.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll : ProtoBuf.ProtoReader::ReadSingle][mdToken=0x60002a3][offset 0x00000
    018][found address of Int32] Expected numeric type on the stack.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll : ProtoBuf.ProtoWriter::WriteDouble][mdToken=0x60002eb][offset 0x0000
    0032][found address of Double] Expected numeric type on the stack.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll : ProtoBuf.ProtoWriter::WriteSingle][mdToken=0x60002ec][offset 0x0000
    0011][found address of Single] Expected numeric type on the stack.
    4 Error(s) Verifying E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\protobuf-net.dll


    - for Common.Logging.Log4net

    peverify E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll  /verbose
    Microsoft (R) .NET Framework PE Verifier.  Version  4.0.30319.1
    Copyright (c) Microsoft Corporation.  All rights reserved.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::set_Layout]
    [mdToken=0x6000004][offset 0x00000006] Unable to resolve token.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::Append][mdT
    oken=0x6000005] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x00000022] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x0000004E] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x0000007A] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x000000A6] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x000000D2] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x000000FE] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::.cctor][mdT
    oken=0x6000001][offset 0x00000148] Unrecognized arguments for delegate .ctor.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    0][mdToken=0x6000007] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    2][mdToken=0x6000008] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    4][mdToken=0x6000009] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    6][mdToken=0x600000a] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    8][mdToken=0x600000b] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    a][mdToken=0x600000c] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender::<.cctor>b__
    c][mdToken=0x600000d] The located assembly's manifest definition does not match the assembly reference.
    The located assembly's manifest definition does not match the assembly reference.
    The located assembly's manifest definition does not match the assembly reference.
    The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class14::<.cctor>b__1][mdToken=0x6000036] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class16::<.cctor>b__3][mdToken=0x6000038] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class18::<.cctor>b__5][mdToken=0x600003a] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class1a::<.cctor>b__7][mdToken=0x600003c] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class1c::<.cctor>b__9][mdToken=0x600003e] The located assembly's manifest definition does not match the assembly reference.
    [IL]: Error: [E:\projects\CLPS\CLPS.git\builds\CID.CLPS.WCFHost\Common.Logging.Log4Net.dll : Common.Logging.Log4Net.CommonLoggingAppender+<>c__Display
    Class1e::<.cctor>b__b][mdToken=0x6000040] The located assembly's manifest definition does not match the assembly reference.

    25 Error(s) Verifying Common.Logging.Log4Net.dll

    In addition we used .NET Reflector to search for P/Invoke calls, and found them in:

    - Mongo.Driver

    - log4net

    For better investigation we also use page heap verification, to force the CLR to stop the process at the moment the heap gets corrupted. It seems that the GC produces too many false positives during the compact/collect phase.  So what can we do for further investigation?

    Our application is built against .NET 4, uses server GC (non-concurrent), and runs in a .NET 4.5 environment.

    Thanks for any ideas and help.

    Sven



    Friday, August 22, 2014 6:48 AM