Can I use MSVCRT100.dll with Visual C++ 2013?

  • Question

  • Hi - apologies if this is the wrong place to ask this. I would like to upgrade to Visual C++ 2013, but my main product application uses many components provided by numerous third parties. It also uses MSVCRT100.dll, dynamically linked (i.e. I currently use Visual C++ 2010). I have found from painful experience that *all* components I use must be linked to the same version of the MSVC runtime DLL or I will get bugs in my app (usually very hard-to-track bugs too). One component (itself a DLL) used static linking to the runtime library because the publishers believed that doing so would get around the problem of runtime library incompatibility. It didn't. It was only when I was able to persuade the publishers to let me have the source code, so that I could do my own build of their DLL dynamically linked to MSVCRT100.dll, that the problems finally went away.

    Some of the components I use are no longer being developed. Even the ones that are still being developed do not necessarily have builds using anything later than MSVCRT100.dll. It was very hard going to get builds of all components using that. And even if a more recent version of each component were available, I don't want to have to re-purchase all my components every time I upgrade.

    Is there a solution to this that I have missed?  If not, is anyone looking at this issue in Microsoft?  For me, it is a major problem.  People used to talk about DLL Hell; but for me MSVCRT Hell is as bad or worse.  Please tell me I've missed something...  Thanks.


    Simon

    Tuesday, October 29, 2013 9:02 PM


All replies


  • Is there a solution to this that I have missed?  If not, is anyone looking at this issue in Microsoft?  For me, it is a major problem.  People used to talk about DLL Hell; but for me MSVCRT Hell is as bad or worse.  Please tell me I've missed something...  Thanks.



    This is DLL Hell that you are talking about, and it's been around for decades. There have been various attempts over the years to mitigate DLL Hell, but there has never been a fully successful solution, IMHO.

    The sad fact is that it is possible for a third-party vendor to create a DLL which doesn't require the client to use a specific version of the C runtime, but very frequently vendors don't devote the time and energy to make their DLL compiler-independent. The only safe solution is to ensure your code and all the third-party code are using the same version of MSVCRTxxx.DLL. If you mix runtime DLLs, you are on your own. In our organization, for example, we build our DLLs as COM servers, and these are statically linked. With this approach, we have had no problem mixing such DLLs with client code built with different compiler versions. But since you have no control over some of your components, my recommendation is to stick with Visual Studio 2010.
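
    As a rough illustration of the kind of compiler-independent interface described above (a sketch only; the Widget names are invented for this example, not taken from any real component): only plain C types cross the boundary, and every resource the DLL hands out is created and destroyed by the DLL's own CRT:

        // widget_api.h -- hypothetical sketch of a CRT-agnostic DLL interface.
        #ifdef __cplusplus
        extern "C" {
        #endif

        typedef struct Widget Widget;   /* opaque handle; layout never exposed */

        /* The DLL allocates the Widget with its own CRT... */
        __declspec(dllexport) Widget* Widget_Create(void);

        /* ...only plain C types cross the boundary... */
        __declspec(dllexport) int Widget_Process(Widget* w, const char* input);

        /* ...and the DLL frees it with the same CRT that allocated it. */
        __declspec(dllexport) void Widget_Destroy(Widget* w);

        #ifdef __cplusplus
        }
        #endif

    Because the client never calls malloc/free, fopen/fclose, or C++ constructors on anything that originated inside the DLL, it makes no difference which CRT the client was built against.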


    Tuesday, October 29, 2013 9:35 PM
  • Thanks Brian. But if this problem is so well-known, why isn't Microsoft doing something about it? At the very least, the obvious thing to do would be to stop changing the runtime library every time they update the compiler. Visual C++ 2012 and Visual C++ 2013 should both have used MSVCRT100.dll. If they wanted to add new runtime code, they could have created new add-on runtime libraries. Or alternatively, they could have made later versions of the runtime library backward compatible with earlier versions. There are so many things they could have done.

    I used Visual C++ 6 for years. If memory serves, that came out in 1997. I skipped all the intermediate versions and only finally upgraded to Visual C++ 2010 when that came out. And a big part of the reason it took that long for me to upgrade was this runtime library compatibility problem. Does Microsoft really want me, and people like me, to wait another 13 years before we next upgrade? It just seems ludicrous that component providers have to make available long lists of different versions of their products, built against different MS runtime libraries, and you have to pick the specific one you need, or wait until that one is available (if ever). There has to be a better way.

    Any new features that MS provide in their new versions are of no interest to me if they can't or won't solve this incompatibility problem.

     


    Simon

    Wednesday, October 30, 2013 11:56 AM
  • It just seems ludicrous that component providers have to make available long lists of different versions of their products, built against different MS runtime libraries, and you have to pick the specific one you need, or wait until that one is available (if ever).  There has to be a better way.

    For this reason I link my code to the static libraries. The code size is larger, but at least storage is no problem any more.

    Best regards

    Bordon

    Note: Posted code pieces may not have a good programming style and may not be perfect. It is also possible that they do not work in all situations. Code pieces are only intended to explain something particular.

    Wednesday, October 30, 2013 12:06 PM
  • Thanks Brian.  But if this problem is so well-known, why isn't Microsoft doing something about it? 

    Trust me, they (and others) have studied this thoroughly. The issues actually run much deeper than you realize. WinSxS, for example, was a technology that was meant to address DLL Hell, but when you have been on these forums for as long as I have, you realize that it simply replaces one set of problems with another. Do a Google/Bing search for a full discussion and understanding of this subject.

    I hear your frustration, but bear in mind there are strategies to mitigate this issue. Bordon has suggested one. I've suggested another (using COM). And keep up the pressure on the third-party vendors... they contribute to this problem as well.

    Wednesday, October 30, 2013 3:32 PM
  • Thanks Brian and Bordon. Although I agree that the MSVCRT100.dll problem is an example of the more general DLL Hell problem, I suppose my point is that it deserves special status because it is much harder for Visual C++ users to solve than other DLL Hell problems, and it is also the only one that is pretty much unavoidable if you use almost any third-party components. I know MS provide other DLLs, but none of these cause me any problems at all. It's just MSVCRT100.dll. And that one seems to affect everything.

    If I'm right and MSVCRT is in a special category all of its own, it still seems to me that Microsoft could very easily solve the problem by simply never updating it (there may be other solutions too, but that one would work).  Or if they absolutely must, couldn't they do it once every 10 years, rather than every new release of their compiler?

    The only solution mentioned so far that really would be practicable for me (I can't get third-party component providers to suddenly change what they do and what they offer) would be to link to the MSVCRT as a static lib. But would that solve the problem? As I mentioned in my earlier post, I have linked to a DLL which was built using static linking to the MSVCRT, and that did not work. I got bugs which I was only able to fix by getting hold of the source code of the component in question and rebuilding it myself to dynamically link to MSVCRT100.DLL. That particular solution is not generally available to me, of course. Very few component publishers would let me have the source code of their component. But anyway - granted that an app statically linking is a little different to a DLL statically linking - are you quite sure that if I statically linked my app to (say) MSVCRT130.DLL, and all my components dynamically link to MSVCRT100.DLL, I could be 100% guaranteed that this would work and not introduce bugs? I don't have a good enough understanding of what the problem is with the incompatibilities to be sure about this, but based on my previous experience I would have assumed that that wasn't safe either. Am I wrong?


    Simon

    Monday, November 4, 2013 12:21 PM
  • Hi Simon,

    Having been in the "support business" at MS for more C++ releases than I can shake a stick at, I feel your pain. I've worked hundreds of support issues related to components and applications built with mismatched CRTs.

    Simply put, the C++ team does not have the option of not shipping updated runtimes. There are security issues, bug fixes, and new C++ standards that invariably introduce breaking changes to the CRT components from release to release. It's basically a damned if we do, damned if we don't scenario.

    Static linkage means the library code is directly included in your DLL or EXE; that is, you no longer have a dependency on the DLL in question. To clarify, you cannot statically link to MSVCRT130.DLL. Rather, you are linking the code from the corresponding .LIB file into your own binary, effectively removing any dependency your binary has on the corresponding DLL.
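
    To make that concrete, here is a trivial example (the file name is invented for illustration); the same source builds either way, and the /MT versus /MD compiler switch is what selects static versus DLL linkage of the CRT:

        // crt_linkage_demo.cpp -- the same source, two CRT linkage models.
        //
        //   cl /MD crt_linkage_demo.cpp
        //       Dynamic CRT: malloc/free are imported from the CRT DLL
        //       (msvcr120.dll for VS 2013), so that DLL must be present
        //       at run time.
        //
        //   cl /MT crt_linkage_demo.cpp
        //       Static CRT: the CRT code from libcmt.lib is copied into
        //       the EXE itself, which then has no CRT DLL dependency.
        #include <cstdlib>

        int main() {
            void* p = std::malloc(64);  // resolves into whichever CRT was linked
            std::free(p);
            return 0;
        }

    Each module in a process makes this choice independently, which is exactly how two modules can end up running two different CRTs side by side.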

    Sincerely, 


    Ed Dore

    Tuesday, November 5, 2013 9:54 PM
  • OK thanks Ed.  I appreciate that talking of statically linking to MSVCRT130.DLL is nonsense.  It was wrongly put, but I do understand how static linking works.

    As well as using the MSVCRT, my app also links to MFC in a DLL. Presumably I will need to statically link to that too. But if I do that - i.e. if I statically link to the VC++ 2013 MFC and MSVCRT libraries - are you confident that I can safely link to DLLs that all, themselves, link to MSVCRT100.DLL? And that this will work without introducing hard-to-find bugs? Sorry to press you on this, but it's a major commitment for me to upgrade to 2013, and it would be a major problem for me if I later had to roll back to 2010 because I just can't get 2013 to work.

    The reason I query it is just that I did use a DLL that statically linked to an older version of the MSVCRT, and that (as explained earlier) did not work. I never found out why. I just found that I could fix it by rebuilding that component using MSVCRT100.DLL. Can you think why that might have been necessary? Are there any possible gotchas that you can think of that I should look out for? E.g. MSVCRT structs that may have changed, pointers to which may be passed as parameters to, or returned by, component DLLs? Or perhaps some subtle TLS issue that could cause problems? Or anything else? (Incidentally, I don't believe any of my component DLLs use MFC.) Any tips very welcome.


    Simon

    Wednesday, November 6, 2013 1:21 PM
  • Personally, I'd be very hesitant to migrate your project to VS 2013 when you are dependent on third-party DLLs that require an earlier runtime. Just natural caution on my part.

    What is the motivation for upgrading to Visual Studio 2013 from Visual Studio 2010? Is there some compelling reason for you to upgrade? More specifically, what does Visual Studio 2013 offer that can't be accomplished with Visual Studio 2010?

    Wednesday, November 6, 2013 4:03 PM
  • Hi Simon,

    Without debugging the issue with your usage of the older MSVCRT, it's hard to say. Most issues I've seen around mismatched CRTs are memory/heap corruption issues - when an allocation is made with one CRT and then destroyed by another, for example.

    Without knowing more about your third-party library, and what sort of parameters are typically passed via its APIs, I'd be hard pressed to "guarantee" it would work.

    I'll second Brian's opinion. If your third-party library vendor has not yet released a VS 2013 variant of the libraries you depend upon, it's probably best to wait until they update their bits.

    Sincerely,


    Ed Dore

    Thursday, November 7, 2013 5:45 AM
  • "When a CRT allocation is made with one CRT, and then destroyed by another for example."

    That shouldn't be a problem with the VS2010-VS2013 'release' runtimes. Starting with VS2010, the CRT no longer implements its own heap manager; it uses the Win32 process heap. malloc calls HeapAlloc and free calls HeapFree. It should be possible to mix malloc/free from the 2010 and 2013 CRTs without issues. The same applies to the default new/delete operators.
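
    In other words, in those release CRTs malloc and free behave (roughly) like thin wrappers over the one heap the whole process shares. A simplified sketch of the idea, not the actual CRT source:

        // Simplified sketch of what the VS2010+ release CRT does internally
        // (illustrative only, not the real CRT source). Both functions
        // operate on the single Win32 process heap, which is why a block
        // allocated by one module's CRT can be freed by another's.
        #include <windows.h>

        void* sketch_malloc(size_t size) {
            return HeapAlloc(GetProcessHeap(), 0, size);
        }

        void sketch_free(void* p) {
            if (p != NULL)
                HeapFree(GetProcessHeap(), 0, p);
        }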

    With the 'debug' version of the CRT, things are a bit more complicated. malloc still uses HeapAlloc but adds a header to the allocated block. If the layout of that header varies between CRT versions (I don't know whether it does), then it's not possible to mix debug malloc/free calls. And of course, it's impossible to mix debug and release malloc/free even if the CRT version is the same.
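
    To illustrate why that header matters, here is a rough model of the debug heap's per-allocation bookkeeping (field names and sizes are illustrative only, loosely modeled on the documented _CrtMemBlockHeader; the real definition lives in the CRT sources):

        // Sketch of the debug CRT's per-block header (illustrative only).
        // A debug free() walks back from the user pointer to this header,
        // so the allocating and freeing CRTs must agree on its layout.
        #include <cstddef>

        struct DebugBlockHeaderSketch {
            DebugBlockHeaderSketch* next;     // debug heap's linked list of blocks
            DebugBlockHeaderSketch* prev;
            const char*             file;     // __FILE__ of the allocation site
            int                     line;     // __LINE__ of the allocation site
            size_t                  size;     // requested block size
            long                    request;  // allocation request number
            unsigned char           gap[4];   // "no man's land" guard bytes
        };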

    Thursday, November 7, 2013 9:39 AM
  • Thank you to everyone who has responded to this. I am pretty much persuaded by those who advise that I should stick with VC++ 2010. Why do I need to upgrade? I don't. At least not in the short term. I can carry on as I am for a while. But if the past is anything to go by, eventually it will become impossible to be so far behind the leading edge, and I will eventually need to upgrade. At which point, I will have to confront these issues again. I raised these questions in the hope that I could avoid the problem and adopt a strategy which would allow me to keep up to date on a rolling basis. But it sounds like I probably can't do that without risking destabilizing my product.

    When the day comes that I eventually upgrade, I will try the strategy of building using static libraries and see how I get on with that.  But for now, I will stick with 2010.


    Simon

    Thursday, November 7, 2013 12:09 PM
  • Hi Pavel,

    The fight hasn't been abandoned. On the contrary, it's definitely on their radar. They just aren't in a position to announce anything yet.

    As for the complexity of ensuring backward compatibility, Jim Radigan's Compiler++ talk at the Going Native 2013 conference in September may help explain just what an undertaking that would be. While his talk was about the compiler, it's pretty easy to extrapolate a similar effort for the C runtime.

    Sincerely,


    Ed Dore

    Friday, November 8, 2013 10:51 PM
  • Hi Simon,

    Mixing modules built with different versions of the C runtime is tricky, but not impossible. You can mix modules compiled with different versions of the CRT, provided the modules don't share anything across module boundaries that is library-specific.

    In addition to memory allocation/deallocation, another example would be a FILE* created by one version of the CRT and used in a module linked with a different version. Probably the largest problem is attempting to share C++ objects across module boundaries. With major changes to the C++ Standard Library in every release, trying to share C++ objects across module boundaries will likely result in some nasty bugs. For example, trying to throw a C++ exception across a module boundary where the throwing and catching modules are not using the same CRT will simply not work.

    Hence the recommendations to use C exports or COM on module boundaries, or to ensure you always use the same CRT for all modules that share state.
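
    As one concrete illustration of the C-exports approach (the function names here are invented for the example): exceptions are caught inside the module and translated to plain error codes, and the caller supplies the output buffer, so nothing CRT-specific ever crosses the boundary:

        // Hypothetical sketch of a C-style export that keeps everything
        // CRT-specific (C++ exceptions, std::string, ...) inside the module.
        #include <cstring>
        #include <stdexcept>
        #include <string>

        static std::string format_name(const char* raw) {  // internal C++ helper
            if (raw == nullptr)
                throw std::invalid_argument("raw must not be null");
            return std::string("[") + raw + "]";
        }

        // Only C types cross the boundary, and the caller owns the buffer,
        // so no memory ownership changes hands either.
        extern "C" __declspec(dllexport)
        int FormatName(const char* raw, char* out, size_t out_size) {
            try {
                std::string result = format_name(raw);
                if (result.size() + 1 > out_size)
                    return -2;                      // buffer too small
                std::memcpy(out, result.c_str(), result.size() + 1);
                return 0;                           // success
            } catch (...) {
                return -1;  // never let an exception cross the module boundary
            }
        }

    A client built with any CRT can call FormatName safely, because the only things crossing the boundary are pointers, a size, and an int.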

    Sincerely,


    Ed Dore

    Friday, November 8, 2013 11:02 PM