malloc fails because the virtual size is never freed under Windows XP with Visual Studio 2005 (debug and release)

    Question

  • Hello,

I have a problem with malloc/new in Visual Studio 2005 on Windows XP.

If I allocate a lot of small objects and then free them, physical memory is allocated and then fully deallocated as expected - no memory leak. But the virtual size displayed in Process Explorer or perfmon is never freed. So if I then allocate a large block again, I can hit the 2 GB process boundary, and malloc fails even though there is enough free physical memory.

What is the meaning of this virtual size, and how can I influence it?

Must I call some function to free this virtual size - one I don't know about - or is it a Windows problem?
What must I do in a program to get rid of this virtual size that I never allocated myself?

    Thanks for your help.

    Wednesday, November 19, 2008 9:33 AM

Answers

  • Hello,

after finding the magic allocation size of 0x7EFF0 at which the problem disappears, I googled and found the following
link: HeapAlloc heap fragmentation

    The result is "this is by design" :-)

    "The issue here is that the block over 512k are direct calls to VirtualAlloc, and everything else smaller than this are allocated out of the heap segment. The bad news is that the segments are never released ..."

    It's not a bug, it's a feature.
    Friday, November 21, 2008 11:51 AM

All replies

  • You can't debug your program with these utilities; the Windows memory manager is far too sophisticated.  Minimize your app's main window for a possible quick fix.
    Hans Passant.
    Wednesday, November 19, 2008 10:19 AM
    Moderator
  • Hello,

    why should I minimize the app's main window?

    Why can't I debug my app with perfmon and/or sysinternals process explorer?

    Process Explorer and perfmon show the same behavior under both Windows XP and Vista 64-bit:

    If I allocate a big chunk of data and free it, the physical memory and the virtual size grow and, on deallocation, shrink back to the initial amount. But if I allocate a lot of small objects and then free them, only the physical memory shrinks to the original amount; the virtual size does not.

    If I then allocate a big chunk of data, I get an out-of-memory error because malloc fails - even though a lot of physical memory is free.

    It is a problem.

    Wednesday, November 19, 2008 10:52 AM
  • Are you sure there is no memory leak?

    Remember also that DLLs you use can leak (like ODBC drivers etc.).
    Wednesday, November 19, 2008 11:16 AM
  • Hello,

    There is definitely no memory leak. That was my first thought too, and I searched our big application for one.
    Then I decided to reduce the problem and wrote an MFC wizard-generated sample Doc/View app with a few lines of code.
    No DLLs, no ODBC, just a plain simple app.

    Here is part of the sample code:

    in the wizard-generated app's doc class (member):
    header file:
    TCHAR** pp;

    cpp file:
    const int n = 12500000;

    in the constructor:

    pp = new TCHAR*[n];
    for (int i = 0; i < n; i++)
    {
        pp[i] = new TCHAR[10+1];
    }

    in the destructor:
    for (int i = 0; i < n; i++)
    {
        delete [] pp[i];
    }
    delete [] pp;

    If you look at virtual size, virtual bytes (max), and private bytes with perfmon, you can see that the virtual memory is never released (only a very small part of it).

    Wednesday, November 19, 2008 11:37 AM
  • So, there's no code that executes between the constructor and destructor code you posted that allocates memory as well?  The heap manager can only call VirtualFree() if that's the case.  That's rarely a real problem, since released blocks will be coalesced and reused later.  Memory can still get fragmented, though, which may cause very large allocations to fail.  Diagnose this stuff with HeapWalk().
    Hans Passant.
    Wednesday, November 19, 2008 11:59 AM
    Moderator
  • Hello,
    as I wrote, I created a new MFC document/view application with the Application Wizard in VS 2005.

    There is code that executes between the constructor and destructor, but that's MFC code, not mine.
    I could call VirtualFree if I had called VirtualAlloc, but that is not the case.

    Nevertheless, whether memory gets fragmented or not, the virtual size should shrink when I release the allocated memory, just as it does when I allocate one large block and then free it.

    It seems to be a Windows problem.

    What would I see / what should I look for if I use HeapWalk? I have never used this function.
    Wednesday, November 19, 2008 12:13 PM
  • VirtualFree() is used by the Windows heap manager.  Considering how many millions of PCs use the Windows memory manager every day, your most productive approach would be to assume that this code was well debugged.  And, if you have a real problem rather than a worry, that the problem is located in your code.  Be sure to at least use <crtdbg.h> to double-check your assumptions.
    Hans Passant.
    Wednesday, November 19, 2008 12:28 PM
    Moderator
  • Hello,

    o.k., let's assume the Windows code is well debugged. I still have a real problem, and the problem is not in my code, as you can see above - so explain to me where the bug is.

    I can send you the whole application and you can see for yourself that there must be a problem somewhere.

    Wednesday, November 19, 2008 12:44 PM
  • Hello,

    I found a flag :-) that affects the application's behavior

    _set_sbh_threshold(128);

    If I call this function from App::InitInstance, the memory consumption (private bytes and virtual size) is what I would expect.
    Now I must check whether my application is slower than, or as fast as, without this call.

    Wednesday, November 19, 2008 1:54 PM
  • Hello,

    I tested this flag in our application, but it didn't help; it only worked in the test application.
    I played with the size parameter of the function, but that didn't help in our application either (there we allocate 64 KB blocks with malloc).
    So I went back to the test application and looked at malloc with different sizes.

    Can anyone explain to me why the virtual size of a malloc of 64*1024 bytes isn't freed by Windows after a call to free (only the physical memory is freed),
    while the virtual size of a malloc of 512*1024 bytes is freed by Windows after the call to free?

    Thanks.
    Thursday, November 20, 2008 12:52 PM
  • The Windows heap manager maintains several sub-heaps, each of them tuned to the size of the allocation request.  It does this to avoid problems with memory fragmentation.  It's also a big reason why there's no one-to-one mapping between application heap usage and virtual memory size.
    Hans Passant.
    Thursday, November 20, 2008 12:59 PM
    Moderator
  • I know; I also know that there is a Low Fragmentation Heap (LFH) that holds small blocks.

    The problem is that the virtual memory size is never freed for small allocated blocks that don't fit into the LFH and are smaller than 512*1024 bytes.
    All blocks are freed from virtual memory if they fit into one of the buckets of the LFH

    Buckets   Granularity   Range
    1-32      8             1-256
    33-48     16            257-512
    49-64     32            513-1024
    65-80     64            1025-2048
    81-96     128           2049-4096
    97-112    256           4097-8192
    113-128   512           8193-16384

    or are greater than or equal to 512*1024 bytes. Blocks that fall into the range in between are never removed from virtual memory.

    You can look with perfmon at the following win32 console code:

    int _tmain(int argc, TCHAR* argv[], TCHAR* envp[])
    {
        ...
        double* p = 0;
        p = (double*)malloc(8*1024*sizeof(double));
        free(p);

        p = (double*)malloc(64*1024*sizeof(double));
        free(p);
        ...
    }

    If you step through the code with the debugger, you can watch the behaviour of private bytes and virtual size.
    If you add a loop around the malloc calls and store the pointers in an array, you can see it even better.

    In our real application we allocate 64 KB blocks to hold the data, and we have a problem because Windows reports an "Out of memory" error once the virtual size of the process reaches 2 GB. If we change the block size to 512 KB we don't get the error, because the virtual size then behaves as expected: when memory is freed, the virtual size shrinks as well.

    Thursday, November 20, 2008 1:26 PM
  • Hello,

    does anyone play Crysis Warhead? That program seems to have the same problem. After playing for a while I also got an "Out of memory" error - on my home computer with 4 GB of physical memory installed. When I looked at it with Process Explorer I saw that I had reached the 2 GB virtual size limit.

    Can anyone confirm this?
    Saturday, November 22, 2008 5:00 PM
    In my application I increased the page file. But when global memory was depleted, I found there was still ample free memory. Process Explorer confirmed that there is something called "private bytes" that can reach a maximum of around 500 MB per process on XP, while on Vista the private bytes of a process can go up to 1 GB (or maybe more), and hence the problem does not occur there. My application needs around 650 MB to do a task, and that failed on XP. I wonder, can I increase this maximum limit on XP? I already set /3GB in boot.ini and the page file to 4 GB (min) - 8 GB (max).

    Any ideas would be appreciated.


    AKS
    Thursday, September 16, 2010 10:59 AM
  • Anil Kumar Sharma

    It is better to start a new thread for your new question.

    Thursday, September 16, 2010 11:05 AM