Visual Studio 2005 runs out of memory when trying to use SSIS package

    Question

  • Visual Studio runs out of memory when trying to use an SSIS package. I am trying to create and run an SSIS package that validates and imports some large XML files (>200 MB). Validation fails because Visual Studio cannot open large files without running out of memory.

    The SSIS package throws this error when I run the package, at the validation task:

    Error: 0xC002F304 at Validate bio_fixed, XML Task: An error occurred with the following error message: "Exception of type 'System.OutOfMemoryException' was thrown.".

    How do I increase the amount of RAM that Visual Studio can use? I have plenty of RAM on my workstation (>3 GB), but VS chokes on files of maybe around 100 MB.

    Thanks,

    Forrest


    Wednesday, July 12, 2006 5:39 PM

Answers

  • This in some ways sounds similar to a KB article I wrote last year: http://support.microsoft.com/default.aspx/kb/952110

    The design surface for SSIS - BIDS (DevEnv.exe) is always a 32-bit application, even on 64-bit machines (it runs in WOW64). There is no 64-bit build of BIDS or of the Visual Studio IDEs in general. We've asked before, but unfortunately it's not coming soon.

    A 32-bit process has only 2 GB of user-addressable memory (virtual address space). In general, Microsoft XML uses a special heap for managing its memory. MDAC and the connectivity components use their own heaps. The .NET Framework uses its own heap and memory manager (with garbage collection) for managed code running in DevEnv. Once a heap grows, it typically doesn't shrink; the empty space is left reserved for later re-use. Because of the number of specialized heaps used within the process, an application that seems to take little RAM (private bytes in Task Manager's Processes tab) may nevertheless have used up all of its address space (virtual bytes) and show out-of-memory errors.

    Unfortunately there is not much you can do in SSIS at design time to get around this, except keep the packages reasonably small (a few MB per .dtsx). SSIS depends on Visual Studio, which has memory demands coming from many different teams, but we hope future editions will give us more flexibility.

    MSXML in general (the APIs behind the XML Task) is notoriously memory hungry, so I don't find it uncommon that large XML files cause this trouble. We have several reports of that from customers in CSS (support), and there isn't much we can offer as a workaround, except perhaps coding a custom solution. System.Xml is available for referencing within an SSIS Script Task, if you want to try rolling your own solution to work around the issue. http://msdn.microsoft.com/en-us/library/system.xml.xmlreader.aspx
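    The gist of the Script Task suggestion above is to stream the document element by element rather than build the whole DOM in memory. A minimal sketch of that streaming pattern in Python (not SSIS itself; the file path and the "record" element tag are hypothetical placeholders for the actual schema):

```python
# Scan a large XML file without loading it all into memory.
# iterparse yields elements as the parser completes them, so peak
# memory stays roughly bounded if completed children are discarded.
import xml.etree.ElementTree as ET

def count_records(path, tag="record"):
    count = 0
    context = ET.iterparse(path, events=("start", "end"))
    _, root = next(context)  # grab the root element from its "start" event
    for event, elem in context:
        if event == "end" and elem.tag == tag:
            count += 1
            root.clear()  # drop completed children so memory stays bounded
    return count
```

    The same idea carries over to `System.Xml.XmlReader` in a Script Task: read forward, act on each element, and never hold the full tree.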

    Do you see private bytes or virtual bytes approach the 2 GB limit when the error occurs? These memory counters are under the Process counter object in Perfmon, and are specific to each process, such as DevEnv.exe. You can add them in Perfmon (Start > Run > perfmon.exe) to watch the graph manually, or start a collector running in the background while you repro the situation, then open the .blg file in Perfmon after the fact.

    The SSIS product group is aware of the memory limitations of BIDS, and of the XML Task in general, and I will try to add more feedback about processing large XML files. You can submit the steps to repro at Connect.microsoft.com to file a tracking item with the product group, or vote on an existing item with the same problem if one already exists.

    Thanks, Jason H
    Monday, January 12, 2009 5:29 AM

All replies

  • I have the same problem. I can't even save my package after making changes.

    I have 1 GB of RAM and I don't have other apps running. My package size is around 15 MB, containing around 30 sequence containers with three control flow tasks each. I'm not even executing the package, just designing it, and I can't save it. Why does VS 2005 do this?

    Urgent replies would be much appreciated because the project i'm working on is almost due!

    Thursday, August 17, 2006 11:03 PM
  • The problem exists in my package as well.

    My scenario is a little different. I'm just working in the designer and it doesn't let me save the package after making my changes.

    I've got around 30 sequence containers with 3 control flow tasks each. My package size is around 15MB.

    My hardware has 1GB ram.

    How do I get around this?

    Thursday, August 17, 2006 11:06 PM
  • Has anyone found any sort of resolution to this problem? Enzoe, the workstation I use is similar to yours, with 1 GB of RAM, and the packages I am working with are between 8 and 15 MB. I have tried everything I can possibly think of to solve this issue, including installing a hotfix and the recently released Service Pack 1 beta for Visual Studio 2005.


    Wednesday, September 27, 2006 6:15 PM
  • So far no one has given any reason for it.

    Anyway, I have since split the package into several logical packages and the problem's gone.

    The only disadvantage is that the number of configuration files increased, and when you've got packages that share config files, you can't use the 'Build' function. You've got to copy the .dtsx and .dtsConfig files manually to the bin\Deployment folder.

     

    Thursday, October 05, 2006 2:06 AM
  • Experiencing the same; splitting the package seems to be the only possible workaround.
    Tuesday, April 22, 2008 7:43 AM
  • I am having a similar problem. I am trying to import a relatively big table from a PostgreSQL database to SQL Server 2008. After mapping the columns and saving the ADO.NET data source, I get this error message:

    TITLE: Package Validation Error
    ------------------------------

    Package Validation Error

    ------------------------------
    ADDITIONAL INFORMATION:

    Error at Quotes Import [ADO NET Source [24]]: An error occurred executing the provided SQL command: "SELECT * FROM quote_table". ERROR [HY000] Out of memory while reading tuples.;
    No query has been executed with that handle

    Error at Quotes Import [SSIS.Pipeline]: "component "ADO NET Source" (24)" failed validation and returned validation status "VS_ISBROKEN".

    Error at Quotes Import [SSIS.Pipeline]: One or more component failed validation.

    Error at Quotes Import: There were errors during task validation.

     (Microsoft.DataTransformationServices.VsIntegration)

    ------------------------------
    BUTTONS:

    OK
    ------------------------------

    The table contains ~1.6 million tuples, and I would also need to move far bigger amounts of data (~40 million tuples). Any ideas?
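    A common cause of "Out of memory while reading tuples" is that the client driver buffers the entire result set before handing over a single row, so the general remedy is to fetch in fixed-size batches. A small DB-API sketch of that batching pattern (shown with sqlite3 purely for portability; the table name and batch size are illustrative, and a PostgreSQL driver would need its own server-side-cursor or declare/fetch setting to get the same effect):

```python
# Fetch a large result set in fixed-size batches instead of all at once,
# so client memory is proportional to the batch size, not the table size.
import sqlite3

def iter_rows(conn, query, batch_size=1000):
    cur = conn.cursor()
    cur.execute(query)
    while True:
        rows = cur.fetchmany(batch_size)
        if not rows:
            break
        for row in rows:
            yield row

# Throwaway in-memory table standing in for quote_table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE quote_table (id INTEGER)")
conn.executemany("INSERT INTO quote_table VALUES (?)",
                 [(i,) for i in range(2500)])
total = sum(1 for _ in iter_rows(conn, "SELECT * FROM quote_table"))
```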
    Thursday, October 30, 2008 5:04 PM
  • I also found that even though I can't save the package with its original name, I was able to perform a "Save As . . ." and save a copy.  After rebooting my PC, I was then able to copy it over the unsaved original, so as to avoid losing my work.

    It doesn't solve the problem, but at least I don't lose my work between saves anymore.
    Friday, January 09, 2009 7:26 PM