Hello all. I'm running SQL 2008/Enterprise/64-bit on a Windows 2008 Ent Server (16 GB memory). We are in the process of moving from SQL 2005 to SQL 2008, and while running several of our SSIS packages we notice the memory on the SQL 2008 server spikes quite dramatically during package execution, to the point that it runs out of memory. If we throw more memory on the server, SSIS just chews it up. Even if I stop the packages before the memory is exhausted, the memory is **NOT** released. These same packages have run flawlessly for 5+ years on SQL 2005. The packages simply ETL 9 tables from Server A (SQL 2005) to Server B (SQL 2008). 6 of the tables have 10+ million records. Running these packages moving the same tables from SQL 2005 to another SQL 2005 server works fine. The only way to reclaim the memory is to restart SQL Server.
Has anyone seen, or has Microsoft confirmed, a memory leak problem in SSIS 2008? I'm interested in any feedback, or resolution that anyone can offer.
I am facing the same problem: whenever I run packages to transfer 100,000 records weekly, SQL Server occupies all the memory. I have Windows 2008 R2 and SQL Server 2008 64-bit with 32 GB memory.
I have read about it, and Microsoft suggests defining the min and max memory allocation for SQL Server. I did not try that.
To reclaim the occupied memory, I restart the SQL Server service.
I did not find any other solution for it; the only way, as you said, is to restart the service.
Mustafah, thanks for the reply. I opened a case with Microsoft yesterday, and they suggested the fix in the article below. Basically, it's Cumulative Update 6 for SP1. I installed this patch, but my problem remains. Also, I set the min/max memory settings - no success. I'm resuming my conference call with Microsoft, since I'm unable to move forward at this point.
The packages are simple Data Flow packages that export data from a SQL table and import the data into a similarly structured table on the destination database. The source database is SQL 2005, and the destination is SQL 2008. I just ran the ETL package from the SQL 2005 server (instead of running it from SQL 2008), and memory still spiked.
If your problem is resolved by restarting the SQL Server Service, then IT HAS NO RELATION TO SSIS.
Let me repeat that - the SQL Server Service is NOT SSIS. If the SQL Service is consuming "too much" memory, the fault does not lie with Integration Services.
However, there is very likely a VERY SIMPLE explanation - and it has nothing to do with a memory leak. Heck, I'm not even a DBA, yet I've answered this question at least twice this week. (Not a rant against you - except for possibly not even searching here for an answer :))
The issue you're facing is that you haven't properly defined a memory limit for SQL Server. SQL Server does NOT release memory once it has acquired it from the system - EVER. That is a design choice that has ALWAYS existed in SQL Server (for as long as I can recall - 10 years or so).

The difference between your 2005 environment and your 2008 environment is simply the memory configuration of the server. In 2005, either by default or by changes to the configuration, there was a limit on the service's memory consumption that stopped SQL Server from allocating (what you think is) "too much" memory. All you need to do is go to SSMS, right-click on the server, select Properties, switch to the Memory page, and set the Maximum value to a "reasonable" limit. There are plenty of resources to help you determine what that limit is.
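The same cap can be set with T-SQL instead of SSMS. A sketch - the 12288 MB value is just an illustration for a 16 GB box; pick a limit appropriate for your own server:

```sql
-- Make the "max server memory" option visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Cap SQL Server at (for example) 12 GB, leaving room
-- for the OS and for DTExec/SSIS buffers
EXEC sp_configure 'max server memory (MB)', 12288;
RECONFIGURE;
```

The change takes effect immediately, no service restart required.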
You do NOT have a memory leak.
- Proposed as answer by Alain de la Kethulle Wednesday, April 21, 2010 12:52 PM
I appreciate your spirited reply. However, I must inform you that the Max memory value is set, just like it's set on our SQL 2005 server. Your reply states that the problem has "no relation to SSIS". Funny, then, that the problem ONLY occurs when running SSIS. We run other heavy I/O and memory-intensive processes, yet ***these processes DO NOT*** tax the server memory. Only SSIS does. And yes, these same processes, when completed on our SQL 2005 server, release the memory they consumed. We've run these same ETL processes for years on SQL 2005 with no problem.
- Edited by Roshall Tuesday, March 23, 2010 5:13 PM
I apologize if my prior response was "spirited" (I appreciate you not responding with guns blazing). I apologize in advance if this one also appears "spirited" - I don't intend to be mean.
Run your package and please watch your Task Manager as the package is running. Watch the memory consumption of "DTExec" - that's the SSIS process - and "sqlservr" - that's the SQL Server service. As soon as your package is done or killed, DTExec goes away, releasing all its memory. If you're seeing "sqlservr" hold a ton of memory - that's SQL Server. Agreed?
I only say that the problem has no relation to SSIS because any other process that exercised your SQL Server would generate the same behaviour. Perform a cross-join on two of your largest tables to try that out. Heavy I/O is not relevant, nor are other memory intensive processes. You're looking at SQL Server's memory consumption, correct?
You say you do have your max memory set on SQL Server. What's it set to? What memory does SQL Server end up consuming at the "end" of your package run? Does it exceed your max setting?
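If Task Manager is ambiguous, you can ask SQL Server directly how much memory its process holds and what the configured cap is. A sketch using DMVs available on SQL 2008:

```sql
-- Memory held by the SQL Server process itself, in MB
SELECT physical_memory_in_use_kb / 1024 AS sql_memory_mb
FROM sys.dm_os_process_memory;

-- The currently active "max server memory" cap
SELECT value_in_use AS max_server_memory_mb
FROM sys.configurations
WHERE name = 'max server memory (MB)';
```

Comparing these two numbers during and after a package run shows whether the service is actually exceeding its cap or simply filling it, as designed.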
Thanks for your reply. Although I did not start this thread, I am facing the same memory problem. Here are my findings from the tests you suggested.
I have SQL Server 2008 64-bit on a Windows 2008 R2 server with 32 GB RAM.
Minimum Memory (MB) = 0
Maximum Memory (MB) = 2147483647
I have a package in which I am using a Lookup transformation and transfer 100,000 records every week. Similarly, I have another package in which I am using the SCD transformation to maintain history and transfer 150,000 records each week. I run these packages on Monday and Tuesday nights respectively.
No other process runs except these packages. When I just run normal queries or Reporting Services, it occupies 15 GB or at most 18 GB, depending on the users connected to the server.
But when I run these packages on Monday and Tuesday nights, it occupies almost all the memory (31 GB) and does not release it after package completion.
I am using a Data Flow Task, a Lookup transformation, SCD, and some data conversion tasks in the packages. The packages are working fine and transferring the data as I want. In this case, what should I check or add in the package to release the memory?
SQL Server caches as much data as possible in its memory. Therefore the more memory you have, the more responsive your server will be when accessing the data, be it with Reporting Services (or any other reporting tool) or whatever other tool.
If your server runs other software, then I would advise limiting SQL Server's memory allocation by leaving enough memory for the OS and the remaining software to run smoothly. Work with ratios (e.g. 80% SQL Server, 20% remaining (OS, SSIS, others...)). It will be easier.
How much is needed is up to you to determine; you're the only one who knows your server configuration and its pressure.
- Proposed as answer by Alain de la Kethulle Wednesday, April 21, 2010 12:52 PM
You mentioned that there are sites that can help you determine how much memory to allocate to SQL Server, do you have any suggested sites? We are running SQL 2008 R2 on a Windows 2008 R2 Server 64 bit and the only other programs on this machine are Visual Studio 2008, Toad for Oracle 11g and Sharepoint 2010. We have 4 GB memory. I'm thinking 80% to SQL?
I am currently testing out your Kimball Method SCD task, but even when SSIS isn't processing anything, our memory consumption is around 85%. It gets up to 95% when processing through this task. I'm pushing 1M rows right now. I'm not worried about your SCD task causing any issues (in fact it has sped up our processing considerably, thanks for that...), it's just that without properly setting up our memory limits, we're going to have issues.
Any additional pointers are greatly appreciated!
I'm not an expert - or even a journeyman - on setting up a server optimally. I'm afraid I can't help you much there, except for rules of thumb. One of those would be: why are VS2008 and Toad on that server? Those are typically client tools used for development. As such, they don't take any resources unless someone is logged on with RDP and using them - which isn't typically what a production server is intended for. Removing them won't "help" if they're not used, because they'd only consume resources if someone was logged on.
As for sharing between SQL and SharePoint - I don't know how memory intensive SharePoint is. You'd have to conduct some load testing to figure out how little memory you could get away with using SharePoint. Or at what point allocating memory to SharePoint (which means not allocating it to SQL) has little effect on responsiveness.
Glad you're trying out the (newly renamed) Dimension Merge SCD. It will compete heavily against SQL Server when it comes to memory... and lose.
Keep in mind that allocating memory to SQL is done to allow SQL to "cache" important things. And SQL will take as much advantage as it can of that allocation, consuming it all and it won't release it, even if it isn't currently using it. An option would be to reduce that allocation while in your ETL window with stored proc calls, then elevate it when the ETL is done. Or buy more RAM or another server to scale out your processing... :)
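The "reduce the allocation during your ETL window" idea above can be scripted, for example as steps in a SQL Agent job that wrap the package execution. A hypothetical sketch with placeholder values - the 8192/24576 MB figures are illustrations only, not recommendations:

```sql
-- Step 1, before the ETL window: shrink SQL Server's cap
-- so the SSIS (DTExec) buffers have room to breathe
EXEC sp_configure 'max server memory (MB)', 8192;
RECONFIGURE;

-- Step 2: run the SSIS packages here

-- Step 3, after the ETL window: restore the normal cap
EXEC sp_configure 'max server memory (MB)', 24576;
RECONFIGURE;
```

Note that lowering "max server memory" causes SQL Server to trim its caches down to the new cap, so expect slower query response until the cache warms back up after the window.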
I'm having the same issue. I have a server that ONLY runs SSIS. It is not releasing (or for that matter reusing) the memory that is allocated even after successful completion.
Did you get a resolution on this?
Any assistance you could provide would be appreciated.
We are also facing the same issue. We have a series of packages being triggered in sequence from an application. The memory consumed by each package does not get released after the package execution is over. Hence after some time the server runs out of memory.
We are using SQL Server 2008 R2 SP1 and the server has 16 GB RAM.
There seems to be some issue with memory de-allocation in SQL Server 2008. Any help will be highly appreciated.
I checked other threads on the same topic and, as you have mentioned, it is not a memory leak. DTExec is not holding any memory; it is being consumed by SQL Server. Hence the best option is to limit the max server memory, or to restart SQL Server after the package has been executed.
I am having the same problem, even though MaxServerMemory has been limited and there is 10 GB left for the other applications, the OS, etc.
Once I restart the server everything is fine until the first data load has been run using SSIS; thereafter performance deteriorates until the server is restarted again.
Has anyone managed to resolve this issue?