Stored procedures & temp tables: huge memory leak?

  • Question

  • Hi all,

    I'm seeing (huge) memory issues when using temp tables in stored procedures. I monitored all my stored procedure calls (with the great Windows Performance Monitor tool), and only one of them uses temp tables, which is why I suspect the temp tables are the cause.

    So my issue is the following:

    I have to check the data consistency of a database on a daily basis. To do this, I've got a cmd batch that calls my stored procedures. When calling one of them, the amount of available RAM on my laptop (Windows 7, 64-bit) drops from around 50% free to less than 5%...

    Here's the T-SQL code I suspect is responsible for this memory consumption:

    	BEGIN TRY DROP TABLE #tmp1 END TRY
    	BEGIN CATCH END CATCH
    	
    	SELECT DISTINCT CAST(valeur as decimal(18,4)) as 'Valeur'
    	INTO #tmp1
    	FROM FT_sas
    	ORDER BY CAST(valeur as decimal(18,4)) ASC
    	
    	SELECT TOP(10) PERCENT CAST(valeur as decimal(18,4)) as 'Valeur' FROM #tmp1
    	WHERE valeur is not null
    	ORDER BY CAST(valeur as decimal(18,4)) ASC
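
    As an aside, the empty TRY/CATCH drop above can also be written as an explicit existence check; a minimal sketch, assuming SQL Server 2005 or later:

```sql
-- Drop #tmp1 only if it exists in this session; the tempdb.. prefix is
-- required when looking up a temporary table by name.
IF OBJECT_ID('tempdb..#tmp1') IS NOT NULL
    DROP TABLE #tmp1;
```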

    I'm using four temp tables like the one above in the same procedure (#tmp1 to #tmp4). The amount of data inserted into those temp tables varies widely and has no effect on the issue (RAM drops the same way regardless of the amount of data retrieved and inserted).

    At the end of the procedure, I delete the contents of the four tables, then drop them:

    	DELETE FROM #tmp1
    	DELETE FROM #tmp2
    	DELETE FROM #tmp3
    	DELETE FROM #tmp4
    	
    	DROP TABLE #tmp1
    	DROP TABLE #tmp2
    	DROP TABLE #tmp3
    	DROP TABLE #tmp4
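
    Note that the DELETE statements above are redundant: dropping a table discards all of its rows anyway. A minimal sketch of the equivalent cleanup:

```sql
-- DROP TABLE already discards every row, so a preceding DELETE only adds
-- transaction-log overhead; the cleanup can be reduced to the drops alone.
DROP TABLE #tmp1;
DROP TABLE #tmp2;
DROP TABLE #tmp3;
DROP TABLE #tmp4;
```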

    Any ideas are greatly appreciated :)

    Best regards from France,

    Etienne


    Etienne
    "Worldwide demand for computers will not exceed 5 machines." - Thomas Watson, founder of IBM, 1945


    • Edited by Etienne J Tuesday, April 1, 2014 11:25 AM
    Tuesday, April 1, 2014 11:24 AM

Answers

  • It is normal for SQL Server to acquire and hold onto large amounts of memory.  This improves performance by reducing disk I/O.  This applies to temp tables as well.

    SQL Server will release memory after it detects memory pressure at the OS level, but it is generally best to proactively set aside memory for the OS and other applications.  This can be done by setting the SQL Server 'max server memory' option (in MB).  You mention a laptop but didn't say how much RAM it has.  On a laptop with 4GB, I suggest a setting of about 1GB.  For example:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory', 1024;
    RECONFIGURE;
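
    To confirm that the new limit actually took effect, the running value can be inspected; a sketch, assuming permission to read sys.configurations:

```sql
-- 'value' is the configured setting; 'value_in_use' is the running setting
-- (they differ until RECONFIGURE has been executed).
SELECT name, value, value_in_use
FROM sys.configurations
WHERE name = N'max server memory (MB)';
```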


    Dan Guzman, SQL Server MVP, http://www.dbdelta.com

    • Marked as answer by Etienne J Tuesday, April 1, 2014 1:18 PM
    Tuesday, April 1, 2014 12:11 PM

All replies

  • Hi all,

    So my issue is the following:

    I have to check data consistency of a database, on a daily basis. To do this, I've got a cmd batch that calls my stored procedures. When calling one of them, amount of available RAM on my laptop (Windows 7 64 bits) goes from around 50% free to less than 5%...

    Here's the T-SQL code I suspect to be responsible of this lack of memory:

    I guess the issue is not with the stored proc but with the CHECKDB command you run. CHECKDB is highly resource-intensive and uses memory, I/O, and tempdb (in some cases heavily). So I guess what you are seeing is normal behavior when running CHECKDB.

    Why not just create a simple job in SQL Server Agent to run CHECKDB directly, instead of using temp tables?
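
    A minimal sketch of such an Agent job, using the msdb job procedures; 'DailyConsistencyCheck' and 'MyDatabase' are hypothetical names:

```sql
USE msdb;
GO
-- Create a job with a single T-SQL step that runs CHECKDB.
EXEC dbo.sp_add_job @job_name = N'DailyConsistencyCheck';
EXEC dbo.sp_add_jobstep
    @job_name  = N'DailyConsistencyCheck',
    @step_name = N'Run CHECKDB',
    @subsystem = N'TSQL',
    @command   = N'DBCC CHECKDB (N''MyDatabase'') WITH NO_INFOMSGS;';
-- Attach the job to the local server so the Agent will actually run it.
EXEC dbo.sp_add_jobserver @job_name = N'DailyConsistencyCheck';
```

    A schedule can then be attached with sp_add_jobschedule, or the job run on demand with sp_start_job.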


    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers


    Tuesday, April 1, 2014 12:23 PM
  • My request wasn't clear, I guess... I don't run the CHECKDB command at all; I just perform some functional (i.e. business) checks to ensure that the database contents are consistent before pushing the linked application into production :)

    Thanks anyway for your help ;)


    Etienne

    Tuesday, April 1, 2014 12:42 PM
  • It is normal for SQL Server to acquire and hold onto large amounts of memory.  This improves performance by reducing disk I/O.  This applies to temp tables as well.

    SQL Server will release memory after it detects memory pressure at the OS level, but it is generally best to proactively set aside memory for the OS and other applications.  This can be done by setting the SQL Server 'max server memory' option (in MB).  You mention a laptop but didn't say how much RAM it has.  On a laptop with 4GB, I suggest a setting of about 1GB.  For example:

    EXEC sp_configure 'show advanced options', 1;
    RECONFIGURE;
    EXEC sp_configure 'max server memory', 1024;
    RECONFIGURE;

    Dan, humbly, I don't agree with your suggestion of setting 1GB for 'max server memory' when 4GB of total RAM is present. I'd say in this case 3GB should go to SQL Server and 1GB to the OS, although this is a tentative value.

    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers


    Tuesday, April 1, 2014 12:46 PM
  • Thank you for your very quick answer.

    I'm sorry for the lack of precision: I have a laptop with 4GB, as you mentioned in your example.

    Indeed, this seems to fix the issue... RAM now goes from 42% available to 57% at the end of all processing.

    One question, however: the application that queries this database is business-critical and goes into production today... so I don't want to take any risks with this cross-cutting development. Do you have any suggestions to make it safe, given that I have (for the moment) no idea about the production environment's characteristics (OS, RAM, CPUs...)?

    Thanks for your help!


    Etienne

    Tuesday, April 1, 2014 12:49 PM
  • Thanks for this clarification.

    However, I've just been informed that such a query cannot be pushed into the production databases...

    Looks like I should find an alternative to temp tables :)

    Thanks to all contributors anyway!


    Etienne

    Tuesday, April 1, 2014 1:18 PM
  • Indeed, this seems to fix the issue... RAM now goes from 42% available to 57% at the end of all processing.

    Have you set max server memory to 1GB? Memory consumption went down because you lowered the value for 'max server memory'. Don't be happy just because memory usage went down: what you did is put a restriction on SQL Server to use just 1GB, which IMO is dangerous, and SQL Server will starve for memory.


    Please mark this reply as the answer or vote as helpful, as appropriate, to make it useful for other readers

    Tuesday, April 1, 2014 1:25 PM
  • Have you set max server memory to 1GB? What you did is put a restriction on SQL Server to use just 1GB, which IMO is dangerous, and SQL Server will starve for memory.

    Yes, indeed, I did that. I understood that this isn't the best way to fix my issue... and my customer doesn't want to change the SQL Server configuration for this little project, with the risk of creating other issues in the production environments...

    Capping SQL Server memory isn't the real solution to my issue. My colleagues found a way to avoid temp tables in this case, or at least to limit their use to a single temp table.
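
    One temp-table-free way to express the original check is to fold the SELECT ... INTO into a derived table; a minimal sketch, assuming the same FT_sas table and valeur column as in the question:

```sql
-- Bottom 10 percent of the distinct non-NULL values, without a temp table.
SELECT TOP (10) PERCENT v.Valeur
FROM (
    SELECT DISTINCT CAST(valeur AS decimal(18,4)) AS Valeur
    FROM FT_sas
    WHERE valeur IS NOT NULL
) AS v
ORDER BY v.Valeur ASC;
```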


    Etienne

    Tuesday, April 1, 2014 2:24 PM