SSIS Job Failure

  • Question

  • Can network latency or insufficient memory cause SSIS packages to fail?  I run an agent job that consists of many individual SSIS packages.  Most of these packages read flat files from a network drive, format the data, and load the data into SQL Server tables.  Each step normally takes only about 30 seconds.  Occasionally, a step that normally takes 30 seconds takes over 5 minutes to process.  Sometimes a step fails with a package execution error (no other clue given).

    It seems that after the server is rebooted, the job runs successfully.  If a package step fails due to some sort of resource limitation, shouldn't the error be noted somewhere?

    Saturday, July 15, 2006 1:32 AM

Answers

  • Memory is one issue, but SSIS can run even when it needs more memory than is available, by swapping buffers from main memory to disk.  It looks like there is another problem: a failure to create the temporary files that hold these buffers.

    As the message says, the temporary files are created in the directory specified by the BufferTempStoragePath property of the data flow task.  By default it uses the user's temporary files folder (%TEMP%), but this can be overridden for each data flow task.  There is a problem with this path, and the data flow can't create temporary files there.  Check that the path is correct, that you have permission to write to it, and that there is enough free space during package execution.
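
    For illustration only, a package configuration entry along these lines can point the property at a different folder.  This is a sketch, not something from this thread: the task name "Data Flow Task" and the D:\SSISTemp folder are placeholders, and whatever path you use must exist and be writable by the account that actually executes the package (the SQL Agent service account or the proxy).

      <!-- Placeholder task name and folder; adjust to your own package -->
      <Configuration ConfiguredType="Property" Path="\Package\Data Flow Task.Properties[BufferTempStoragePath]" ValueType="String">
        <ConfiguredValue>D:\SSISTemp</ConfiguredValue>
      </Configuration>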
    Tuesday, July 18, 2006 7:03 AM

All replies

  •  Eric H. wrote:

    It seems that after the server is rebooted, the job runs successfully.  If a package step fails due to some sort of resource limitation, shouldn't the error be noted somewhere?

    Have you configured package logging? This should give more information about the error.

    Saturday, July 15, 2006 2:25 AM
  • Thanks, Michael.  I went back into BIDS and defined package logging for one of the SSIS steps that was failing.  Most of the log consists of these entries, repeated multiple times:

    The buffer manager detected that the system was low on virtual memory, but was unable to swap out any buffers. 1 buffers were considered and 0 were locked. Either not enough memory is available to the pipeline because not enough is installed, other processes are using it, or too many buffers are locked.

    The buffer manager cannot create a temporary storage file on any path in the BufferTempStoragePath property. There is an incorrect file name or no permission.

    So I guess it is a memory issue.  I don't know why all of a sudden I'm having issues with memory.  2 GB used to be enough and the job runs successfully after the server is rebooted.

    Tuesday, July 18, 2006 1:54 AM
  • Can you monitor memory usage by the various processes on the box to see which is being the most greedy?

    What other processes are you running on that box, and can you limit their memory consumption?  The obvious candidate is SQL Server.

    Are you running packages in parallel, or can you serialise them to reduce the concurrent memory requirements?

    Are you using any resources and not closing or disposing of them properly, thereby not allowing memory to be released?  Or perhaps it is spending a long time waiting to be released.

    Tuesday, July 18, 2006 7:03 AM
  • Memory is one issue, but SSIS can run even when it needs more memory than is available, by swapping buffers from main memory to disk.  It looks like there is another problem: a failure to create the temporary files that hold these buffers.

    As the message says, the temporary files are created in the directory specified by the BufferTempStoragePath property of the data flow task.  By default it uses the user's temporary files folder (%TEMP%), but this can be overridden for each data flow task.  There is a problem with this path, and the data flow can't create temporary files there.  Check that the path is correct, that you have permission to write to it, and that there is enough free space during package execution.
    Tuesday, July 18, 2006 7:03 AM
  • Things are beginning to make sense now.
    1)  Our development server has only 2 GB of free space on the C: drive.
    2)  The agent job runs under a generic SQL id.
    3)  The SSIS package step with the problem runs with a proxy using a generic Windows ID that has access to the necessary network drives.  It also has membership in the SQLServer2005SQLAgentUser group, but it does not have a user's temporary files folder on the C: drive.

    The only thing that was done to this particular server was that we changed the agent job to run with a generic SQL id.  Other than that, I believe the C: drive originally had about 4 GB of free space.

    I'll try to relocate the BufferTempStoragePath to the D drive.  We usually keep our SQL Server databases and logs on the D drive for dev.  For prod, we put the databases on D and the logs on E.

    We don't (yet) have this problem on our prod server.  Is it most likely because it has 4 GB of memory or because of the 8.7 GB free space on the C drive?  Both servers are running the Standard Edition of SQL Server 2005.  Thanks again everyone for your assistance and insight!

    Tuesday, July 18, 2006 6:31 PM
  • Michael, it looks like the package uses the SQL Agent job id as the user in your statement about the user's temporary files folder.  The package fails when the SQL Agent service is run with a generic Windows id.  When we change the SQL Agent service back to Local admin, the package runs successfully.

    I don't know why the Windows id would fail; I would think that it has access to its own %TEMP% directory on the server.  Any other ideas?  We would prefer not to run the SQL Agent service with the Local admin account.  I looked at the Troubleshooting Performance Problems in SQL Server 2005 document, but it didn't shed any light on the situation.

    Wednesday, July 19, 2006 9:53 PM
  • I talked to Microsoft Support, and they instructed me to put something in the BufferTempStoragePath property.  In addition, I needed to grant full folder permissions to (1) the agent job service account (we don't use local admin) and (2) the proxy id that the package step uses.

    I asked where it is documented that the job will use the user's temp directory by default, but they couldn't point me to any particular document.  Tech support also told me that the owner of the job determines the id that the job will use.  Through testing, it seems that for my environment the default BufferTempStoragePath is the temp folder of the SQL Agent job service account id.  We are not going to enter a buffer location in all of our data flow steps; we are simply going to give the proxy id full permission to the temp folder of the agent job service account id.

    For reference, the error I received with package logging enabled is:

    The buffer manager cannot create a temporary storage file on any path in the BufferTempStoragePath property. There is an incorrect file name or no permission.

    Thursday, August 3, 2006 7:45 PM
  • I am having a similar problem.  I have a single SSIS package containing 10 data flows.  I have the data flows within a Foreach Loop container, which loops through 11 different sorted tables and executes the data flow.  Within the data flow I am using variables within the OLE DB Source to concatenate the SQL with the table name that is picked up by the Foreach Loop.

    I have specified a path in the BufferTempStoragePath property.  It is the Temp folder path of the user id that I use to log on to SQL Server.  This error occurs when the SSIS package completes one data flow and is trying to start another.  I have deployed other packages without a Foreach Loop container and they run fine; for those I do not specify any path in the BufferTempStoragePath property.
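
    As a sketch only (the container name, task name, and D:\SSISTemp folder below are placeholders, not the real names from this package), a configuration entry that targets a data flow inside a Foreach Loop container includes the container in the package path, and the folder it points to must be writable by the account that actually executes the package:

      <!-- Placeholder names and folder; the package path includes the enclosing container -->
      <Configuration ConfiguredType="Property" Path="\Package\Foreach Loop Container\Data Flow Task.Properties[BufferTempStoragePath]" ValueType="String">
        <ConfiguredValue>D:\SSISTemp</ConfiguredValue>
      </Configuration>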

    Please help me.  This has become really frustrating.

    Wednesday, October 1, 2008 3:57 PM
  • Hi,

    Despite specifying a separate path for buffer storage, I'm still encountering the following error.

    Setting BLOBTempStoragePath/BufferTempStoragePath in SSIS Config file:

      <Configuration ConfiguredType="Property" Path="\Package.Variables[User::strBLOBTempStoragePath].Properties[Value]" ValueType="String">
        <ConfiguredValue>X:\SSIS_LOG_N_BUFFER_AREA</ConfiguredValue>
      </Configuration>

      <Configuration ConfiguredType="Property" Path="\Package.Variables[User::strBufferTempStoragePath].Properties[Value]" ValueType="String">
        <ConfiguredValue>X:\SSIS_LOG_N_BUFFER_AREA</ConfiguredValue>
      </Configuration>

    Error Msg encountered:

    OnError,,......The buffer manager cannot create a temporary storage file on any path in the BufferTempStoragePath property. There is an incorrect file name or no permission.

    I'd appreciate it if anyone can shed some light on this one.

    Thanks much.

    Thursday, October 23, 2008 9:10 PM
  • All of a sudden I started getting this error as well:

    The buffer manager cannot create a temporary storage file on any path in the BufferTempStoragePath property. There is an incorrect file name or no permission.

    or 

    There is not enough space on the disk.


    or

    The buffer manager cannot extend the file "C:\DOCUME~1\SQLAdmin\LOCALS~1\Temp\DTS6182.tmp" to length 1506324.  There was insufficient disk space.


    My problem was a record with multiple duplicates causing my lookup to cache millions of rows.  I don't know how it happened, but I just deleted the duplicates and now I'm back up and running.  So take a look at your lookups and see how many rows they are caching.

    Good luck.

    Mike
    Wednesday, January 21, 2009 1:47 PM
  • http://accessaidprogramming.blogspot.com/2009/12/cant-create-necessary-temporary-file.html
    Friday, December 4, 2009 7:55 AM