SSIS - Flat file getting locked

  • Question

  • Hi,

    I have a data flow task (DFT), and within the DFT I have a flat file source connection. If there is an error in the source file/format, the task should fail, the OnError event should execute, and a File System Task set up within the event handler should delete the file from the folder. The flat file connection manager is set up dynamically through variables.

    Whenever there is an issue within the flat file (format, data, etc.), the OnError event executes, but the File System Task that deletes the file fails because the file is being used by another process. I see that the OnError event doesn't execute AFTER the DFT fails; it executes WHILE the DFT is still processing that flat file source (orange circle shown while OnError is executing). So obviously the Flat File Source is still holding a lock on the .csv while OnError is trying to delete it. How do I get around this without any .NET scripting?
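(Editor's note: although the question asks for a no-scripting solution, the generic workaround for "file is being used by another process" is a retry loop around the delete, since the source's lock is normally released shortly after the task fails. A minimal sketch of that pattern in Python, purely to illustrate the idea; the function name and timings are invented:)

```python
import os
import time

def delete_with_retry(path, attempts=5, delay_seconds=1.0):
    """Try to delete a file, retrying while another process still holds a lock."""
    for _ in range(attempts):
        try:
            os.remove(path)
            return True
        except PermissionError:
            # File still locked (e.g. by the flat file source); wait and retry.
            time.sleep(delay_seconds)
        except FileNotFoundError:
            # Already gone; treat as success.
            return True
    return False
```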

    Monday, August 4, 2014 9:29 PM

All replies

  • Give it a try by changing the executable's (DFT's) event handler from 'OnError' to 'OnTaskFailed'. (Please don't forget to delete the 'OnError' event handler.)
    Tuesday, August 5, 2014 12:22 PM
  • Thanks. A follow-up question (with some relevant info regarding this solution): why can't I have both? Let's say I want the files to be deleted if the pkg fails ANYWHERE; in that case OnError would be the way to go. On top of that, if I have the same File System Task JUST for that DFT, in the OnTaskFailed event, would it cause any locking issue? (I would imagine both OnError and OnTaskFailed would execute when the DFT fails, but only OnError would execute when other steps in the pkg fail.)
    Tuesday, August 5, 2014 3:02 PM
  • If the DFT is inside a Foreach Loop container, then use the red (failure) precedence constraint between the DFT and a File System Task (FST). The FST will be responsible for deleting the file if the DFT fails...

    Thanks, hsbal

    Tuesday, August 5, 2014 6:12 PM
  • Is there a chance of parallel running of multiple instances of this package?

    Please mark this as answer if it helps to solve the issue.

    Visakh
    http://visakhm.blogspot.com/
    https://www.facebook.com/VmBlogs

    Tuesday, August 5, 2014 6:45 PM
  • Hi,

    The DFT is not in any For(each) loop. The pkg is run only from Data Tools, only by me, and in a single instance.

    Basically, instead of the OnError event executing AFTER the Control Flow/Data Flow Task fails, it is running DURING the Data Flow Task's execution (right after the Flat File Source component fails within it).

    OnError should be scoped across the whole pkg, and it should contain an FST that deletes the file. I am afraid that having multiple FSTs that delete the same file (e.g., in both the OnError and OnTaskFailed events) might lead to file locking as well.
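(Editor's note: when two cleanup steps can race to delete the same file, the generic fix is an idempotent delete that treats "already deleted" as success rather than an error. Sketched in Python for illustration only; the function name is invented, and a File System Task would need equivalent handling:)

```python
import os

def delete_idempotent(path):
    """Delete a file, treating 'already deleted' as success instead of an error."""
    try:
        os.remove(path)
    except FileNotFoundError:
        pass  # another cleanup step got there first; that's fine
    return not os.path.exists(path)
```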

    Tuesday, August 5, 2014 7:11 PM
  • Hi,


    I think using OnError to delete the file may not be the correct approach, because it may result in deleting a good file. I think so because the OnError event handler fires at the executable level (i.e., the control flow task). The DFT has a flat file source + transformations (if required) + a destination. If the file is good but any transformation or the destination fails, the file will still be deleted, which I think you don't want.

    If this is correct, then use the error-row redirection mechanism in case the file has bad data. This would help you track the bad rows separately (in another flat file/table) and process the good records from the file.
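(Editor's note: the error-row redirection idea above, expressed outside SSIS as a plain sketch. A column-count check stands in for whatever the flat file source would actually reject; the function name and data are invented for illustration:)

```python
def split_good_and_bad(rows, expected_columns):
    """Send rows with the wrong column count to a 'bad' bucket; keep the rest."""
    good, bad = [], []
    for row in rows:
        (good if len(row) == expected_columns else bad).append(row)
    return good, bad

# Example: a 2-column feed containing one malformed row.
rows = [["1", "Alice"], ["2", "Bob", "extra"], ["3", "Carol"]]
good_rows, bad_rows = split_good_and_bad(rows, expected_columns=2)
```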


    Thanks, hsbal

    Wednesday, August 6, 2014 3:00 PM
  • @Harry Bal: This pkg I am working on processes only a single file with a specific name (e.g. Names.csv). In fact, I am OK with the file being deleted when the ETL fails at any step. I had originally used a failure constraint on the DFT, but then we needed to delete the file if the pkg fails...anywhere. Also, if I am loading this file and it has 1000 rows, 999 of which are good and one bad, I still want to delete the file when the 1000th row fails. Hope this info helps.

    I hope there is a way to do what I am trying to do..hehe

    Wednesday, August 6, 2014 5:48 PM