locked
Stopping the execution of an SSIS task?

  • Question

  • Hi All,
    I have an SSIS package that takes a collection of flat files as input data. I use the Foreach Loop container to read all the files and then pass each file name to the task.

    The task should use only files with the prefix "CRM", so if the prefix is anything other than "CRM" I want to ignore the file and move on to the next one.
    My question is: where should I ignore the file, in the Foreach Loop container or in the task?

    Thx
    Thursday, December 10, 2009 12:42 PM

Answers

  • You should also be able to use a filter or file mask (similar, if not identical, to DOS-based filters) on the Foreach Loop container to grab only the files matching these criteria.

    For example
     *CRM*.* would find any file containing CRM
     CRM*.* would find any file with CRM as the prefix
     CRM??.* would find CRM followed by 2 characters and then an extension

    etc...
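    The mask semantics above can be sketched with Python's fnmatch module, which implements the same DOS-style wildcards. The file names below are hypothetical, purely for illustration (note the SSIS Foreach File enumerator on Windows matches case-insensitively, while this sketch is case-sensitive):

    ```python
    import fnmatch

    # Hypothetical file names, just to show how each mask behaves.
    files = ["CRM01.txt", "CRM_export.csv", "OLD_CRM_DUMP.dat", "sales.txt"]

    def matches(mask):
        """Return the files a DOS-style mask would pick up."""
        return [f for f in files if fnmatch.fnmatchcase(f, mask)]

    print(matches("*CRM*.*"))  # any file containing CRM
    print(matches("CRM*.*"))   # CRM as the prefix
    print(matches("CRM??.*"))  # CRM followed by exactly two characters
    ```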

    Please mark answered posts. Thanks for your time.
    • Proposed as answer by COZYROC Friday, December 11, 2009 4:01 AM
    • Marked as answer by Zongqing Li Thursday, December 17, 2009 8:41 AM
    Thursday, December 10, 2009 3:02 PM

All replies

  • What is the format of the file name?
    Inside the Foreach Loop, add a dummy Script Task and connect it to the Data Flow Task (Script ---> Data Flow).
    Then configure the precedence constraint by double-clicking the connector and selecting:
    Evaluation Operation as Expression and Constraint
    Value                       as Success
    Expression               as FINDSTRING(@[User::FileName],"CRM",1) > 0
    FileName is the variable used to capture the name of the file (select the "Name only" option inside the Foreach Loop editor).
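    FINDSTRING returns the 1-based position of an occurrence of the target string, or 0 when it is not found, so the expression is true only for file names containing "CRM". A rough Python analogue of that logic (the function name here is just for illustration, not an SSIS API):

    ```python
    def findstring(source: str, target: str, occurrence: int) -> int:
        """Rough analogue of the SSIS FINDSTRING() expression function:
        1-based position of the nth occurrence of target, or 0 if absent."""
        pos = -1
        for _ in range(occurrence):
            pos = source.find(target, pos + 1)
            if pos == -1:
                return 0
        return pos + 1

    # The constraint FINDSTRING(@[User::FileName], "CRM", 1) > 0
    # lets only files whose name contains "CRM" reach the Data Flow:
    print(findstring("CRM_orders.txt", "CRM", 1) > 0)  # True
    print(findstring("sales.txt", "CRM", 1) > 0)       # False
    ```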

    Nitesh Rai- Please mark the post as answered if it answers your question
    Thursday, December 10, 2009 12:54 PM
  • The filter is a great solution, but how do I configure it on the Foreach Loop container?

    Also, what if the Data Flow Task encounters an invalid column value? For example, the datetime in the source flat file is a string with a length of 8; if the length is less than 8, I would like to stop the execution of the Data Flow Task.
    Thursday, December 10, 2009 5:11 PM
    You can add the filter for the file name inside the Foreach Loop editor.
    Do you want to fail the package when a column contains an invalid value? The transformations inside the Data Flow Task will fail if there is any invalid data. For example, if you are using a Derived Column for the datetime column and the incoming column has a length of less than 8, the component will fail because of its default error disposition, "Fail Component".
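    That fail-on-invalid-data behavior can be mimicked in a small Python sketch; derive_datetime is a hypothetical stand-in for the Derived Column cast, raising (i.e. "failing the component") when the string is not 8 characters long:

    ```python
    from datetime import datetime

    def derive_datetime(value: str) -> datetime:
        """Stand-in for a Derived Column cast of a YYYYMMDD string.
        With the default 'Fail Component' disposition, a bad value
        stops the whole Data Flow; here that is modeled by raising."""
        if len(value) != 8:
            raise ValueError(f"invalid datetime string: {value!r}")
        return datetime.strptime(value, "%Y%m%d")

    print(derive_datetime("20091210"))   # parses fine
    try:
        derive_datetime("2009121")       # length 7: the "component" fails
    except ValueError as err:
        print("Data Flow Task failed:", err)
    ```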
    Nitesh Rai- Please mark the post as answered if it answers your question
    Thursday, December 10, 2009 5:25 PM
  • "The filter is a great solution, but how do I configure it on the Foreach Loop container?

    Also, what if the Data Flow Task encounters an invalid column value? For example, the datetime in the source flat file is a string with a length of 8; if the length is less than 8, I would like to stop the execution of the Data Flow Task."

    You use the Foreach Loop with the Foreach File enumerator. Then, where you list the "Files" under the Collection tab, you set your filter (as mentioned above).

    As to the other question: that is more an issue of data quality. You could certainly trap the data that doesn't match your criteria and then potentially delete any records from a file that contained an error.

    Use a Conditional Split on the data quality checks. Send the records that fail the quality check to a Row Count transformation. Outside the data flow, check whether the row count variable is greater than zero; if it is, delete the records with an Execute SQL Task (using a precedence constraint). In either case, reset the variable back to zero afterwards so that you don't accidentally delete records loaded from other files that are actually good.
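    That Conditional Split / Row Count pattern can be sketched in Python; the record layout and field names below are hypothetical, and the list operations stand in for the SSIS components named in the comments:

    ```python
    # Each record carries an 8-character YYYYMMDD datetime string.
    records = [
        {"id": 1, "dt": "20091210"},   # valid: length 8
        {"id": 2, "dt": "2009121"},    # invalid: too short
        {"id": 3, "dt": "20091211"},   # valid
    ]

    # Conditional Split: route rows failing the quality check separately.
    good = [r for r in records if len(r["dt"]) == 8]
    bad = [r for r in records if len(r["dt"]) != 8]

    # Row Count: store the number of failing rows in a package "variable".
    error_row_count = len(bad)

    # Outside the data flow: if any bad rows were found, undo the load
    # (stand-in for the Execute SQL Task behind a precedence constraint).
    if error_row_count > 0:
        good = []  # e.g. DELETE the records loaded from this file

    # Reset the variable so the next file in the loop starts clean.
    error_row_count = 0
    ```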

    Of course, I would really think that method through, as you may still want the "good" records that pass the data quality checks. In fact, I usually count the records that don't match and then pass them off to an error file or table. The good records are still inserted, but the bad records are saved and the operator is notified so that the data can be cleaned up and rerun as a non-standard input. However, even this has a drawback: you could be reporting on incomplete aggregate data (i.e. the missing records go unreported while they await manual intervention). In either case, this should really be a business decision.

    Please mark answered posts. Thanks for your time.
    Thursday, December 10, 2009 5:26 PM