Filter out the records that are violating constraints

  • Question

  • Hi,

    In a Data Flow Task (DFT), how can I filter out records that violate constraints?

     

    Thanks,

    Pramod


    ~Technology Analyst~
    Monday, August 23, 2010 11:15 AM

Answers

  • On the Destination connection, you need to switch the Data Access Mode from "Table or view - fast load" to "Table or view". Then, as Victor points out, drag the red arrow that emanates from that Destination connection and drop it onto another task, like a Row Count or even a different Destination (I use Flat Files for this). Then go and set the Error Configuration to "Redirect Rows" for all the columns in question.

    However, if you have Constraints on the table, this will slow down the load significantly. Test it out and see if the performance hit is acceptable to your situation.

    Another approach might be to scrub the data even BEFORE it gets to the Destination, like using Lookups, etc.
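To make the "Redirect Rows" idea concrete outside of SSIS, here is a minimal, hypothetical Python/SQLite sketch of the same pattern: rows are inserted one at a time (analogous to the non-fast-load "Table or view" mode), and any row that violates a constraint is diverted to an error bucket instead of failing the whole load. The table and data are invented for illustration.

```python
# Hypothetical analogy to the SSIS "Redirect Rows" error output (not SSIS itself).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE dest (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")

# Incoming rows: one NULL name (NOT NULL violation), one duplicate key (PK violation).
incoming = [(1, "alice"), (2, None), (1, "duplicate"), (3, "carol")]

loaded, redirected = [], []
for row in incoming:
    try:
        # Row-by-row insert, like "Table or view" (non fast-load) mode.
        conn.execute("INSERT INTO dest VALUES (?, ?)", row)
    except sqlite3.IntegrityError as err:
        # Constraint violation: redirect the row instead of failing the package.
        redirected.append((row, str(err)))
    else:
        loaded.append(row)
conn.commit()

print(loaded)      # rows that reached the destination
print(redirected)  # rows diverted, with the error reason
```

As in SSIS, the trade-off is the same one Todd describes: per-row inserts are much slower than a bulk fast load.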


    Todd C - MSCTS SQL Server 2005 - Please mark posts as answered where appropriate.
    Monday, August 23, 2010 12:33 PM
  • When you have constraints on the destination table and any row violates a constraint, the Data Flow returns an error and the package fails.
    You should configure the Error Output to handle those rows (for example, redirect or ignore them).
    I wrote a simple article about exactly this; read it here:
    http://rad.pasfu.com/index.php?/archives/23-How-to-handle-Failed-Rows-in-a-Data-Flow.html


    http://www.rad.pasfu.com
    Monday, August 23, 2010 2:18 PM

All replies

  • DFT Department for Transport? :D

    In a Data Flow Task, you can filter out the records that generate constraint-check errors by redirecting the rows that can't be loaded. On the OLE DB Destination component, on the Error Output pane, you can set each field's behavior on errors.

    If you're looking to prevent PK errors, you can also check for the value to insert through a Lookup component, or use the destination as a source and match it against the current data flow with a Merge Join.
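The Lookup/Merge Join idea above can be sketched in plain Python (a hypothetical illustration, not SSIS): read the keys already present in the destination, then split the incoming flow into rows that are safe to insert and rows that would raise a PK violation. The key set and rows here are invented for the example.

```python
# Hypothetical sketch of the Lookup-before-insert pattern (not SSIS itself).
# Stand-in for a Lookup against the destination table's primary keys.
existing_keys = {1, 2, 3}

incoming = [(2, "update me"), (4, "new"), (5, "new"), (3, "update me")]

# Rows whose key is not yet in the destination can be inserted safely;
# rows whose key already exists would violate the PK and need other handling.
to_insert = [row for row in incoming if row[0] not in existing_keys]
to_review = [row for row in incoming if row[0] in existing_keys]
```

This moves the constraint check upstream of the destination, so the load itself can stay in fast-load mode.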

     


    Víctor M. Sánchez García (ES) (BI) Hope this helps. If this answers your question, please mark it as the answer.
    Monday, August 23, 2010 11:31 AM
  • Good article, Reza, and thanks for sharing. This is helpful for me as well.

    - Prabhat
    Monday, August 23, 2010 2:26 PM