Is there a built-in limit or setting that caps the number of rows an OLE DB Source reads from a table in a Data Flow Task?

  • Question

I have a table that I load as part of a control flow; it is then copied to a target table by a data flow task. I do this because, depending on a field in the source table, a different set of fields may be used from each source entry to create the target entry. That same field may also indicate that multiple entries need to be created in the target table from a single source entry, for which I use a Multicast transformation.

    The problem I'm having is that no matter how many entries there are in the source table, the OLE DB Source during execution only ever shows 7,532 entries being taken from the source table. If there are fewer than 7,532 entries in the source table, everything processes fine. With more than 7,532, the data flow task takes only 7,532 and then seems to hang. It also seems that only one path of the Multicast transformation is taken when the Conditional Split directs a source entry down that path.

    Seems like a strange problem, I know, but any insight anyone can provide is appreciated. Thanks.

    Friday, February 16, 2007 11:55 PM


  • This problem was a manifestation of another issue, resolved in a separate thread of mine. I had multiple paths writing to the same destination table, which was creating a deadlock situation. I resolved it by clearing the "Table Lock" option on the OLE DB Destination. I am also looking into using a Union All transformation as an alternative, but I am not yet sure whether that is a viable solution for this particular situation.
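
    For illustration, the blocking described above can be sketched in T-SQL (the table and values here are hypothetical, not from the original package): with the "Table Lock" option checked, each OLE DB Destination performs its fast load with a TABLOCK hint, so two data-flow paths inserting into the same table end up waiting on each other's table-level lock.

    ```sql
    -- Hypothetical target table written to by both data-flow paths
    CREATE TABLE dbo.TargetTable (Id int, Val varchar(50));

    -- Session 1 (data-flow path A): Table Lock checked → TABLOCK hint.
    -- The open transaction holds an exclusive table lock for the load.
    BEGIN TRAN;
    INSERT INTO dbo.TargetTable WITH (TABLOCK) VALUES (1, 'from path A');

    -- Session 2 (data-flow path B): blocks on the table-level lock held
    -- by session 1. Because the data flow keeps both destinations' loads
    -- open while buffers drain, the paths can end up waiting on each
    -- other indefinitely, which surfaces as the hang described above.
    INSERT INTO dbo.TargetTable WITH (TABLOCK) VALUES (2, 'from path B');
    ```

    Clearing "Table Lock" drops the TABLOCK hint so the inserts take row-level locks instead; routing both paths through a Union All into a single OLE DB Destination avoids having two concurrent loads on the table at all.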
    Thursday, February 22, 2007 9:58 PM