OLEDB Destination Error in SSIS Package not returning error column/desc

    Question

  • I have an SSIS package that reads data from a dump table, runs a custom script that converts date data to the correct format (or NULL) and formats amount fields as currency, then inserts the rows into a new table.  The new table redirects insert errors.  This process worked fine until about three weeks ago.  I am processing just under 6 million rows, with 460,000 or so insert errors that did give an error column and code.

    Now I am getting 1.5 million errors, and nothing has changed, to my knowledge.  I receive the following information.

    Error Code: -1071607685    Error Column: 0    Error Desc: No status is available.

    The only thing I can find for the above error code is

    DTS_E_OLEDBDESTINATIONADAPTERSTATIC_UNAVAILABLE

    To add to the confusion, I cannot see any errors in the data written to the error table.  It appears that after a certain point in the processing, everything, or at least most records, errors out.

    Any help is appreciated.

    Thanks

    Derrick

    Tuesday, August 01, 2006 7:42 PM

Answers

  • A work around that can reduce the number of suspended rows is to use 2 OLE DB Destination adapters.

     

    The first OLE DB Destination adapter is configured to use 'Table or view - fast load' as the Data access mode, with a Rows Per Batch setting of 10,000 and a Maximum Insert Commit Size of 10,000.  This handles the large-volume work and suspends records in 10,000-row batches if there are any PK/FK violations.  Configure the Error Output to be routed to the second OLE DB Destination adapter.  In the second adapter, use 'Table or view' as the access mode.  This inserts the previously suspended records individually, suspending only the records that are truly PK/FK violations.

     

    Good Luck.

    Wednesday, March 28, 2007 8:16 PM

All replies

  • Bump...  I'm having a similar issue where I cannot determine the insert problem.
    Monday, September 25, 2006 8:05 PM
  • You can create an error description output column, add a Script Component, and do something like the following:

    Public Class ScriptMain
        Inherits UserComponent

        Public Overrides Sub Input0_ProcessInputRow(ByVal Row As Input0Buffer)
            ' Use the incoming error code as a parameter to GetErrorDescription
            Row.ErrorDescription = ComponentMetaData.GetErrorDescription(Row.ErrorCode)
        End Sub
    End Class

     

    Hammer

    Wednesday, September 27, 2006 9:36 PM
  • Resurrecting this.

    The error message is "No status available."

    Using Fast Load option.

    If I take the data in error and try to insert it into the table, it works fine.  It's not data related.
    Tuesday, December 05, 2006 5:59 PM
  • I have the same issue.  The error code is not descriptive, and the column number I get is -1.  I know the column with the error is 81, but it is not being reported correctly.  What can we do to report the correct column?  And the correct error, too?
    Friday, January 19, 2007 6:57 PM
  • Did anyone get this problem resolved?   I am also seeing records on one of my SSIS packages failing to load with this error.
    Monday, February 12, 2007 8:31 PM
  •  MikeZ wrote:
    Did anyone get this problem resolved? I am also seeing records on one of my SSIS packages failing to load with this error.


    Because "Fast Load" is a bulk load, I'm not sure that we can get meaningful messages out of the destination component.  If you drop the fast load option, does the load still fail?
    Monday, February 12, 2007 8:34 PM
  • Hi Phil,

    Actually I'm not using Fast Load - just a standard OLE DB destination with a normal table load.

    I have error rows from the transform going to a table, and in that table I record the ErrorCode and ErrorColumn values. Error Code is always -1071607685, and Error Column is -1.   I presume -1 means it's a problem with the row as a whole, i.e. some kind of constraint or index.

    This is only occurring for one of my 85+ packages... very odd!

    Interestingly I can type the data in and it works, just not through the SSIS package.

    Mike

     

    Monday, February 12, 2007 9:01 PM
  • Okay - I fixed my problem. 

    Turns out that there was a data type difference on one column between the source and destination.  However, the OLE DB Destination component didn't tell me about the problem.  When I replaced it with the SQL Server Destination component instead, I saw a proper error message telling me that the column data types were inconsistent.

    So to resolve the problem I just did a CAST() in the source SQL statement, and then everything was good again (and I went back to the OLE DB component).
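
    For example, something along these lines in the source query (the table and column names here are just illustrative, not from my actual package):

    -- Illustrative only: say the dump table stored Amt as FLOAT while the
    -- destination column is DECIMAL(19, 4); cast it in the source query
    -- so the types match what the destination expects.
    SELECT
        OrderID,
        CAST(Amt AS DECIMAL(19, 4)) AS Amt,
        CAST(OrderDate AS DATETIME) AS OrderDate
    FROM dbo.DumpTable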

    Hope this helps someone else in the future.

    Cheers!
    Mike

    Monday, February 12, 2007 9:34 PM
  • Phil,

    I'm not using Fast Load either - just a standard OLE DB destination with a normal table load.  Fast Load is not allowed when you are redirecting failing rows.

    I hope this helps!

    Tuesday, February 20, 2007 8:51 PM
  •  Paulino PP wrote:

    Phil,

    I'm not using Fast Load either - just a standard OLE DB destination with a normal table load.  Fast Load is not allowed when you are redirecting failing rows.

    I hope this helps!



    Yes, I get all of that.  However, you can redirect rows with fast load.  You won't be able to capture individual rows, though, unless MaxInsertCommitSize is 1.  With fast load, the whole batch gets redirected if one row fails.
    Tuesday, February 20, 2007 9:00 PM
  • I believe that this occurs any time there is a database error.  In my original post, it was caused by a duplicate key error on an insert; I have also seen it for data type conversion issues as well as invalid dates/timestamps.  In some cases, a specific column is causing the problem, so you would hope it would be able to identify that column, but it does not.

     

    Tuesday, February 20, 2007 9:18 PM
  • Hi,

       Where do we need to put the CAST()?  Can you send me a sample SQL query showing how to use CAST()?

    Thanks,

    Ram,

    Wednesday, February 28, 2007 11:36 PM
  • A work around that can reduce the number of suspended rows is to use 2 OLE DB Destination adapters.

     

    The first OLE DB Destination adapter is configured to use 'Table or view - fast load' as the Data access mode, with a Rows Per Batch setting of 10,000 and a Maximum Insert Commit Size of 10,000.  This handles the large-volume work and suspends records in 10,000-row batches if there are any PK/FK violations.  Configure the Error Output to be routed to the second OLE DB Destination adapter.  In the second adapter, use 'Table or view' as the access mode.  This inserts the previously suspended records individually, suspending only the records that are truly PK/FK violations.

     

    Good Luck.

    Wednesday, March 28, 2007 8:16 PM
  • Jim,

    This suggestion was great. I had sacrificed speed to be able to get these bad rows, but now, I am back to fast loads and still able to catch those bad records on the second insert.

    Thanks!

    Monday, April 09, 2007 2:02 PM
  • This was very helpful and solved the original problem.  However, we started receiving this error again after the suggested fix and after a refresh of our database.  The root cause of the new occurrence of the error (Error ID: -1071607685; Description: DTS_E_OLEDBDESTINATIONADAPTERSTATIC) was a default value on a field that was NOT being mapped in the SSIS package.  The default value was violating an FK constraint, so it appears that SQL Server was getting the error but not passing the typical FK constraint error message through SSIS.

     

    Sunday, December 02, 2007 7:01 PM
  • I was using fast load.  Removed it and used 'Table or view' - the error didn't appear.  'Table or view' is much slower, though.
    Tuesday, November 23, 2010 5:28 PM
  • Hi,

    If the first OLE DB destination uses bulk insert and the second uses normal access, it slows down performance.  Let me know what can be done to capture duplicate records and to speed up the bulk load (more than 150 million records).

    Monday, December 06, 2010 10:32 AM
  • You can put a Lookup between the two OLEDB destinations, configuring it to take its data from the first OLEDB destination's error output and send unmatched rows to a No Match output. When the first OLEDB destination, using Fast Load, hits a duplicate record it causes the whole batch of 10,000 records to be sent to the Lookup, which checks whether the keys for these rows already exist in the destination table (lookup on the key column against the destination table, usually using partial caching and not caching misses). Ignore the Lookup Match Output and connect the Lookup No Match Output to the second OLEDB destination. Then the second OLEDB destination does not retry the whole 10,000 rows, just those that aren't duplicates of anything in the database.

    In principle, if you do this you could use Fast Load on the second OLEDB destination as well as the first. But I tend to disable Fast Load on the second destination and have it either send its errored rows to an error table or fail on error. Then you will learn about any error that is not due to a duplicate record.

    Monday, May 30, 2011 6:12 PM
  • I know it's a couple of years later, but maybe I'll help someone getting here from a web search (which I did).

    For me, the reason seems to be an IDENTITY column.  With an IDENTITY column the fast load fails; without it, it doesn't.

    Tuesday, March 26, 2013 2:24 PM