Error converting data type DBTYPE_DBTIMESTAMP to datetime

    Question

  • Hi

    I am trying to transfer data from an Access 2000 database to SQL Server 2005 via SSIS. My Access database table has a field with data type Date/Time and the SQL Server table has a field with data type datetime. When I run the SSIS package, it throws the following error and stops execution.

    [SQL Server Destination [12466]] Error: An OLE DB error has occurred. Error code: 0x80040E07. An OLE DB record is available.  Source: "Microsoft OLE DB Provider for SQL Server"  Hresult: 0x80040E07  Description: "Error converting data type DBTYPE_DBTIMESTAMP to datetime.".

    Please help.

    Thanks in advance,
    Ramzee

    Thursday, May 04, 2006 9:00 AM

Answers

  • According to BOL (http://msdn2.microsoft.com/en-us/library/ms175970.aspx) the types can be converted, so that should work. Have you identified the values that fail? Using the error output is a very easy way to capture this data. My best guess for now is the values are out of range for the SQL datetime type. A derived column transformation could be used to massage these values depending on requirements.
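    A minimal sketch of the out-of-range check being suggested here, in Python rather than an actual SSIS component (the names and sample rows are illustrative):

    ```python
    from datetime import datetime

    # SQL Server's datetime type only covers 1753-01-01 through 9999-12-31.
    SQL_DATETIME_MIN = datetime(1753, 1, 1)
    SQL_DATETIME_MAX = datetime(9999, 12, 31, 23, 59, 59)

    def fits_sql_datetime(value):
        """Return True if the value is inside the SQL Server datetime range."""
        return SQL_DATETIME_MIN <= value <= SQL_DATETIME_MAX

    # Rows whose dates fall outside the range are the ones the error
    # output (or a derived column) would need to handle.
    rows = [datetime(2006, 5, 4), datetime(1066, 10, 14), datetime(1999, 12, 31)]
    bad_rows = [r for r in rows if not fits_sql_datetime(r)]
    ```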
    Thursday, May 04, 2006 2:38 PM
    Moderator

All replies

  • Thanks Darren!

    I used the script component to identify the wrong date entries. Now it works!

    Thanks once again.

    Friday, May 05, 2006 6:51 AM
  • I am having the same error. A simple select from Query Analyzer:

    Select * from HCPROD..HC.UNIT U

    Even if I specifically reference a number column I get the same error:

    Select BLOCK_ID from HCPROD..HC.UNIT U

    The way I see it is that there is a column, probably the DATE_MODIFIED column, somewhere in the database that has a date & time value that the driver cannot process.  Why the error occurs even if I do not reference the column is a mystery.

    I am using the Microsoft OLE DB Provider for Oracle.

    Note that I have ensured that every date field in the table is in the 20th or 21st century.

    Tuesday, June 27, 2006 6:14 PM
  • Some more information.  We are using Oracle version 10g.   I wonder if there is an updated driver from Microsoft?  We searched and could not find one ourselves.
    Tuesday, June 27, 2006 8:08 PM
  • SELECT TOP 10 * FROM [ORACLE_LOOKUP]..[SchemaName].[TableName] AliasName

    Error converting data type DBTYPE_DBTIMESTAMP to datetime

    SELECT * FROM
    OPENQUERY (
    ORACLE_LOOKUP,
    'SELECT TO_CHAR(DateColumnName)  FROM TableName')

    The following workaround was found on the Internet:

    A workaround would be to use the OPENQUERY function in SQL Server. This
    function allows you to execute a pass-through query on the linked server.
    By issuing a pass-through query, you can then take advantage of Oracle's
    built-in functions to convert the date to a character data type or NULL.
    Examples of both are as follows:

    In the following example, the column "hiredate" is converted to a string in
    the format of MM-DD-YYYY.

    SELECT *
    FROM OPENQUERY(<linked server>, 'SELECT empno,
    TO_CHAR(hiredate,''MM-DD-YYYY'') FROM scott.emp')

    In this example we use a combination of two Oracle built-in functions,
    DECODE and GREATEST to convert any hiredate that is earlier than 01/01/1753
    (SQL Server's lower bound) to NULL.

    SELECT *
    FROM OPENQUERY(<linked server>,'SELECT DECODE (hiredate,
    GREATEST(hiredate, TO_DATE(''01/01/1753'',''MM/DD/YYYY'')), hiredate, NULL)
    FROM scott.emp')

    In case you are not familiar with Oracle built-ins, here is a description of
    the two used in the query above.

    DECODE is similar to a case statement in SQL Server. Its syntax is as
    follows :

    DECODE(<expression>, <search1>, <result1>, <search2>, <result2>, ...,
    <default>)

    The <expression> is compared to each <search> value. If there is a match,
    the corresponding <result> is returned. Otherwise, the <default> value is
    returned.

    GREATEST returns the greatest value in the list of expressions.

    GREATEST(<expression1>, <expression2>, ...)
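    The DECODE/GREATEST combination above is just a clamp: any hiredate below SQL Server's 1753-01-01 lower bound becomes NULL. A rough Python sketch of the same logic (the function name is hypothetical):

    ```python
    from datetime import datetime

    SQL_LOWER_BOUND = datetime(1753, 1, 1)  # SQL Server datetime lower bound

    def clamp_hiredate(hiredate):
        """Mimic DECODE(hiredate, GREATEST(hiredate, DATE '1753-01-01'), hiredate, NULL):
        if hiredate is already >= the bound, GREATEST returns hiredate itself, so
        DECODE sees a match and passes the date through; otherwise it returns NULL."""
        greatest = max(hiredate, SQL_LOWER_BOUND)
        return hiredate if hiredate == greatest else None
    ```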
    Thursday, May 24, 2007 11:03 AM
  • May I ask how you did it? (in case you remember, after such a long time...)

    What did you do in the script component? Some kind of regular expression to find the "bad" date values?
    Tuesday, March 04, 2008 12:14 AM
  • You could just use a simple IF to check for out-of-range values - 1753/01/01 through 9999/12/31 is the range supported by the datetime type in SQL Server.

    Or use a derived column expression to replace it with NULL:

    [YourColumn] < (DT_DBTIMESTAMP)"1753/01/01" ?  NULL(DT_DBTIMESTAMP) : [YourColumn] > (DT_DBTIMESTAMP)"9999/12/31" ?  NULL(DT_DBTIMESTAMP) : [YourColumn]
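    The expression above is a pair of nested conditionals; the same pass-through logic, sketched in Python (names are illustrative):

    ```python
    from datetime import datetime

    SQL_MIN = datetime(1753, 1, 1)
    SQL_MAX = datetime(9999, 12, 31)

    def derive(value):
        """Mirror the nested ?: expression: below the lower bound -> NULL,
        above the upper bound -> NULL, otherwise the value passes through."""
        if value < SQL_MIN:
            return None
        if value > SQL_MAX:
            return None
        return value
    ```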

    Tuesday, March 04, 2008 12:30 AM
    Moderator
  • Reformatted. Lesson: always copy before submitting. I might have checked the "has code" box - my question has code, so I did. Did you mean "my example has HTML code" instead?

    Hate to resurrect an old thread, but this is the top google result and I'm looking for a resolution, not how to use this website.  But can anyone say where the conversion happens?  There is no error, at all.  None of my values are errors. 

    I removed fields one at a time until I isolated the field.  I changed one field to use TO_CHAR instead of implicit datetime conversion:

        SELECT *
        INTO #temptable
        FROM OPENQUERY(LINKED_SVR, ' SELECT
            TO_CHAR(tablename.fieldname, ''MM/DD/YYYY HH:MI:SS'') as fieldname,
            ...
            WHERE ...
        ')

    then select the field (this is the TO_CHAR result, varchar):

        select field_name
        from #temptable
       
        field_name
        03/02/2009 04:39:51
        03/02/2009 04:39:51
        03/02/2009 04:39:51
        04/16/2009 02:38:35
        04/16/2009 02:38:35
        04/16/2009 02:38:35
       
        (6 row(s) affected)
       
    then ask to convert dates to see if there are errors which are not visible to the eye:

        select cast (field_name as datetime)
        from #temptable

       
        2009-03-02 04:39:51.000
        2009-03-02 04:39:51.000
        2009-03-02 04:39:51.000
        2009-04-16 02:38:35.000
        2009-04-16 02:38:35.000
        2009-04-16 02:38:35.000
       
        (6 row(s) affected)

    Why am I asking? SELECT... WHERE field > '1600-01-01' converts the criteria to match the type of the field, and gives an error due to the code being wrong. SELECT... WHERE field > cast('2000-01-01' as datetime) seems to convert the field to datetime for comparison, and records with invalid dates abort the query due to the fields being wrong. But SELECT fieldname with no criteria or casting should return the raw-format value. Since all fields are valid dates, the conversion should not error. Not an error in the WHERE clause, not an error in implicit conversion of the criteria, but on the MS SQL side. Yet running step by step as above gives no errors. If I knew where the conversion happens, I could stop that step. It seems like an error in transport; maybe it is an Oracle driver problem. But again, same as my last question - if SQL knows which value is not valid, can it not report the error details? That way I would know which field and which value. Reinventing the wheel to search for every possible value, when it could be in the criteria or even in records I don't want, simply because a forced conversion fails... it has taken me a lot of time to find.

    Scripting the temporary table (using syscolumns) says the TO_CHAR results are converted automatically to [varchar](19) NULL, but if I do not specify conversion it is [datetime] so I think cast (field_name as datetime) is a valid test.
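    The cast test described above can be reproduced outside SQL Server; a rough Python sketch, assuming the TO_CHAR output maps to a 24-hour %H (Oracle's HH mask is strictly 12-hour, so HH24 would be the safer format mask):

    ```python
    from datetime import datetime

    def cast_to_datetime(s):
        """Parse a TO_CHAR(..., 'MM/DD/YYYY HH:MI:SS') string, failing loudly
        (ValueError) on any value CAST(... AS datetime) would reject."""
        return datetime.strptime(s, "%m/%d/%Y %H:%M:%S")

    values = ["03/02/2009 04:39:51", "04/16/2009 02:38:35"]
    parsed = [cast_to_datetime(v) for v in values]
    ```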

    I will stop rambling now. There are far too many search results for me to have success figuring this out. Adding TO_CHAR to every possible datetime field when the data is valid is unnecessary, makes the code hard to read, and if you miss one it can become (has become) a production support issue with no root cause - all fields valid, workaround applied, and why the workaround is needed is not explainable.

    Curtis von Spritzel
    Saturday, April 25, 2009 6:31 PM
  • Yes, this is a very, very annoying problem.

    I created a package a couple of months back and suddenly, out of the blue, it started to fail with that stupid error. In my case there were only 18,000 rows and only one datetime column. But why on earth does SSIS interpret DB Datetime as DB Timestamp? DB Timestamp is something completely different!

    Anyway, I did manage to do it by converting the data type. First, in the OLE DB source I created a convert(varchar, dtfield, 13), and then a Data Conversion transformation in the Data Flow converted that varchar back to DB_TIMESTAMP (strange)... but it worked.
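    A sketch of what that varchar round trip does, assuming CONVERT style 13 produces 'dd mon yyyy hh:mi:ss:mmm' (24-hour clock):

    ```python
    from datetime import datetime

    def roundtrip(value):
        """Format a datetime as CONVERT(varchar, dt, 13) would
        ('dd mon yyyy hh:mi:ss:mmm'), then parse it back."""
        s = value.strftime("%d %b %Y %H:%M:%S") + f":{value.microsecond // 1000:03d}"
        main, millis = s.rsplit(":", 1)
        parsed = datetime.strptime(main, "%d %b %Y %H:%M:%S")
        return parsed.replace(microsecond=int(millis) * 1000)
    ```

    Any value that survives this round trip unchanged is one the varchar detour will carry through intact.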


    But no matter what... something is just plain wrong in SSIS regarding this.
    Tuesday, July 14, 2009 5:52 PM