SSIS Package - Data Conversion Error

    Question

  • Hello,

    I'm fairly new to SSIS. I have built a few simple packages for populating a SQL table with data stored in a flat file (pipe delimited) or exporting table data to a flat file. I am working on a package now that populates a SQL table with data contained in a flat file (as mentioned above, I've done this successfully before). I am receiving an error stating "Data conversion failed. The data conversion for column "DailyAccrualAmt" returned status value 6 and status text "Conversion failed because the data value overflowed the specified type.". The column contains a value that has up to 8 digits to the left of the decimal point and up to 7 digits to the right of the decimal point. I have verified that both the field in my SQL table and the column settings in my Flat File Connection are set to Numeric (8,7). I have also tried Decimal (8,7). I don't know why I'm receiving this error. I also make sure the flat file and SQL table are both closed before trying to run the package. Any help in resolving this error will be greatly appreciated. The entire error message is below:

    Thank you,

    Dave

    SSIS package "LnServAccrualInfo.dtsx" starting.

    Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.

    Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.

    Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.

    Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.

    Information: 0x402090DC at Data Flow Task, Flat File Source [1]: The processing of file "\\Fs-03246\Extract Generator Files\LnServAccrualInfo.asc" has started.

    Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.

    Error: 0xC02020A1 at Data Flow Task, Flat File Source [1]: Data conversion failed. The data conversion for column "DailyAccrualAmt" returned status value 6 and status text "Conversion failed because the data value overflowed the specified type.".

    Error: 0xC0209029 at Data Flow Task, Flat File Source [1]: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR.  The "output column "DailyAccrualAmt" (75)" failed because error code 0xC0209087 occurred, and the error row disposition on "output column "DailyAccrualAmt" (75)" specifies failure on error. An error occurred on the specified object of the specified component.  There may be error messages posted before this with more information about the failure.

    Error: 0xC0202092 at Data Flow Task, Flat File Source [1]: An error occurred while processing file "\\Fs-03246\Extract Generator Files\LnServAccrualInfo.asc" on data row 2.

    Error: 0xC0047038 at Data Flow Task: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED.  The PrimeOutput method on component "Flat File Source" (1) returned error code 0xC0202092.  The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing.  There may be error messages posted before this with more information about the failure.

    Error: 0xC0047021 at Data Flow Task: SSIS Error Code DTS_E_THREADFAILED.  Thread "SourceThread0" has exited with error code 0xC0047038.  There may be error messages posted before this with more information on why the thread has exited.

    Error: 0xC0047039 at Data Flow Task: SSIS Error Code DTS_E_THREADCANCELLED.  Thread "WorkThread0" received a shutdown signal and is terminating. The user requested a shutdown, or an error in another thread is causing the pipeline to shutdown.  There may be error messages posted before this with more information on why the thread was cancelled.

    Error: 0xC0047021 at Data Flow Task: SSIS Error Code DTS_E_THREADFAILED.  Thread "WorkThread0" has exited with error code 0xC0047039.  There may be error messages posted before this with more information on why the thread has exited.

    Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.

    Information: 0x402090DD at Data Flow Task, Flat File Source [1]: The processing of file "\\Fs-03246\Extract Generator Files\LnServAccrualInfo.asc" has ended.

    Information: 0x402090DF at Data Flow Task, OLE DB Destination [9]: The final commit for the data insertion has started.

    Information: 0x402090E0 at Data Flow Task, OLE DB Destination [9]: The final commit for the data insertion has ended.

    Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.

    Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "OLE DB Destination" (9)" wrote 0 rows.

    Task failed: Data Flow Task

    Warning: 0x80019002 at LnServAccrualInfo: SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED.  The Execution method succeeded, but the number of errors raised (7) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.

    SSIS package "LnServAccrualInfo.dtsx" finished: Failure.


    David Young

    Saturday, February 18, 2012 5:07 PM

Answers

  •  

    Hi Geoff,

    First off, let me say thank you. The issue is resolved.

    I first tried your suggestion with the Advanced Editor. There was a difference with the precision and scale. I corrected it and verified everything was the same in the SQL table, Flat File Connection Mgr, and the Advanced Editor properties. The package still failed. So I deleted the existing Connection Mgr and recreated it. I again verified the precision and scale were set the same in all three areas; oddly, the precision and scale were different in the Advanced Editor properties. I corrected the settings to be the same as the table and Connection Mgr. I saved my changes and then ran the package. This time it completed successfully.

    I didn't know about the Advanced Editor for the flat file source so you taught me something new which is awesome.

    I'm curious; any idea why the Advanced Editor did not pick up the same precision and scale from the SQL table and/or the Connection Mgr?

    I really appreciate all your feedback and help. It was a huge relief to see everything turn green when executing the package.

    Thanks again,

    Dave


    David Young

    • Marked as answer by MotoX_Dave Wednesday, February 22, 2012 4:29 PM
    Wednesday, February 22, 2012 4:29 PM
  • The flat file source is not 100% linked to the connection manager

    I'm not 100% sure of this, but the way I have seen it work previously is that if you set up the connection manager and then link the flat file source to it, the source takes through the metadata related to data types. If you then change the connection manager after this, that metadata may not update.

    Because of this, some people choose to leave the data types in the connection manager a bit "loose" and set the more restrictive data types in the flat file source - I like to ensure that the data types are consistent all the way through.


    Rgds Geoff
    ----------------------------------------------------------
    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.

    • Marked as answer by MotoX_Dave Thursday, February 23, 2012 4:24 PM
    Wednesday, February 22, 2012 10:23 PM

All replies

  • You can try to route the error rows to another destination (maybe a flat file) using error handling.
    That way you can isolate the problem and take a good look at the offending rows.

    More info:

    http://msdn.microsoft.com/en-us/library/ms141679.aspx


    MCTS, MCITP - Please mark posts as answered where appropriate.
    Answer #1: Have you tried turning it off and on again?
    Answer #2: It depends...

    Saturday, February 18, 2012 9:19 PM
  • Hi Koen,

    I'll give the error handling a shot. I've never done it before, so it should be a good learning experience. Thank you for the link.

    Dave


    David Young

    Saturday, February 18, 2012 11:06 PM
    Go for the error handling, but in this case you would not need to.

    A numeric (8,7) field has a precision of 8 and a scale of 7.

    What that means is that the maximum total number of digits that can be stored (to the left and right of the decimal point combined) is 8. You are setting 7 of these aside for the decimal part, therefore the maximum number you can hold is 9.9999999.

    If you need 8 digits to the left and 7 to the right, you need to set the data type to numeric(15,7).
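    The arithmetic above can be checked outside SQL Server. This is a minimal Python sketch (the `fits_numeric` helper is my own illustration, not an SSIS or SQL Server API) that tests whether a value fits NUMERIC(precision, scale) by counting the digits available to the left of the decimal point:

```python
from decimal import Decimal

def fits_numeric(value, precision, scale):
    """True if value fits NUMERIC(precision, scale):
    the integer part may use at most precision - scale digits."""
    int_part = abs(int(Decimal(value)))            # truncate the fraction
    int_digits = len(str(int_part)) if int_part else 0
    return int_digits <= precision - scale

fits_numeric("9.9999999", 8, 7)         # True: fits NUMERIC(8,7)
fits_numeric("12345678.1234567", 8, 7)  # False: 8 integer digits > 8 - 7 = 1
fits_numeric("12345678.1234567", 15, 7) # True: 8 <= 15 - 7
```

    The second call reproduces the overflow from the error log: with NUMERIC(8,7) only one digit is left for the integer part.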


    Rgds Geoff

    -----------------------------------------------------------

    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.


    Monday, February 20, 2012 5:09 AM
  • Hi Geoff,

    Thanks for your reply. I don't fully understand your explanation but I will read more on data types to get a better understanding. I will try your solution and let you know how it works out. I still want to try the error handling since I think it will be a good learning experience.

    Thanks again,

    Dave


    David Young

    Monday, February 20, 2012 5:29 AM
    Hi Dave - certainly the error handling will be a good learning experience - it will be essential for more complex projects

    The Numeric data type in SQL Server is defined by two elements

    Precision - this is the TOTAL number of digits that can be stored whether to the left or right of the decimal

    Scale - this is the number of digits that can be stored to the right of the decimal

    To be able to store the value 12345.6789

    you would need at least numeric(9,4)

    To be able to store the value 12345678.9

    you would need at least numeric(9,1)

    To be able to store the value 123456789.123456789

    you would need at least numeric(18,9)
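    The three examples above can also be derived mechanically. A small Python sketch (the `min_numeric` helper is hypothetical, not a library function) computes the smallest precision and scale that store a decimal literal exactly:

```python
from decimal import Decimal

def min_numeric(value):
    """Smallest (precision, scale) of a NUMERIC that stores value exactly."""
    _, digits, exp = Decimal(value).as_tuple()
    scale = max(-exp, 0)                    # digits right of the decimal point
    precision = len(digits) + max(exp, 0)   # total significant digits
    return (max(precision, scale), scale)   # precision must at least cover scale

min_numeric("12345.6789")           # (9, 4)
min_numeric("12345678.9")           # (9, 1)
min_numeric("123456789.123456789")  # (18, 9)
```

    Each result matches the minimums quoted above: numeric(9,4), numeric(9,1), and numeric(18,9).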


    Rgds Geoff

    -----------------------------------------------------------

    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.


    Monday, February 20, 2012 5:34 AM
  • Hi Geoff,

    I tried changing the precision and scale as you mentioned, but unfortunately I'm still receiving a data flow conversion error for the same field. The source file (pipe delimeted) is being generated by a Business Objects (BO) report (don't know if your familiar with BO). I'm going to go back into the BO report and check the format of the field which I believe is set to number but I'll double check it. I'll also see if I can find out if there is possibly some form of padding in the format of the field that I haven't detected, which would throw of the precision & scale. If that's not the issue, then I will try the error handing process as suggested by Koen. I will keep you posted.

    Thanks,

    Dave


    David Young

    Tuesday, February 21, 2012 2:49 PM
  • Hi Dave - have you updated the data type in SSIS as well?

    I am pretty familiar with BusObj and I doubt it is anything to do with the output format - from what I recall it outputs pretty default data types to either .txt or .csv files

    A concept to understand is that data flow elements have their own data type definitions, which must be able to hold the values you are passing, as well as the eventual SQL table data type definition

    The data type is normally set in the flat file connection properties (in the advanced tab) and the metadata from that flows onwards through the package - only changing if the field is converted or manipulated

    What is the data type set up in the flat file connection and what is the data type in the SQL table?

    From what you describe I would be using Numeric(15,7) for the SQL table and DT_DECIMAL for the flat file connection


    Rgds Geoff
    ----------------------------------------------------------
    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.


    Wednesday, February 22, 2012 12:41 AM
  • Hi Geoff,

    You are right; it's not the data produced by Bus Obj. The output flat file is a pipe delimited file (.asc) and I see no issues with the data.

    In the flat file connection manager, I have the field set as numeric [DT_NUMERIC](15,7) and the SQL table is also set as Numeric(15,7). I did play around with changing the data types to Decimal but changed them both back to Numeric since Decimal didn't work.

    I'm definitely missing something and whatever I'm not seeing is the cause of my error. I actually built two other Bus Obj reports today and then SSIS packages populating SQL tables from those reports with no issues. They both worked flawlessly. So I'm at a loss with the error I am receiving at this point.

    I didn't get a chance today to try the error handling due to other work obligations such as the two new reports I mentioned above. I hope to have time tomorrow to give it a shot. 

    I appreciate your advice, and any other suggestions for a possible resolution will be greatly appreciated. I give a lot of credit to these forums and the people in them for giving their time to help people like me who are new to Integration Services.

    Thank you,

    Dave


    David Young

    Wednesday, February 22, 2012 2:04 AM
    OK - so the issue is certainly within the SSIS metadata as opposed to the SQL table (based on the error message)

    Can you see what the biggest integer value and the biggest decimal value are in the DailyAccrualAmt field?

    Can you confirm that you have set the data precision to 15 and the data scale to 7?

    Have you set these properties in the CONNECTION or in the FLAT FILE SOURCE (which would be the 1st element of your data flow)?


    Rgds Geoff
    ----------------------------------------------------------
    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.

    Wednesday, February 22, 2012 2:46 AM
  • Hi Geoff,

    I'll check for the largest integer and decimal values tomorrow morning at work.

    I have the precision set to 15 and the scale set to 7 in the advanced properties within the Flat File Connection Manager Editor.

    Thanks,

    Dave


    David Young

    Wednesday, February 22, 2012 4:14 AM
  • It will be worth right clicking on your flat file source element and choosing "Show Advanced Editor"

    Go to the input and output properties tab and check the data definition of the field in both the "External Columns" and "Output Columns" lists

    My guess is it needs to be updated there too - this will normally take the metadata from the connection manager when it is first linked up, but may not update if you update the connection manager afterwards


    Rgds Geoff
    ----------------------------------------------------------
    Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.

    Wednesday, February 22, 2012 5:24 AM
  • Thanks for the explanation, Geoff. Since I may not remember, I made sure to document the steps taken to resolve this issue in case it happens again in the future.

    Dave


    David Young

    Thursday, February 23, 2012 4:25 PM