DTS_E_INDUCEDTRANSFORMFAILUREONERROR - SQL Server 2005 -> Excel 2007

Question
-
Hello, I have been wrestling with trying to transfer SQL Server 2005 data (OLE DB Source) into Excel (OLE DB Destination) using the Excel 12.0 Office driver, as I have more than 65,000 rows. I also have an NTEXT column included in the SQL Server data (for this I have made the registry change to scan more than the default 8 rows before deciding the data type). The procedure I am using has worked for two other packages involving imports into Excel; they deal with different data, however they also include some NTEXT columns and have imported into Excel fine - the difference, however, is that those have fewer than 65,000 rows.
The error messages I am receiving are:
[OLE DB Destination [11400]] Error: SSIS Error Code DTS_E_OLEDBERROR. An OLE DB error has occurred. Error code: 0x80004005.
[OLE DB Destination [11400]] Error: SSIS Error Code DTS_E_INDUCEDTRANSFORMFAILUREONERROR. The "input "OLE DB Destination Input" (11413)" failed because error code 0xC020907B occurred, and the error row disposition on "input "OLE DB Destination Input" (11413)" specifies failure on error. An error occurred on the specified object of the specified component. There may be error messages posted before this with more information about the failure.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_PROCESSINPUTFAILED. The ProcessInput method on component "OLE DB Destination" (11400) failed with error code 0xC0209029. The identified component returned an error from the ProcessInput method. The error is specific to the component, but the error is fatal and will cause the Data Flow task to stop running. There may be error messages posted before this with more information about the failure.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "WorkThread0" has exited with error code 0xC0209029. There may be error messages posted before this with more information on why the thread has exited.
[OLE DB Source [11310]] Error: The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (11310) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.
[DTS.Pipeline] Error: SSIS Error Code DTS_E_THREADFAILED. Thread "SourceThread0" has exited with error code 0xC0047038. There may be error messages posted before this with more information on why the thread has exited.
It falls over after around 3,800 rows. I have tried redirecting the error rows to a file, however all rows (around 70,000) are routed into the redirection file with error code "-1071607685" and -1 for the column, which is not very informative. Further, the message above doesn't give any indication of the column(s) at fault. I have tried eliminating certain columns but it makes no difference.
Does anyone have any ideas?
Many thanks
Travis
Thursday, October 23, 2008 2:28 AM
Answers
-
Hi Roman, many thanks for your suggestions - they convinced me to seek an alternative path to using the Excel driver (which is problematic in so many ways: NTEXT fields, deleting/recreating the export file, 32-bit/64-bit, deployment to the server). I finished up using XSLT to generate an XML file in SpreadsheetML format. I am now able to seamlessly generate an export file that imports nicely into Excel regardless of the field types, number of rows, etc. The XSLT is generic and hence can be used without knowing the number of columns, column names, etc.
Thanks again
Travis
Friday, October 24, 2008 3:44 PM
All replies
-
Hello,
Try to do the following things:
1. Calculate the memory needed for your 65,000 rows (a sizing sketch follows below).
2. Increase the DefaultBufferSize of the Data Flow Task (by default it is set to 10 MB).
Thursday, October 23, 2008 9:31 AM Answerer
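One way to get the numbers for step 1 is to measure the source data itself. A minimal sizing sketch, assuming MySourceTable and MyNtextColumn are placeholders for the real source table and its NTEXT column, and the 200-byte figure is only a rough stand-in for the combined width of the remaining columns:
Code Snippet
-- Rough per-row sizing for the DefaultBufferSize calculation.
SELECT
    COUNT(*)                                       AS TotalRows,
    MAX(DATALENGTH(MyNtextColumn))                 AS MaxNtextBytes,
    AVG(CAST(DATALENGTH(MyNtextColumn) AS bigint)) AS AvgNtextBytes
FROM MySourceTable;
-- Compare (AvgNtextBytes + ~200 bytes for the other columns) * DefaultBufferMaxRows
-- against DefaultBufferSize (10,485,760 bytes by default).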
-
Much appreciated, Roman - this did make a difference. I had a play with the DefaultBufferSize and DefaultBufferMaxRows and it processed around 38,000 rows (rather than previously, when it would only process 3,800) before falling over. However, it is still failing. I am not concerned with performance as the job runs overnight. Is there a way to make it move smaller chunks under the thresholds so that it completes successfully? I tried changing both the DefaultBufferSize and DefaultBufferMaxRows to higher and lower amounts to no avail.
Many thanks
Travis
Thursday, October 23, 2008 10:53 AM
-
Yes, you can use the following algorithm to process the data "page by page".
1. Retrieve the number of pages to process, assuming there are 10,000 rows per page:
Code Snippet
select (COUNT(*) / 10000) + (CASE WHEN (COUNT(*) % 10000) > 0 THEN 1 ELSE 0 END) from MySourceTable
2. In a Foreach Loop (one page per iteration), retrieve the 10,000 rows for that page number using the ROW_NUMBER() feature - see ROW_NUMBER (Transact-SQL). A sketch of such a paged query follows below.
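A minimal sketch of what the per-page source query might look like; MySourceTable, MyKeyColumn and the @PageNumber value are placeholders, and in the package the page number would come from the Foreach Loop rather than a hard-coded SET:
Code Snippet
DECLARE @PageNumber int, @PageSize int;
SET @PageNumber = 1;      -- supplied by the Foreach Loop in practice
SET @PageSize   = 10000;
WITH Numbered AS
(
    -- Number every source row once, ordered by a stable key column
    SELECT *, ROW_NUMBER() OVER (ORDER BY MyKeyColumn) AS rn
    FROM MySourceTable
)
SELECT *
FROM Numbered
WHERE rn >  (@PageNumber - 1) * @PageSize
  AND rn <= @PageNumber * @PageSize;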
Thursday, October 23, 2008 1:17 PM Answerer
-
Roman, many thanks for the reply - manually iterating through the results of the query, returning batches of records, will work; however, I was really looking for more of an SSIS/buffering type solution. I guess I am just staggered by the number of shortfalls within SSIS that I have already had to overcome to even get to this point, only to be left having to apply a (fairly time-consuming) workaround yet again.
Is there any other alternative for moving large amounts of data from SQL Server 2005 to Excel 2007?
Cheers
Travis
Thursday, October 23, 2008 2:41 PM
-
Try the following:
1. Retrieve the maximum length of the NTEXT column values.
2. Cast your NTEXT columns to varchar(max_length_ntext_columns); this can reduce buffer allocations. A sketch of both steps follows below.
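A sketch of both steps in T-SQL, assuming MySourceTable, MyNtextColumn and the other column names are placeholders, and that 1500 stands in for whatever length the first query returns:
Code Snippet
-- Step 1: longest NTEXT value in characters (DATALENGTH returns bytes and
-- NTEXT stores two bytes per character).
SELECT MAX(DATALENGTH(MyNtextColumn)) / 2 AS MaxNtextChars
FROM MySourceTable;
-- Step 2: cast in the source query so the pipeline allocates a narrower
-- column (use nvarchar instead of varchar if Unicode data must be kept).
SELECT OtherColumn1,
       OtherColumn2,
       CAST(MyNtextColumn AS varchar(1500)) AS MyNtextColumn
FROM MySourceTable;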
- Proposed as answer by vkeg Wednesday, July 15, 2009 10:04 AM
Thursday, October 23, 2008 3:30 PM Answerer
-
Hi Roman, many thanks for your suggestions - they convinced me to seek an alternative path to using the Excel driver (which is problematic in so many ways: NTEXT fields, deleting/recreating the export file, 32-bit/64-bit, deployment to the server). I finished up using XSLT to generate an XML file in SpreadsheetML format. I am now able to seamlessly generate an export file that imports nicely into Excel regardless of the field types, number of rows, etc. The XSLT is generic and hence can be used without knowing the number of columns, column names, etc.
Thanks again
Travis
Friday, October 24, 2008 3:44 PM
-
Hi, I ran into a similar problem. I don't have an NTEXT field, but I have many columns of data to be extracted for about 400K rows. I used the Import/Export task with a query source to write an output Excel file, hoping it would take care of limiting each page to 65K rows, but it didn't; instead it threw an error after reaching the maximum Excel page row.
Can you give me a little more detail on when/where I should use the code snippet and follow your suggestion with ROW_NUMBER, whether I use the Import/Export task or create the SSIS package for it?
Many Thanks in Advance,
Mydieu
mnguyen@foundation.fsu.edu
the10swacky@hotmail.com
NewKidinReportingServicesBlock
- Proposed as answer by vkeg Wednesday, July 15, 2009 10:04 AM
Thursday, July 2, 2009 1:34 PM
-
....................
Wednesday, July 15, 2009 10:05 AM