Wednesday, March 07, 2012 8:24 PM
I have an SSIS package which loads from a Flat File Source to a SQL Server destination (using an OLE DB Destination). The flat file has about 40 columns (~90k records). I am using 15 Derived Column and 15 Union All components in the Data Flow.
The issue is that when I run it on my local machine it runs fine, but when I deployed it and ran it as a job on the server it failed with this error. I am not using any package configurations.
"The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020."
Has anyone faced a similar issue? Do I need to take extra steps while deploying? Any help is appreciated.
Wednesday, March 07, 2012 8:30 PM (Moderator)
Looks like a timeout, likely because you use so many Union Alls; in any case, the fixes are in this link: http://social.msdn.microsoft.com/Forums/en/sqlintegrationservices/thread/8a3558df-221a-45e6-8e08-c23c987c12a9
Arthur My Blog
Wednesday, March 07, 2012 8:51 PM
Thanks for Reply.
The link you gave suggests the following:
"I guess the problem is in Time Out, because after I changed OLEDB Destination to SQL Server Destination, I still got that error, but after I changed Time Out to higher value, the package can be executed successfully and all the records are inserted to destination table."
I checked the timeout in my OLE DB Destination; it was set to 0 (zero), which means an infinite timeout.
The link suggests increasing the timeout, but 0 is already the maximum, so how can I increase it? Is there a best practice for the timeout value, and am I even looking at the correct property?
- Edited by SSIS Dada Wednesday, March 07, 2012 8:51 PM
Wednesday, March 14, 2012 8:57 AM (Moderator)
Hi Okkadu1,
This error is a general message; it may be caused by limited memory. Please change the DefaultBufferMaxRows and DefaultBufferSize properties to suitable values. For more information, please see: http://msdn.microsoft.com/en-us/library/ms141031.aspx
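To illustrate how those two Data Flow properties interact (this sketch is not from the thread, and the estimated row size below is a made-up figure for a wide 40-column row): SSIS allocates pipeline buffers of at most DefaultBufferSize bytes holding at most DefaultBufferMaxRows rows, so the effective rows per buffer is whichever cap is hit first.

```python
# Illustrative sketch (not SSIS code) of the DefaultBufferSize /
# DefaultBufferMaxRows relationship. 10 MB and 10,000 rows are the
# SSIS defaults; estimated_row_size is a hypothetical value.

def rows_per_buffer(default_buffer_size, default_buffer_max_rows, estimated_row_size):
    """Rows that fit in one pipeline buffer: capped by both properties."""
    return min(default_buffer_max_rows, default_buffer_size // estimated_row_size)

# With the defaults and a wide 4,000-byte row, the byte limit dominates:
print(rows_per_buffer(10 * 1024 * 1024, 10_000, 4_000))   # -> 2621
# With a 10x larger buffer, the row-count limit dominates instead:
print(rows_per_buffer(100 * 1024 * 1024, 10_000, 4_000))  # -> 10000
```

The point of tuning is to keep both limits in balance so buffers are neither starved nor oversized for available memory.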
And here is a similar thread, please refer to:
Wednesday, March 14, 2012 9:49 AM
40 columns and 90k records is not that huge
Issue is most likely with the union all components - what purpose are they serving?
Mark as Answer if this resolves your problem or "Vote as Helpful" if you find it helpful.
Wednesday, March 14, 2012 1:21 PM (Moderator)
I would recommend changing the package design. Simply loading a flat file into a database should not take so many Union Alls. I do not know what these unions are for, but it seems more prudent to load the file into a staging table and then transform the data using the database's facilities; this way we eliminate the error for sure.
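The stage-then-transform pattern described above can be sketched outside SSIS as well. The following is a minimal illustration using Python's built-in sqlite3 (the table and column names are hypothetical, not from the original package): raw rows are bulk-loaded into a staging table unchanged, and the derivations happen in one set-based SQL statement instead of many per-row transforms.

```python
import sqlite3

# Minimal sketch of the stage-then-transform pattern (hypothetical schema).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging (raw_amount TEXT, raw_region TEXT)")
conn.execute("CREATE TABLE target (amount REAL, region TEXT)")

# Step 1: bulk-load the flat-file rows as-is (no per-row transforms).
rows = [("100.5", " east "), ("200.0", " west ")]
conn.executemany("INSERT INTO staging VALUES (?, ?)", rows)

# Step 2: one set-based transform replaces the derived-column/union-all work.
conn.execute("""
    INSERT INTO target (amount, region)
    SELECT CAST(raw_amount AS REAL), TRIM(raw_region)
    FROM staging
""")
print(conn.execute("SELECT amount, region FROM target ORDER BY amount").fetchall())
# -> [(100.5, 'east'), (200.0, 'west')]
```

In the real scenario the same idea would use a bulk insert into a SQL Server staging table followed by a single INSERT ... SELECT, letting the database engine do the set-based work.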
Arthur My Blog
Wednesday, March 14, 2012 2:11 PM
Hi guys,
Thanks for your valuable suggestions. My problem was actually resolved that day by changing the DefaultBufferSize property of the Data Flow Task: I increased it tenfold and the package started working.