buffer

  • Question

  • I am getting this error when I try to read from a table, and the data flow task just pauses:

    I have an OLE DB source followed by a Sort transformation, then a Lookup and a destination. The package works fine in the development environment, but when I move it to production it just gets stuck.

    I used BufferTempStoragePath but it still gets stuck... any help, please!

     

    First error:

    -1071636284
    The attempt to add a row to the Data Flow task buffer failed with error code 0xC0047020.

    Second error:

    -1073450952
    SSIS Error Code DTS_E_PRIMEOUTPUTFAILED. The PrimeOutput method on component "OLE DB Source" (1) returned error code 0xC02020C4. The component returned a failure code when the pipeline engine called PrimeOutput(). The meaning of the failure code is defined by the component, but the error is fatal and the pipeline stopped executing. There may be error messages posted before this with more information about the failure.


    kkkk
    Friday, May 1, 2009 6:04 PM

Answers

All replies

  • I am guessing that the Sort and Lookup transformations are maxing out memory resources on the prod server. How many rows/columns are you trying to pass through the Sort? If possible, I would get rid of the Sort transformation and add an ORDER BY clause to the source query. In the Lookup, make sure you provide a query with only the columns required to perform the lookup operation, and see if you can add a WHERE clause to limit the number of rows in the lookup cache.
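
    Something along these lines, just as a sketch (table and column names below are made up, not from your actual package):

        -- Illustrative names only; substitute your real source table and columns.
        -- OLE DB Source: push the sort into the database instead of using a Sort transformation.
        -- If a downstream component still needs sorted input, mark the source output as
        -- IsSorted = True and set SortKeyPosition on the key column(s) in the Advanced Editor.
        SELECT CustomerKey, FirstName, LastName, City, Country, ModifiedDate
        FROM dbo.StagingCustomer
        ORDER BY CustomerKey;

        -- Lookup: use a query instead of the whole reference table, return only the
        -- columns you need, and add a WHERE clause to keep the lookup cache small.
        SELECT CustomerKey, CustomerAlternateKey
        FROM dbo.DimCustomer
        WHERE EndDate IS NULL;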


    What other transformation components do you have in that data flow? It may be worth doing a performance sanity check of the whole data flow.
    Rafael Salas | Don’t forget to mark the post(s) that answered your question http://rafael-salas.blogspot.com/
    Friday, May 1, 2009 9:40 PM
  • Well, in the Sort transformation I am passing about 6 columns, and it also removes duplicate rows. As for the Lookup, I am only passing the required columns... The package was designed by somebody else and I am just implementing it... and the only other transformation I have is an OLE DB destination.

    kkkk
    Saturday, May 2, 2009 9:35 PM
  • The Sort transformation is resource intensive. I would look into other alternatives to remove the duplicates:

    http://rafael-salas.blogspot.com/2007/04/remove-duplicates-using-t-sql-rank.html

    Eliminating Duplicate Primary Keys in SSIS.
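
    The core idea from those links, as a rough sketch (table and column names are placeholders):

        -- Illustrative names only.
        -- Remove the duplicates in the source query instead of in a Sort transformation.
        -- ROW_NUMBER() numbers the copies of each key; keeping rn = 1 leaves one row per key.
        WITH numbered AS
        (
            SELECT CustomerKey, FirstName, LastName, City, Country,
                   ROW_NUMBER() OVER (PARTITION BY CustomerKey ORDER BY ModifiedDate DESC) AS rn
            FROM dbo.StagingCustomer
        )
        SELECT CustomerKey, FirstName, LastName, City, Country
        FROM numbered
        WHERE rn = 1;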

    The other thing to check is that the OLE DB destination is using fast load, and that the commit size and rows per batch are set to a manageable number (e.g. 10k).
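
    These are the destination settings I mean (10000 is just an example value; tune it for your load):

        Data access mode           : Table or view - fast load
        Rows per batch             : 10000
        Maximum insert commit size : 10000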

    Try that and let us know.


    Rafael Salas | Don’t forget to mark the post(s) that answered your question http://rafael-salas.blogspot.com/
    • Proposed as answer by Rafael Salas MVP Wednesday, May 6, 2009 10:26 PM
    • Marked as answer by Bob Bojanic Thursday, May 7, 2009 5:32 PM
    Sunday, May 3, 2009 1:39 AM
  • Thanks so much. Last week I replaced the Sort transformation with a RANK query and a Conditional Split, and that brought the initial load of the data warehouse down from two days to 6 hours. I tested everything in development and it seems to work fine. The package was designed by somebody else and I am just taking it over; it is very hard when you don't have proper documentation...
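
    In case it helps somebody else, roughly what I ended up with (table and column names changed):

        -- Illustrative names only.
        -- OLE DB Source query: rank the duplicates instead of sorting the whole flow.
        SELECT CustomerKey, FirstName, LastName, City, Country,
               RANK() OVER (PARTITION BY CustomerKey ORDER BY ModifiedDate DESC) AS rnk
        FROM dbo.StagingCustomer;

        -- Conditional Split: only rows where the SSIS expression  rnk == 1  evaluates to true
        -- go on to the OLE DB destination; the rest are discarded.
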
    thanks a lot


    kkkk
    Wednesday, May 13, 2009 2:27 AM