Dumb question

  • Question

  • I have a simple data flow that loads a table from some flat files.  It works properly, but I can't figure out how to add only the rows that don't already exist in the table (so I won't get an error from the duplicate IDs).  I added a Lookup that redirects records that don't match any existing ID, but when I run it I get a timeout error.  It seems to pick up the right number of records to add, but when it gets to the SQL Server Destination it generates a timeout.



    SSIS package "ImportAL3.dtsx" starting.
    Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
    Warning: 0x802092A7 at Data Flow Task, SQL Server Destination [872]: Truncation may occur due to inserting data from data flow column "SampleID" with a length of 4000 to database column "SampleID" with a length of 10.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 17]) + REPLICATE(" ",10 - LEN(TRIM([Column 17])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 2]) + REPLICATE(" ",25 - LEN(TRIM([Column 2])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Information: 0x4004300A at Data Flow Task, DTS.Pipeline: Validation phase is beginning.
    Warning: 0x802092A7 at Data Flow Task, SQL Server Destination [872]: Truncation may occur due to inserting data from data flow column "SampleID" with a length of 4000 to database column "SampleID" with a length of 10.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 17]) + REPLICATE(" ",10 - LEN(TRIM([Column 17])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 2]) + REPLICATE(" ",25 - LEN(TRIM([Column 2])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Information: 0x40043006 at Data Flow Task, DTS.Pipeline: Prepare for Execute phase is beginning.
    Information: 0x40043007 at Data Flow Task, DTS.Pipeline: Pre-Execute phase is beginning.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002785.AL3" has started.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 17]) + REPLICATE(" ",10 - LEN(TRIM([Column 17])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Warning: 0x800470D8 at Data Flow Task, Derived Column [1446]: The result string for expression "TRIM([Column 2]) + REPLICATE(" ",25 - LEN(TRIM([Column 2])))" may be truncated if it exceeds the maximum length of 4000 characters. The expression could have a result value that exceeds the maximum size of a DT_WSTR.
    Information: 0x400490F4 at Data Flow Task, LookupGrade [2832]: component "LookupGrade" (2832) has cached 11 rows.
    Information: 0x400490F4 at Data Flow Task, LookupTestID [5608]: component "LookupTestID" (5608) has cached 0 rows.
    Error: 0xC0202009 at Data Flow Task, SQL Server Destination [872]: An OLE DB error has occurred. Error code: 0x80040E14.
    An OLE DB record is available.  Source: "Microsoft SQL Native Client"  Hresult: 0x80040E14  Description: "Cannot fetch a row from OLE DB provider "BULK" for linked server "(null)".".
    An OLE DB record is available.  Source: "Microsoft SQL Native Client"  Hresult: 0x80040E14  Description: "The OLE DB provider "BULK" for linked server "(null)" reported an error. The provider did not give any information about the error.".
    An OLE DB record is available.  Source: "Microsoft SQL Native Client"  Hresult: 0x80040E14  Description: "Reading from DTS buffer timed out.".
    Information: 0x4004300C at Data Flow Task, DTS.Pipeline: Execute phase is beginning.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002785.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002785.AL3" has ended.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002786.AL3" has started.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002786.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002786.AL3" has ended.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002787.AL3" has started.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002787.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002787.AL3" has ended.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002788.AL3" has started.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002788.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002788.AL3" has ended.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002789.AL3" has started.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002789.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002789.AL3" has ended.
    Information: 0x402090DC at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002790.AL3" has started.
    Information: 0x402090DE at Data Flow Task, Flat File Source [100]: The total number of data rows processed for file "C:\temp\LW002790.AL3" is 1.
    Information: 0x402090DD at Data Flow Task, Flat File Source [100]: The processing of file "C:\temp\LW002790.AL3" has ended.
    Information: 0x40043008 at Data Flow Task, DTS.Pipeline: Post Execute phase is beginning.
    Information: 0x40043009 at Data Flow Task, DTS.Pipeline: Cleanup phase is beginning.
    Information: 0x4004300B at Data Flow Task, DTS.Pipeline: "component "SQL Server Destination" (872)" wrote 6 rows.
    Warning: 0x80019002 at Data Flow Task: The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    Task failed: Data Flow Task
    Warning: 0x80019002 at ImportAL3: The Execution method succeeded, but the number of errors raised (1) reached the maximum allowed (1); resulting in failure. This occurs when the number of errors reaches the number specified in MaximumErrorCount. Change the MaximumErrorCount or fix the errors.
    SSIS package "ImportAL3.dtsx" finished: Failure.

    Tuesday, August 7, 2007 8:00 PM

Answers

  • You have quite a few errors there.

    You might want to replace the SQL Server destination with an OLE DB Destination.

    It seems that you have the right idea using the lookup transformation.  The red arrow will direct non-matches.
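    If you ever load these through a staging table instead, the set-based equivalent of that lookup pattern is an insert filtered with NOT EXISTS - a rough T-SQL sketch (the staging table, destination table, and column names here are made up for illustration):

        -- Insert only the staged rows whose SampleID is not already present
        -- in the destination table, so duplicate keys are skipped.
        INSERT INTO dbo.Samples (SampleID, SampleName)
        SELECT s.SampleID, s.SampleName
        FROM dbo.Samples_Staging AS s
        WHERE NOT EXISTS (
            SELECT 1
            FROM dbo.Samples AS d
            WHERE d.SampleID = s.SampleID
        );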
    Tuesday, August 7, 2007 8:14 PM

All replies

  • Thanks for the help, that worked!

    Actually, I get just one error - the rest are just truncation warnings.  I don't see any need to address those: if a file has a key longer than 4,000 characters, it's been corrupted, and I want the load to fail in that case!  The one error I did get is solved by using an OLE DB Destination.  (I don't intend to scan all the files every time; normally I should get no duplicates, but for the sake of error tolerance I had to put a check in.)

    Now I'm doing a bit of reading to see just why the SQL Server Destination fails in a case like this - everything I'm reading emphasises BULK transfers, and I'm not sure yet of the significance of that.
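    That said, from the error text above it looks like the destination hands the data-flow buffers straight to SQL Server's "BULK" rowset provider - the "Reading from DTS buffer timed out" line seems to be that provider waiting on the pipeline.  If I understand it right, it's conceptually something like the T-SQL below, except the source is the in-memory buffer rather than a file (the file, format file, table, and column names are just placeholders):

        -- Illustrative only: the SQL Server Destination streams the pipeline
        -- buffers through the BULK rowset provider, roughly analogous to a
        -- bulk load from an external rowset like this one.
        INSERT INTO dbo.Samples (SampleID, SampleName)
        SELECT SampleID, SampleName
        FROM OPENROWSET(
            BULK 'C:\temp\samples.dat',           -- placeholder data file
            FORMATFILE = 'C:\temp\samples.fmt'    -- placeholder format file
        ) AS src;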

    Another dumb question: is there any way to post images here?  Since the coding has been replaced by a graphical interface, you can't really post the code as text.
    Wednesday, August 8, 2007 12:42 PM
    You need to have your SQL Server destination database running on the same machine that the package runs on.  Also, a SQL Server Destination won't perform any automatic type conversions - you have to be sure that the metadata going into it is an exact match for the SQL Server table's data types.
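    For example, with the Derived Column expressions above padding values out to 10 and 25 characters, the destination table's columns have to line up with those pipeline columns exactly - same length, and char vs. nchar matching DT_STR vs. DT_WSTR.  A hypothetical table for this load (the table and second column name are made up; only SampleID appears in your log) might look like:

        -- Hypothetical destination table: each column's type and length must
        -- match the data-flow column feeding it, since the SQL Server
        -- Destination performs no implicit conversions.
        CREATE TABLE dbo.Samples
        (
            SampleID   nchar(10) NOT NULL PRIMARY KEY,  -- padded 10-character key from the derived column
            SampleName nchar(25) NOT NULL               -- padded 25-character value from the derived column
        );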
    Wednesday, August 8, 2007 2:03 PM
    Well, the destination database is on the same machine.  And when I cleared the records in question out of the table, removed the SSIS component that checked for duplicate keys, and went straight from the flat files into the database, it worked fine.

    Wednesday, August 8, 2007 2:06 PM