SSIS Data Flow Pipeline Stops and success is displayed - yet missing 800,000 rows

  • Question

  • I'm using the most recent version of 2017 Professional, with the Oracle OLEDB drivers installed July 2018, and a very simple OLEDB connection to an Oracle view on the source side. I'm sending directly to a clean, flat SQL Server table with no indexes, constraints, or anything else.  There are no transformations or any other SSIS objects between the source and destination, just a pure pipeline with two OLEDB connections.  I have done no tuning and no parameters are being used.  I want the entire view to populate the SQL Server table.  I am getting only 130,000 of the 1.1 million rows in the source Oracle view, and the package completes with no errors except a shared memory warning that appears even when I limit the row count at the view to 500 rows (so I think that error message is an SSIS bug: https://social.msdn.microsoft.com/Forums/sqlserver/en-US/2f2d9883-59ba-4356-ad68-05e67537e749/ssispipeline-warning-warning-could-not-open-global-shared-memory-to-communicate-with?forum=sqlintegrationservices#2f2d9883-59ba-4356-ad68-05e67537e749 ).

    Things tried:

    We expanded the table space on the destination database, which resulted in the same row count at the point of completion.

    Things not tried:

    Tuning buffers, since speed was not an issue and the 130k rows arrived fine

    Monitoring pipeline errors

    My questions are these:

    1. What could cause this view, and another like it (about 800 columns wide), to stop without reporting any error that interrupted the data flow in the "Execution Results"?
    2. With ONLY the Source and Destination OLEDB connections tied to each other, can I FORCE it to resume after a data error instead of stopping at the point of error (if that is what it is doing)?
    3. Is error monitoring a requirement in order to FORCE a RESUME of the package processing when an error occurs?

    Thanks in advance for your help with this.


    Les Draper





    • Edited by Les Draper Thursday, July 26, 2018 2:11 PM
    Thursday, July 26, 2018 1:53 PM

All replies

  • Hi Les Draper,

    You first need to determine which component is causing the missing records (source or destination).

    You can use a row-count query (for example, COUNT(*) or ROW_NUMBER()) in an Execute SQL Task to check whether each side returns the correct number of rows.
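    As a sketch, comparing a simple COUNT(*) on each side will show which component drops the rows (the view and table names below are hypothetical, not from the original post):

```sql
-- On the Oracle source (hypothetical view name):
SELECT COUNT(*) AS src_rows
FROM   baninst1.my_source_view;

-- On the SQL Server destination (hypothetical table name):
SELECT COUNT(*) AS dest_rows
FROM   dbo.my_dest_table;
```

    If src_rows already comes back short, the problem is on the Oracle/provider side; if src_rows is correct but dest_rows is short, look at the destination.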

    Another possible reason for missing records is related to the date format; please have a look at:

    Resolving Missing Records in SSIS from Oracle Source
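    If the date-format issue mentioned above is the cause, one hedged workaround is to cast date columns explicitly in the source query instead of selecting the raw view (the view and column names here are hypothetical):

```sql
-- Cast DATE columns to text explicitly so the OLEDB provider
-- does not fail on an unconvertible value mid-stream
-- (hypothetical view and column names):
SELECT emp_id,
       TO_CHAR(hire_date, 'YYYY-MM-DD HH24:MI:SS') AS hire_date
FROM   baninst1.my_source_view;
```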

    If you don't need any transformation between Oracle and SQL Server, you can also try SSMA.

    Getting Started with SSMA for Oracle (OracleToSQL)

    Regards,

    Pirlo Zhang 


    MSDN Community Support
    Please remember to click "Mark as Answer" the responses that resolved your issue, and to click "Unmark as Answer" if not. This can be beneficial to other community members reading this thread. If you have any compliments or complaints to MSDN Support, feel free to contact MSDNFSF@microsoft.com.

    Friday, July 27, 2018 9:46 AM
    Moderator
    This connection-string property

    UseSessionFormat=True;

    was added through Extended Properties on the connection manager for the Oracle version of the OLEDB connector, and it made no change to the row limitation.

    Data Source=TEST1;User ID=BANINST1;Provider=OraOLEDB.Oracle.1;Extended Properties="UseSessionFormat=True;";

    I tried to put that value directly into the connection string, but the Connection Manager kept removing it; it does keep the Extended Properties format shown above.  Either way, there was no effect on the resulting number of rows.

    The source completed first, suggesting to me that it has no more rows to process (when it clearly does).  The source has 4.4 million rows in it and none should be inhibited.  I pumped all 4.4 million rows from the view into a table and reconnected that table as my source.
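    Materializing the view into a plain table, as described above, can be done on the Oracle side with a CREATE TABLE AS SELECT (a sketch; the object names are hypothetical, not from the original post):

```sql
-- Copy every row of the view into a flat staging table
-- (hypothetical names):
CREATE TABLE baninst1.my_source_copy AS
SELECT * FROM baninst1.my_source_view;
```

    This removes any view logic from the picture, so the pipeline reads a simple heap of rows.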

    This is a bug.  The source is cutting off prematurely at the same place with no error reporting.

    I have both the latest Oracle drivers and the latest version of SSIS as of Aug 27, 2018.  The target table was created on SQL Server, and 900,000 rows do land in the SQL Server destination table.

    I never did get the table to come across to SQL Server.  This is a new table that is having the same issue and I found no workaround for that one either.  Why does SSIS have this problem?


    Les Draper

    • Proposed as answer by ArthurZ (Moderator) Friday, March 15, 2019 6:59 PM
    • Unproposed as answer by Les Draper Saturday, March 16, 2019 3:17 AM
    Monday, August 27, 2018 8:22 PM
  • Hi Les Draper,

    Is the problem still open? I have the same problem and have not found a solution.


    Friday, March 15, 2019 2:27 PM