SQL Server - SSIS performance issues

  • Question

  • There is a SQL job (SSIS) on SQL Server 2012 SP2. The job imports data into a table (around 1 million rows) and halts when only a few records (200/300) are left. The session goes into 'sleeping' mode with no wait states. This has happened multiple times. We checked progress with a "select count(*) from ****" query; once the session goes into the sleeping state, the row count shows no progress for hours. Re-running the job resolves the issue. I am looking for the root cause.

    We observed the following:
    a) No blocking.
    b) Stats/indexes on the tables involved - updated last night.
    c) Space on tempdb - available.
    d) Space on the user database drives - available.
    e) Space on the OS drives - available.
    f) Other parallel queries/jobs - found some, but they complete after a while, and even after they finish, the job stays in the sleeping state.
    g) The job uses SSIS to import the data, with components such as temp objects/multicast/cache objects involved; not sure if that could be the issue.
    h) The package was recently upgraded from SQL Server 2008 R2 to SQL Server 2012. I have checked the SSIS known issues as well and found nothing relevant.

    What should I check and investigate next to find the root cause? Any suggestions?
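
    For reference, this is the kind of check I plan to run the next time it hangs, to see whether the sleeping session still holds an open transaction (a sketch against the standard DMVs; the session_id is a placeholder):

    DECLARE @spid int = 75;   -- placeholder: session_id of the import session

    SELECT s.session_id,
           s.status,
           s.open_transaction_count,          -- > 0 means a transaction is still open
           s.last_request_start_time,
           s.last_request_end_time,
           t.transaction_id
    FROM sys.dm_exec_sessions AS s
    LEFT JOIN sys.dm_tran_session_transactions AS t
           ON t.session_id = s.session_id
    WHERE s.session_id = @spid;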

    Thursday, January 5, 2017 4:33 PM

All replies

  • How did the SSIS package perform prior to the upgrade to 2012? Has the package ever performed well? What, if anything, was changed? Was there any change in code to take advantage of 2012 features/enhancements, or was the package just upgraded as-is by opening it in 2012 and saving?

    Is the database recovery model set to FULL or SIMPLE? The FULL recovery model fully logs all operations, which can add a lot of overhead to such large data loads. Consider switching the recovery model to SIMPLE.
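
    A quick way to check the current model and, if appropriate, switch it (the database name is a placeholder):

    SELECT name, recovery_model_desc
    FROM sys.databases
    WHERE name = N'YourImportDb';              -- placeholder database name

    -- Reduce logging overhead for the large load.
    -- Note: switching away from FULL breaks the log backup chain;
    -- take a full backup after switching back.
    ALTER DATABASE [YourImportDb] SET RECOVERY SIMPLE;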

    What logging level is set for the package? If it's Verbose, consider switching to 'None' or 'Performance' to minimize logging overhead.
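
    If the package is deployed to the SSISDB catalog, the logging level can be set per execution; a sketch with placeholder folder/project/package names:

    DECLARE @execution_id bigint;

    EXEC SSISDB.catalog.create_execution
         @folder_name      = N'MyFolder',          -- placeholders
         @project_name     = N'MyProject',
         @package_name     = N'ImportPackage.dtsx',
         @reference_id     = NULL,
         @use32bitruntime  = 0,
         @execution_id     = @execution_id OUTPUT;

    -- Logging levels: 0 = None, 1 = Basic, 2 = Performance, 3 = Verbose
    DECLARE @loglevel smallint = 2;                -- Performance
    EXEC SSISDB.catalog.set_execution_parameter_value
         @execution_id    = @execution_id,
         @object_type     = 50,                    -- execution parameter
         @parameter_name  = N'LOGGING_LEVEL',
         @parameter_value = @loglevel;

    EXEC SSISDB.catalog.start_execution @execution_id;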

    You might want to try breaking up the processing into smaller 'chunks' of data with COMMIT TRAN, to avoid exhausting the transaction log. Here is an example:

    DECLARE @rowid int
    SET @rowid = 100                  -- start row
    WHILE @rowid < 8000000            -- end row
    BEGIN
        BEGIN TRAN

        DELETE FROM [tablename]
        WHERE id <= @rowid

        IF @@ERROR <> 0
            ROLLBACK
        ELSE
            COMMIT

        SET @rowid = @rowid + 100000  -- commit tran every 100,000 rows
    END -- loop
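
    Since the job in question is an import rather than a delete, the same chunking idea applied to the load side might look like this (a sketch only; the staging/target table and column names are hypothetical):

    DECLARE @batch int = 100000, @copied int = 1

    WHILE @copied > 0
    BEGIN
        BEGIN TRAN

        -- Copy the next batch of rows that are not yet in the target table
        INSERT INTO dbo.TargetTable (id, col1, col2)
        SELECT TOP (@batch) s.id, s.col1, s.col2
        FROM dbo.StagingTable AS s
        WHERE NOT EXISTS (SELECT 1 FROM dbo.TargetTable AS t WHERE t.id = s.id)

        SET @copied = @@ROWCOUNT      -- rows moved in this batch

        COMMIT
    END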

    Also, check out some SSIS Performance resources:
    The SSIS tuning tip that everyone misses
    Top 10 SQL Server Integration Services Best Practices
    Steps to Improve ETL performance using SSIS

    Hope that helps,


    Phil Streiff, MCDBA, MCITP, MCSA

    • Edited by philfactor Friday, January 6, 2017 5:49 PM
    Friday, January 6, 2017 3:56 AM
  • Thank you, Phil. I will try this out.
    Friday, January 6, 2017 5:46 PM
  • Hi Mohammad,

    I would probably go with a bulk insert with minimal logging, which will be faster for the upload.

    https://msdn.microsoft.com/en-AU/library/ms175915.aspx
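
    For illustration, a minimally logged load from a flat file might look like this (the file path, terminators, and staging table name are placeholders; TABLOCK plus the SIMPLE or BULK_LOGGED recovery model is needed for minimal logging):

    BULK INSERT dbo.StagingTable
    FROM 'D:\ImportFiles\data.csv'
    WITH (
        FIELDTERMINATOR = ',',
        ROWTERMINATOR   = '\n',
        BATCHSIZE       = 100000,   -- commit in batches to keep log use bounded
        TABLOCK                     -- required for minimal logging
    );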

    Regards,

    kccrga


    -kccrga http://dbatrend.blogspot.com.au/

    Tuesday, January 10, 2017 11:06 PM
  • Hi,
    Try configuring the "Data access mode" option in the OLE DB Destination.
    In the SSIS data flow task, the OLE DB Destination provides a couple of options for pushing data into the destination table under "Data access mode":
    1. "Table or view", which inserts one row at a time;
    2. "Table or view - fast load", which internally uses the bulk insert API to send data to the destination table and generally performs much better than the other options.
    Once you choose the "fast load" option, it gives you more control over the destination table's behavior during the data push, such as Keep identity, Keep nulls, Table lock, and Check constraints.

    Cheers,
    Sunit

    Wednesday, January 11, 2017 12:11 AM