SSIS - Flat File Import and duplicates


  • Hi,

    Here is the background.  I have an SSIS task configured to:
     - Copy all log files contained within a network share locally to the server.
     - Parse those text files and populate the DB

    The problem is, I need to find a way to prevent duplicate entries.  As it is configured, every time the project runs it copies the entire directory over (including anything processed the last time it ran) and adds duplicate entries to the DB.  I need a way to have it copy or add only distinct entries based on criteria within the file (date and machine name, for example).

    Any ideas?
    Wednesday, July 22, 2009 5:02 PM


  • Then you have to keep the file location, name, size in KB, and other properties in a table, and each time the SSIS package runs, compare each file against that table so you can see whether it has already been loaded.
    But still, I am sure there will be lots and lots of questions arising, for example:
    what happens if they modify some records in a file?
    what happens if a file moves to another location?
    what happens if only some records are added?
    etc.

    You have to ask for a business rule before you build your SSIS package.
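    The file-tracking idea could be sketched roughly like this (illustration only, using Python and SQLite in place of an SSIS Script Task and SQL Server; the ProcessedFiles table and all column names are hypothetical):

    ```python
    import os
    import sqlite3

    # Hypothetical tracking table; schema and names are illustrative.
    conn = sqlite3.connect(":memory:")
    conn.execute(
        """CREATE TABLE IF NOT EXISTS ProcessedFiles (
               FilePath  TEXT PRIMARY KEY,
               SizeBytes INTEGER,
               Modified  REAL
           )"""
    )

    def files_to_load(share_dir):
        """Return only files whose path/size/mtime are not already recorded."""
        pending = []
        for name in sorted(os.listdir(share_dir)):
            path = os.path.join(share_dir, name)
            size = os.path.getsize(path)
            mtime = os.path.getmtime(path)
            seen = conn.execute(
                "SELECT 1 FROM ProcessedFiles "
                "WHERE FilePath = ? AND SizeBytes = ? AND Modified = ?",
                (path, size, mtime),
            ).fetchone()
            if seen is None:
                pending.append(path)
        return pending

    def mark_processed(path):
        """Record a file's properties after it has been parsed and loaded."""
        conn.execute(
            "INSERT OR REPLACE INTO ProcessedFiles VALUES (?, ?, ?)",
            (path, os.path.getsize(path), os.path.getmtime(path)),
        )
        conn.commit()
    ```

    Note this only detects whole-file changes (size/modified time); it does not answer the record-level questions above, which is exactly why you need the business rule first.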

    Anyway, the way I handled that was to reload the records by upserting them into the table; that was the only choice I had.
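    The upsert approach might look like this (a sketch using SQLite's INSERT ... ON CONFLICT for illustration; on SQL Server you would typically do the equivalent with a MERGE statement against a staging table; the table and column names are hypothetical, with the (LogDate, MachineName) key taken from the original question):

    ```python
    import sqlite3

    conn = sqlite3.connect(":memory:")
    # (LogDate, MachineName) is the natural key the question suggests.
    conn.execute(
        """CREATE TABLE LogEntries (
               LogDate     TEXT,
               MachineName TEXT,
               Message     TEXT,
               PRIMARY KEY (LogDate, MachineName)
           )"""
    )

    def upsert_entry(log_date, machine, message):
        """Insert a parsed log row, or update it if the key already exists,
        so re-running the whole import never creates duplicates."""
        conn.execute(
            """INSERT INTO LogEntries (LogDate, MachineName, Message)
               VALUES (?, ?, ?)
               ON CONFLICT(LogDate, MachineName)
               DO UPDATE SET Message = excluded.Message""",
            (log_date, machine, message),
        )
        conn.commit()

    # Re-processing the same file just overwrites the same keys:
    upsert_entry("2009-07-22", "SRV01", "first run")
    upsert_entry("2009-07-22", "SRV01", "second run")
    ```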


    Wednesday, July 22, 2009 5:19 PM
