Importing data from flat files to SQL Server

    Question

  • I have a number of flat files (CSV format) that I need to load into SQL Server.  The tables and files have the same name (example: filename = addresses.csv, table name = addresses).  I'm trying to find a way to cleanly load these files: iterate over the files to be loaded, then import each file into the appropriate table.  I can easily extract the filename (without path and extension) in a Script component and set variables to the table name and/or set a flat file connection's connection string to the input file.  I'm using the Foreach Loop container as my iterator over the files.  The problems occur when I try to dynamically alter the data flow: the transform fails with an error about a failure to get the interface, and the Bulk Insert task fails with an error about being unable to open the file, even though the files are on the same machine where the package is being developed.

    Can I create transforms where I can alter the source and destination objects and let SSIS map input to output columns?
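    The per-file pattern the question describes can be sketched outside SSIS. This is a hypothetical helper (names are mine, not from the thread) that derives the table name from each CSV file name and builds one parameterized INSERT per file by reading the header row; a real load would hand these statements to a provider such as pyodbc rather than just generating text:

```python
import csv
from pathlib import Path

def insert_statements(folder):
    """For each CSV in `folder`, derive the table name from the file name
    (addresses.csv -> addresses) and yield one parameterized INSERT whose
    column list comes from the header row, plus the data rows to bind."""
    for path in sorted(Path(folder).glob("*.csv")):
        table = path.stem                      # file name without extension
        with path.open(newline="") as f:
            reader = csv.reader(f)
            header = next(reader)              # first row holds column names
            cols = ", ".join(f"[{c}]" for c in header)
            marks = ", ".join("?" for _ in header)
            sql = f"INSERT INTO [{table}] ({cols}) VALUES ({marks})"
            yield table, sql, [tuple(r) for r in reader]
```
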

    Tuesday, August 10, 2010 2:08 PM

Answers

  • OK, I misunderstood your original Post.

    In this case, you have to create a Connection Manager and a data flow for every CSV.

    Is the destination SQL Server? Then you need only one Connection Manager for it. In the destination of the data flow, choose the table to store the data in.
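    The layout this answer suggests can be sketched as follows (all names hypothetical): one flat-file connection and one data flow per CSV, with every destination sharing a single SQL Server Connection Manager and pointing at the table that matches the file name:

```python
def plan_package(csv_files, sql_conn_str):
    """Describe one data flow per CSV: a per-file flat-file connection on the
    source side, and a single shared SQL Server connection on the destination
    side, with the target table derived from the file name."""
    dataflows = []
    for path in csv_files:
        table = path.rsplit("\\", 1)[-1].removesuffix(".csv")
        dataflows.append({
            "source_connection": path,               # per-file flat-file connection
            "destination_connection": sql_conn_str,  # shared SQL Server connection
            "destination_table": table,
        })
    return dataflows
```
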

    • Marked as answer by Ira Davis Thursday, August 12, 2010 8:13 PM
    Tuesday, August 10, 2010 5:38 PM

All replies

  • You don't alter the source and destination in the data flow; you alter the connection strings of the Connection Managers. But this will only work if the structure of source and destination is always the same.
    Tuesday, August 10, 2010 2:14 PM
  • So I need to create a data flow for each table/file combination.  I can then set the connection string for the flat file to reference the location of the given file with a script component or via expressions but each flat file connection will be specific to a particular input file.  And each data flow will reference a particular table so the mappings between source and destination will remain constant.

    Thanks for your response.

    Tuesday, August 10, 2010 3:04 PM
  • No, you create one data flow.

    Create the needed Connection Managers.

    Then put the data flow into a Foreach Loop container. In the Foreach Loop container you get the filename, which you pass to your Connection Manager as its connection string.

    In the Foreach Loop container, loop through a table which contains all filenames with full paths. Set the path on the Connection Manager using an expression.
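    As a plain-Python stand-in for that control flow (hypothetical names, not SSIS API): the Foreach iteration supplies each full path, an "expression" assigns it as the connection string, and the table name falls out of the file name:

```python
from pathlib import PureWindowsPath

def foreach_load(file_paths):
    """Mimic the suggested SSIS pattern: for each full path from the loop,
    set the flat-file connection string to that path and derive the target
    table from the file name (addresses.csv -> addresses)."""
    plan = []
    for full_path in file_paths:               # one Foreach Loop iteration per file
        connection_string = full_path          # expression sets ConnectionString
        table = PureWindowsPath(full_path).stem
        plan.append((connection_string, table))
    return plan
```
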

    Tuesday, August 10, 2010 3:26 PM
  • Now I'm really confused.  Let me restate the problem to make sure we're talking about the same thing.

    I have multiple input files.  The schemas of the flat files vary from file to file.  I have a database with tables that are built to match the schemas of the flat files. 

    Can I somehow use a single data flow with flat file source, OLEDB destination, and straight input column to output column mapping via column name to load the tables from the flat files?  I've worked on this for months (on and off) and cannot get it to work.

    File A has 3 columns, int, varchar(10), datetime

    File B has 2 columns varchar(20), int

    Table A has 3 columns int, varchar(10), datetime

    Table B has 2 columns varchar(20), int.

    The flat files are in E:\Extract and each CSV file has a header row with column names.  The column names of the tables are the same (maybe different case) as the file column names.

    Can I use a single data flow to load file A into table A and then alter the properties of the data flow source and destination objects to load file B into table B?

    Tuesday, August 10, 2010 5:11 PM
  • No. The SSIS data flow task (DFT) doesn't support dynamic metadata; the column mappings are fixed at design time.

    But in one data flow you can use two flat file sources, one for File A and one for File B, and map each source to its destination table.
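    The name-based, case-insensitive mapping the question hoped SSIS could do at run time is easy outside the data flow. A minimal sketch (hypothetical function, not an SSIS feature) that matches CSV header names to table column names ignoring case:

```python
def map_columns(csv_header, table_columns):
    """Return {csv_name: table_name} by case-insensitive name matching,
    raising if any CSV column has no counterpart in the table."""
    lookup = {c.lower(): c for c in table_columns}
    mapping = {}
    for name in csv_header:
        if name.lower() not in lookup:
            raise KeyError(f"no table column matches CSV column {name!r}")
        mapping[name] = lookup[name.lower()]
    return mapping
```
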

     


    Wednesday, August 11, 2010 12:27 PM
  • Thanks, all.  That's what I needed to know.
    Thursday, August 12, 2010 8:13 PM