dynamically selecting tables in OLEDB Source

  • Question


  • hello guys,

    I have 10 tables, table1, table2, table3, table4.......table10. All these tables have different structures.
    From each of these tables I want to extract data and dump it into a flat file (CSV).

    So I have an OLEDB Source and a Flat File Destination.

    If I write a separate data flow task for each of the tables, then there is no issue.

    But I want to use a single data flow task for all these tables. So for this, I use a variable @SQLStr, and I dynamically set the value of this variable to select * from table1, select * from table2, ... select * from table10.

    So in the OLEDB Source I set the Data Access Mode to "SQL command from variable", and I use @SQLStr as that variable.

    For the destination, I dynamically generate the flat file for each of the tables.

    But this doesn't work; I get validation errors.

    So, first: can I use a single data flow task to dynamically change the source (different source tables) and the destination? If so, what am I missing in the above process?

    Any help appreciated.

    Thanks






    Monday, March 17, 2008 8:16 PM

Answers

  • This won't work.  The metadata is fixed at design time and cannot change during execution.

     

    You'll need to build ten data flows, or use .NET scripting entirely, to do this.

     

    Monday, March 17, 2008 8:24 PM
  •  RJDBA wrote:

    hello guys,

    I have 10 tables, table1, table2, table3, table4.......table10. All these tables have different structures.
    From each of these tables I want to extract data and dump it into a flat file (CSV).

    <snip>

    So, first: can I use a single data flow task to dynamically change the source (different source tables) and the destination? If so, what am I missing in the above process?

    Any help appreciated.

     

    Unfortunately, SSIS doesn't support this scenario. The data flow metadata must be set before the package executes and cannot change at runtime.

     

    The simplest solution is to use the Import/Export Wizard to build 10 different data flows, and then edit the package in Visual Studio as necessary.

    Monday, March 17, 2008 8:31 PM
  • One way that you can work around the metadata limitation is to make sure the metadata stays the same. Instead of using "SELECT * FROM table1", use "SELECT columnA + ', ' + columnB AS formatted_row FROM table1". This effectively means you return all the columns as a single delimited column. Then you can send that to a flat file destination.

     

    Of course, this means writing more SQL, and casting all your columns to strings so that they concatenate properly. If you store the SQL statements in a table, you can use that as the source for a For Each loop.
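    For example, a sketch of what those per-table statements might look like (the table and column names here are hypothetical):

        -- Each query casts its columns to strings and concatenates them with the
        -- CSV delimiter, so every statement returns the same metadata:
        -- a single string column named formatted_row.
        SELECT CAST(columnA AS VARCHAR(50)) + ',' + CAST(columnB AS VARCHAR(50))
               AS formatted_row
        FROM   table1

        SELECT CAST(columnX AS VARCHAR(50)) + ',' + CAST(columnY AS VARCHAR(50))
               + ',' + CAST(columnZ AS VARCHAR(50)) AS formatted_row
        FROM   table2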

     

    Just a slightly different way to approach it.

     

    Tuesday, March 18, 2008 1:09 AM

All replies

  • Thank you all for replying.

    So since we cannot do this in a single data flow task, I am trying it with a separate data flow task for each source table. But I want to do this as below:


    So lets say i have 3 tables TableA, TableB, TableC  from which i need to export data. So i create a table (TableList) where I save these table names and a unique id to these tables. e.g.

    TableList will have-
    TableName TableID
    TableA         1
    TableB         2
    TableC        3


    In the SSIS package, I select these table names & IDs from TableList in an Execute SQL Task, and assign the result set to an Object variable (@TableList).
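    The Execute SQL Task query itself is just:

        SELECT TableName, TableID FROM TableList

    with the task's ResultSet set to "Full result set" and the result mapped to @TableList.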

    Then I use a For Each Loop Container (Foreach ADO Enumerator) to loop through these table names & IDs.

    Inside this loop container, I define three data flow tasks, one for each table. So I have DataFlowTaskA (for TableA), DataFlowTaskB (for TableB), and DataFlowTaskC (for TableC).

    Now, for the table selected in a given iteration, only the corresponding data flow task should be executed. E.g. in the 1st iteration, if TableA is selected, then only DataFlowTaskA should be executed and DataFlowTaskB & C should be skipped.

    In order to achieve this, I am using 3 Boolean variables, @FlagA, @FlagB, and @FlagC, one for each table, and I use the value of these flags in an expression on the Disable property of each data flow task (so @FlagA is used for the Disable property in the expression for DataFlowTaskA, and so on).
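    So the Disable property expression on each task simply references its flag (the variables are in the default User namespace):

        DataFlowTaskA:  Disable = @[User::FlagA]
        DataFlowTaskB:  Disable = @[User::FlagB]
        DataFlowTaskC:  Disable = @[User::FlagC]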

    So as the first step inside the loop, I use a Script Task. (For the script task, the read-only variable is @TableID and the read/write variables are these 3 flags.)

    In this script task, I initialize these flags to True or False appropriately. So this is what I do:


            ' Enable only the data flow task that matches the current TableID.
            ' Note: the names passed to Dts.Variables must not include the @ prefix.
            If (Dts.Variables("TableID").Value.ToString() = "1") Then
                Dts.Variables("FlagA").Value = False
            Else
                Dts.Variables("FlagA").Value = True
            End If

            If (Dts.Variables("TableID").Value.ToString() = "2") Then
                Dts.Variables("FlagB").Value = False
            Else
                Dts.Variables("FlagB").Value = True
            End If

            If (Dts.Variables("TableID").Value.ToString() = "3") Then
                Dts.Variables("FlagC").Value = False
            Else
                Dts.Variables("FlagC").Value = True
            End If


    So in the 1st iteration (if TableA comes up), @FlagA = False and B & C will be True.
    So the Disable property for DataFlowTaskA will be set to False and for the others it will be set to True. Thus, only DataFlowTaskA will be executed.

    And this action should be repeated for each input table.


    So this is my logic.

    However, only in the 1st iteration (say TableA is selected) does it behave as above, i.e. DataFlowTaskA is executed and DataFlowTaskB & C are skipped. But in the 2nd iteration (say TableB is selected), it again executes DataFlowTaskA and doesn't execute B & C (where it should have executed B and skipped A & C).

    I do set DelayValidation to True for all of these, but it still doesn't work as expected. I even checked the values of all the flags for each iteration, and they seem to get the correct values. But somehow the Disable property in the expression is not behaving as it should.

    Am I missing anything? Do I need to set any other property to make this work?


    I appreciate any help.

    Thanks
    Tuesday, March 18, 2008 10:14 PM
  • I've responded to the new thread that you've opened for this second "Disabled property" question.

     

    It's generally considered a best practice to have just one question per thread, as this makes it easier for future users to find solutions to their problems. If you consider your original question answered, please mark the appropriate responses as the answer and/or as "helpful." Thanks!

    Wednesday, March 19, 2008 12:34 AM