dynamic source and dynamic destination

  • Question

  • I have a requirement which I have partly accomplished, but could not get through completely.

     

    I have a file which comes in a standard format, ending with a date and a sequence number.

     

    Suppose the file name is abc_yyyymmdd_01 for the first copy; if it is copied more than once, the sequence number changes to 02, 03, and so on.

     

    I then need to transform it into a new comma-delimited destination file named abc_yyyymmdd.txt, plus a record-count file named abc_count_yyyymmdd.txt, and move them to a designated folder. The source file is then moved to an archive folder.
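    The naming scheme above can be sketched in Python purely to illustrate the logic (the real work happens inside the SSIS package; the file names are taken from the example above):

    ```python
    import re

    def derive_names(source_name):
        """Split a source name like 'abc_20080415_01.txt' into its parts
        and build the two destination names described above."""
        m = re.match(r"(?P<base>.+)_(?P<date>\d{8})_(?P<seq>\d{2})(\.txt)?$", source_name)
        if m is None:
            raise ValueError("unexpected file name: " + source_name)
        base, date = m.group("base"), m.group("date")
        data_file = f"{base}_{date}.txt"           # abc_yyyymmdd.txt
        count_file = f"{base}_count_{date}.txt"    # abc_count_yyyymmdd.txt
        return data_file, count_file
    ```

    The same parse also yields the sequence number, which matters later in the thread.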

     

     

    The approach I have taken is:

     

    Script Task (select source file) --------------------> Data Flow Task ------------------------------------------> Script Task (destination file)

     

    Data Flow Task -------------------------> does the count and copies in delimited format

     

     

     

    What is happening here is: I can take a regular source file, convert it to a delimited destination file --------> and move it to the destination folder with a Script Task.

     

    But I cannot get the dynamic pick of the source file to work.

     

     

    Please advise with your comments or any solution you have.

     

     

    Tuesday, April 15, 2008 5:23 PM

Answers

  • You can use a File System Task to move files from one folder to another. I think you'd want it inside the For Each loop.

     

    For the sequence number, you'd need to store it in a variable, and increment it using a script inside the loop.
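    The equivalent of that File System Task move, sketched in Python just to show the intent (the paths are hypothetical; inside the package the File System Task or Script Task does this once per loop iteration):

    ```python
    import os
    import shutil

    def archive_source(source_path, archive_folder):
        """Move a processed source file into the archive folder, keeping
        the original file name -- what the File System Task would do."""
        os.makedirs(archive_folder, exist_ok=True)
        target = os.path.join(archive_folder, os.path.basename(source_path))
        shutil.move(source_path, target)
        return target
    ```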

     

    Wednesday, April 16, 2008 1:52 PM

All replies

  • Not sure that I understand your problem, but I'll try.

     

    You should be able to use an expression on the ConnectionString property of the Flat File connection manager to have it update dynamically. It sounds like you need a For Each Loop container that is set to loop through the files in the source directory. The loop can set a variable to the name of the current file, and you can use the expression to update the ConnectionString property with the value of this variable.
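    The enumeration this describes can be sketched in Python for clarity (the For Each Loop container does this natively in SSIS; the folder and the file-name pattern here are hypothetical):

    ```python
    import glob
    import os

    def files_to_process(source_folder, pattern="abc_*_*.txt"):
        """Enumerate the source files the For Each Loop would hand to the
        data flow one at a time via the file-name variable."""
        return sorted(glob.glob(os.path.join(source_folder, pattern)))

    # Inside the loop body, the current file name is assigned to the
    # package variable that the ConnectionString expression reads.
    ```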

     

    Tuesday, April 15, 2008 5:49 PM
  • How would the control flow be set up?

     

    Will I have the Script Task and Data Flow Task outside the For Each Loop container, or should the Data Flow Task be within the For Each Loop container?

     

    And how about the Script Task to move the file? I am a little confused here, but if you could help me in depth, I would have some sense of direction.

     

    Basically, the scenario is this:

     

    I have a file which comes in as filename_date_01, and that file needs to be picked up every day, transformed, and sent to the destination; the destination file is then moved to a folder with a new name, newname_dateprocessed.

     

     

     

     

     

     

     

    Tuesday, April 15, 2008 5:58 PM
  • The tasks that you want to repeat for each source file should be contained inside the For Each - so the data flow would certainly need to be inside, as well as the script task to move the file.

     

    If you only have a single file coming in, there is no need for the For Each. If you have multiple files to process, then you do need the For Each.

     

    Tuesday, April 15, 2008 6:06 PM
  • One more question while I am getting close to understanding this.

     

    Since the source file is dynamic each day, will the ConnectionString expression on the source Flat File connection be able to access the variable and select the file based on the date?

     

    And for the dynamic target, the Script Task should take care of it. But do I also need to include a File System Task, or is there no need for it? If not, basically my package will be:

     

    a For Each Loop container containing the Data Flow Task and a Script Task to move the file,

     

    and the Data Flow Task would include source and destination .txt files, where the expression on the source would be set to locate the dynamic source file.

     

    Please advise if I got that right.

     

    thank you

     

     

     

    Tuesday, April 15, 2008 6:20 PM
  • That sounds correct. If you already have the script task doing the file movement, there is no need to replace it with a File System task (unless you just want to eliminate the script).

     

    Tuesday, April 15, 2008 6:50 PM
  • I got the feel of it. So basically, I will be defining package variables for the Script Task and the source Flat File connection, and that should get me going.

     

     

    thanks for the same

     

    Tuesday, April 15, 2008 6:59 PM
  • While we are on the subject, I just wanted to ask you about the sequence number. As I mentioned, the sequence number is a kind of extension to the file name, in case there is more than one FTP download.

    for example

     

    The file names which come in via FTP are A_20081504_01.txt, A_20081504_02.txt. How can I create the same dynamic destination file names? I have reached the point where it creates the date, but the sequence number does not get generated; it only reaches 0 and does not proceed further.

     

    And the other thing is, once all this processing is done, the original source file should be moved to the archive folder. So do I create a Script Task or a File System Task, and should that be outside the container or within the For Each Loop container? As I mentioned, I am already using one Script Task, which is inside the For Each Loop container, along with the Data Flow Task.

     

    please advise

    Wednesday, April 16, 2008 6:36 AM
  •  

    I have been trying to use the Script Task for the incremental sequence number, but for some reason I could not make it work. Would it be possible to get a sample code snippet, or an example that can help in my situation?

     

    please advise

    Thursday, April 17, 2008 4:20 AM
  •  RSR001 wrote:

     

    I have been trying to use the Script Task for the incremental sequence number, but for some reason I could not make it work. Would it be possible to get a sample code snippet, or an example that can help in my situation?

     

    please advise

     

    Generating Surrogate Keys

    (http://www.sqlis.com/37.aspx)

     

    -Jamie

     

    Thursday, April 17, 2008 5:51 AM
  •  

    Jamie

     

    I am looking for a sequence number in the file name, not at record level.

     

    I have file names which come in the form A_todaysdate_01, A_todaysdate_02, A_todaysdate_03, which reflects how many times the file has come in during a day, and I would create dynamic targets A_todaysdate_01, 02, 03.

     

    My challenge is that I can create a file name with A_todaysdate, but I am having trouble creating the sequence numbers 01, 02, 03, which are based on the name of the incoming source file, or on whether the file already exists in the destination folder.
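    The destination-folder lookup described here can be sketched in Python (illustrative only; in the package this logic would live in the Script Task, and the base name and date come from the example above):

    ```python
    import os
    import re

    def next_sequence(dest_folder, base, date):
        """Scan the destination folder for files like base_date_NN.txt and
        return the next two-digit sequence number ('01' if none exist)."""
        pattern = re.compile(re.escape(base) + "_" + re.escape(date) + r"_(\d{2})\.txt$")
        highest = 0
        for name in os.listdir(dest_folder):
            m = pattern.match(name)
            if m:
                highest = max(highest, int(m.group(1)))
        return f"{highest + 1:02d}"
    ```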

     

    please advise

     

    Thursday, April 17, 2008 1:57 PM
  • Actually, I have failed to get the sequence number for the file name. Do you think there is any example or script I can refer to in order to accomplish this? Please advise.

     

    I have one more question. In fact, I have to generate another file with the count of records, but again with dynamic names. So would I be using a Derived Column, or would the flat file be able to count the records and move them into a dynamically named file? More importantly, do I have to create a separate Script Task to accomplish that, or would the same Script Task be able to do it?

     

    Please help

    Saturday, April 19, 2008 2:12 PM
  • To create a sequence number, you need a variable in the package. You can use a Script Task to increment the variable as needed, with the following code:

     

    Code Snippet

    Dim vars As Variables

    ' Lock the package variable for writing, increment it, then release the lock.
    Dts.VariableDispenser.LockOneForWrite("YourCounter", vars)
    vars("YourCounter").Value = CInt(vars("YourCounter").Value) + 1
    vars.Unlock()

     

     

    To create a separate file with the record count, you could use a Multicast in the data flow that connects to your normal processing and to an Aggregate transform that counts the rows. Then send that output to a second Flat File destination. For some other ideas, you could see this post (http://agilebi.com/cs/blogs/jwelch/archive/2008/02/08/adding-headers-and-footers-to-flat-files.aspx). It's on a slightly different topic, but you could adapt the examples to produce separate files.
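    The Multicast-plus-Aggregate idea amounts to one pass over the rows that writes both files; a minimal sketch in Python (file names are hypothetical; in SSIS the data flow does this, not custom code):

    ```python
    def write_data_and_count(rows, data_path, count_path):
        """Write each row to the comma-delimited data file while counting,
        then write the total to the count file -- the same split the
        Multicast/Aggregate combination performs in the data flow."""
        count = 0
        with open(data_path, "w") as data_file:
            for row in rows:
                data_file.write(",".join(row) + "\n")
                count += 1
        with open(count_path, "w") as count_file:
            count_file.write(str(count) + "\n")
        return count
    ```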

    Tuesday, April 22, 2008 2:01 AM