Data Flow Dynamic File Name - Recursively Process Files

  • Question

  • Is it possible to iterate through sub-folders, so that rather than processing hourly we can process a day's worth of data? This is straightforward in a Copy Data activity, since there is an option to process files recursively; however, in the data flow that option isn't available.

    I have tried the expression below, but it doesn't read through the daily folders:

    @concat('ping/',formatDateTime(utcnow(),'yyyy'),'/',formatDateTime(utcnow(),'MM'),'/',formatDateTime(utcnow(),'dd'))  

    Thanks

    Thursday, June 6, 2019 10:21 AM

All replies

  • Hi,

    You can use a wildcard path; it will process all the files that match the pattern, but all the files must follow the same schema.

    For example, /**/movies.csv will match every movies.csv file in the sub-folders. To use a wildcard path, you need to set the container correctly in the dataset, and then set the wildcard path relative to that container.
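
    For instance, for the year/month/day layout in your expression, a day-level pattern along these lines should match every CSV regardless of the hour sub-folder (a sketch, assuming the container is set to the storage root and the hourly files are CSVs):

    ping/2019/06/06/*/*.csv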

    Thanks

    Mingyang

    Friday, June 7, 2019 12:49 AM
  • Thanks. Do all the files have to have the same name?

    I have tried /**/*.csv; however, it falls over.

    Thanks

    Monday, June 10, 2019 3:20 PM
  • Rippll, could you please tell me more about how it falls over? Is there an error message, or does something else happen?
    Monday, June 10, 2019 4:30 PM
    Path /ping/2019/06/*/*/ does not resolve to any file(s). Please make sure the file/folder exists and is not hidden.

    is the error.

    @concat('ping/',formatDateTime(utcnow(),'yyyy'),'/',formatDateTime(utcnow(),'MM'),'/*/*')

    is the dynamic path.

    The actual path I am trying to access is:

    ping/2019/06/08/12/test.csv


    Thanks

    Tuesday, June 11, 2019 9:39 AM
  • Hi Rippll,

    You need to set the wildcard path in the Data Flow source settings; the dynamic path from the dataset needs to resolve to a real, existing file.
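
    As a rough sketch (the source name and option values here are illustrative, and the exact generated script may differ), the Data Flow source script would then carry the wildcard, while the dataset keeps pointing at one concrete file such as ping/2019/06/08/12/test.csv:

    source(
        allowSchemaDrift: true,
        validateSchema: false,
        wildcardPaths:['ping/2019/06/08/*/*.csv']) ~> PingSource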

    Thanks

    Mingyang

    Tuesday, June 11, 2019 3:43 PM