Create Folders in Azure Data Lake Store as per Input File Name

  • Question

  • Hi All,

    I have an input file named "EMP_20170720.csv". I need to copy this file from an SFTP server to Azure Data Lake Store with Data Factory, using the Copy activity.

    So I want to create the folders in Azure Data Lake Store in the below hierarchy:

    EMP

         2017

             07

                20

    Monday, August 21, 2017 9:16 AM

All replies

  • You can use the partitionedBy property of your input/output dataset to do so: https://docs.microsoft.com/en-us/azure/data-factory/data-factory-ftp-connector

    However, this relies on the SliceStart variable and can only be done for the current day (the date of the slice being processed).

    For the "EMP_" part, I am afraid this cannot be done automatically; you would need to create one input and one output dataset for each file type.

    hth,
    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Tuesday, August 22, 2017 9:26 AM
  • We can't use the current date/time (DateTime.Now); we need to generate the folder on ADLS based on the datetime value from the filename.
    Thursday, August 24, 2017 4:20 PM
  • I am afraid this does not work with ADF out of the box, then.
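
    The filename-to-folder mapping itself could be done outside of ADF, e.g. in a custom activity or a small pre-processing step that computes the target path before the copy runs. A minimal Python sketch of that mapping (the function name is just for illustration):

        from datetime import datetime
        from pathlib import PurePosixPath

        def folder_for(filename: str) -> str:
            """Map e.g. 'EMP_20170720.csv' to 'EMP/2017/07/20'."""
            stem = PurePosixPath(filename).stem        # 'EMP_20170720'
            prefix, datepart = stem.rsplit("_", 1)     # ['EMP', '20170720']
            d = datetime.strptime(datepart, "%Y%m%d")  # validates the date part
            return f"{prefix}/{d:%Y}/{d:%m}/{d:%d}"

        print(folder_for("EMP_20170720.csv"))          # -> EMP/2017/07/20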

    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Thursday, August 24, 2017 4:56 PM