Azure Data Factory copy activity passing the PipelineRunId to a destination column

  • Question

  • Hello,

    I'm facing a technical issue with the copy activity in Data Factory. I need to pass the PipelineRunId as a parameter to my copy activity so that it can be written as a column in my destination table. How can I do it?

    Thank you for help

    Friday, October 11, 2019 9:31 AM

Answers

  • A Data Flow can also be used for source-to-sink scenarios like this. It is a misconception that Data Flows are limited to transformations only.
    • Proposed as answer by dataflowuser Monday, October 14, 2019 4:05 PM
    • Marked as answer by BERKAYA Tuesday, October 15, 2019 12:05 PM
    Monday, October 14, 2019 4:05 PM

All replies

  • Hi BERKAYA,

    Thank you for your query. Could you please provide more details on your scenario? 

    • What are source & sink data stores of your copy activity?
    • How many columns does your source and sink data stores have?

    Your requirement sounds achievable using a Data Flow in Data Factory instead of a Copy Activity. Would you be willing to use a Data Flow instead of a Copy Activity? 


    Thank you

    If a post helps to resolve your issue, please click "Mark as Answer" on that post and/or click the "Vote as helpful" button. By marking a post as Answered and/or Helpful, you help others find the answer faster.

    Saturday, October 12, 2019 12:00 AM
  • Hi,

    Thank you for your reply. My source is a JSON file and my destination is an Azure SQL table. I need to have the PipelineRunId as a column in my table so I can match which pipeline execution loaded each JSON file, because I have several further processing steps to run on it. I know this is achievable with a Data Flow, but in my case I am copying the data as-is, with no transformation at this stage; I just need to trace the pipeline's execution ID because I have a table that tracks the source filename, the pipeline start date, the total record count…

    Thanks

    Monday, October 14, 2019 1:48 PM
  • A Data Flow can also be used for source-to-sink scenarios like this. It is a misconception that Data Flows are limited to transformations only.
    • Proposed as answer by dataflowuser Monday, October 14, 2019 4:05 PM
    • Marked as answer by BERKAYA Tuesday, October 15, 2019 12:05 PM
    Monday, October 14, 2019 4:05 PM
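
  • To illustrate the Data Flow approach the marked answer describes, here is a minimal sketch. All names below (the ExecuteCopyJsonToSqlFlow activity, the CopyJsonToSqlFlow data flow, and the RunId parameter) are hypothetical, not from this thread. The idea is that the Execute Data Flow activity passes the pipeline's run ID into a string Data Flow parameter via the pipeline expression @{pipeline().RunId}, wrapped in quotes so it evaluates to a string literal in the data flow expression language:

    ```json
    {
      "name": "ExecuteCopyJsonToSqlFlow",
      "type": "ExecuteDataFlow",
      "typeProperties": {
        "dataflow": {
          "referenceName": "CopyJsonToSqlFlow",
          "type": "DataFlowReference"
        },
        "parameters": {
          "RunId": {
            "value": "'@{pipeline().RunId}'",
            "type": "Expression"
          }
        }
      }
    }
    ```

    Inside the data flow itself, a Derived Column transformation between the JSON source and the SQL sink can then add a PipelineRunId column with the expression $RunId, so the value lands in the destination table alongside the copied data.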