How to handle column names in Azure Data Flow

  • Question

  • Hi, I would like to know how I can handle column names in Data Flows. For example, if I have a source defined as "select * from $Tablename", where the Tablename parameter is defined in the Data Flow, and I then add a Join where I want to join my Column A with a column coming from another source in the Data Flow, how can I do that?

    Visually:

    Pipeline:

    ForEach activity (getting the table name) -----> MyDataflow activity

    MyDataflow (the Data Flow has a parameter, tablename, which gets the table name from the ForEach activity):

    Source1 (a SQL table with query "select * from $tablename") ----->
                                                                        Join ---> Sink
    Source2 (another SQL table defined with columns) ----->

    Join (I use a Left Outer Join, and it asks for left and right streams): in the right stream I select a column name from Source2, but in the left stream nothing comes up.
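    For reference, the Source1 query relies on the Data Flow expression language's string interpolation, roughly like this (tablename being the Data Flow parameter; the exact quoting here is an assumption):

        "select * from {$tablename}"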

    What am I doing wrong?

    Thanks                                  

       


    zzzSharePoint

    Friday, January 3, 2020 12:26 PM

Answers

  • As called out by mkro, we need to use a Derived Column and then join the two sources (it being dynamic, we pass the table name and the column name on which we need to join to the Data Flow).


    Thanks Himanshu

    Tuesday, January 7, 2020 10:28 PM

All replies

  • My sincere apologies for the delay in response. I did try to use the dynamic expression to create the join, but it's not working. I have asked the internal team for help and am expecting a reply tomorrow morning. I will update the thread once I hear from them.

    On another note, did the incremental pull work, or are you still struggling with that? Let me know and I will try to work on that with you as well.

     

    Thanks Himanshu

    Tuesday, January 7, 2020 6:37 AM
  • Thank you for finally coming back on this. The incremental pull with a stored procedure worked and is fine now. A Data Flow would have been the ideal way to go forward with the incremental upload. Waiting for the internal team's response.

    Thanks

    zzzSharePoint

    Tuesday, January 7, 2020 11:14 AM
  • You can use dynamic column names for Joins in ADF Data Flows. If you pass in the string name of a column as a parameter, you first need to look up that column name using a Derived Column.

    Add a Derived Column before your Join and give the new column a name, e.g. "myJoinCol". For the expression, enter byName($colParam), where colParam is the string name of the column you wish to use in the Join, passed into the Data Flow from the pipeline as a parameter.

    Now you can create your Join using myJoinCol.
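
    Putting the pattern together, here is a rough Data Flow Script sketch (the stream names, the Source2 column CustomerId, and the source options are illustrative assumptions; byName() returns an untyped value, so a cast such as toString() is typically needed before it can be used in a join condition):

        parameters{
            tablename as string,
            colParam as string
        }
        source(allowSchemaDrift: true, validateSchema: false) ~> Source1
        source(allowSchemaDrift: true, validateSchema: false) ~> Source2
        Source1 derive(myJoinCol = toString(byName($colParam))) ~> DerivedColumn1
        DerivedColumn1, Source2 join(myJoinCol == CustomerId,
            joinType: 'left',
            broadcast: 'auto') ~> Join1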

    Tuesday, January 7, 2020 9:37 PM
  • As called out by mkro, we need to use a Derived Column and then join the two sources (it being dynamic, we pass the table name and the column name on which we need to join to the Data Flow).


    Thanks Himanshu

    Tuesday, January 7, 2020 10:28 PM
  • Hi Himanshu, I did exactly as described above but am now receiving the error below:





    message": "{\"StatusCode\":\"DFExecutorUserError\",\"Message\":\"Job '2f1ef016-9a2c-414f-b94f-cf7977b94eaa failed due to reason: DF-SYS-01 at Source 'SqlStagingData': com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'table_name'.\\ncom.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'table_name'.\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:258)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1535)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:467)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:409)\\n\\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7151)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2478)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:219)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:199)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:331)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:61)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)\\n\\tat org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:343)\\n\\tat org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:307)\\n\\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:293)\\n\\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)\\n\\tat org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:322)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader$$anonfun$read$5.apply(JDBCStore.scala:59)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader$$anonfun$read$5.apply(JDBCStore.scala:55)\\n\\tat scala.util.Try$.apply(Try.scala:192)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader.read(JDBCStore.scala:55)\\n\\tat com.microsoft.dataflow.transformers.TableLikeReader.read(TableStore.scala:82)\\n\\tat com.microsoft.dataflow.transformers.SourcePlanner$.physicalPlanInternal(SourceV2.scala:277)\\n\\tat com.microsoft.dataflow.transformers.SourcePlanner$.physicalPlan(SourceV2.scala:260)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$16.apply(FlowRunner.scala:230)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$16.apply(FlowRunner.scala:216)\\n\\tat scala.collection.Iterator$$anon$11.next(Iterator.scala:410)\\n\\tat scala.collection.TraversableOnce$class.collectFirst(TraversableOnce.scala:145)\\n\\tat scala.collection.SeqViewLike$AbstractTransformed.collectFirst(SeqViewLike.scala:37)\\n\\tat com.microsoft.dataflow.FlowRunner$.com$microsoft$dataflow$FlowRunner$$runner(FlowRunner.scala:309)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$runner$2.apply(FlowRunner.scala:178)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$runner$2.apply(FlowRunner.scala:173)\\n\\tat scala.util.Success.flatMap(Try.scala:231)\\n\\tat com.microsoft.dataflow.FlowRunner$.runner(FlowRunner.scala:173)\\n\\tat 
com.microsoft.dataflow.DataflowExecutor$$anonfun$6$$anonfun$apply$3$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6$$anonfun$apply$9$$anonfun$apply$10$$anonfun$apply$11$$anonfun$7.apply(DataflowExecutor.scala:119)\\n\\tat com.microsoft.dataflow.DataflowExecutor$$anonfun$6$$anonfun$apply$3$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6$$anonfun$apply$9$$anonfun$apply$10$$anonfun$apply$11$$anonfun$7.apply(DataflowExecutor.scala:106)\\n\\tat com.microsoft.dataflow.DataflowJobFuture$$anonfun$flowCode$1.apply(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture$$anonfun$flowCode$1.apply(DataflowJobFuture.scala:71)\\n\\tat scala.Option.map(Option.scala:146)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.flowCode$lzycompute(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.flowCode(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.addListener(DataflowJobFuture.scala:420)\\n\\tat com.microsoft.datafactory.dataflow.AppManager$.processActivityRunRequest(AppManager.scala:99)\\n\\tat com.microsoft.datafactory.dataflow.AppManager$.main(AppManager.scala:52)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command--1:1)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw$$iw.<init>(command--1:44)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw.<init>(command--1:46)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw.<init>(command--1:48)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw.<init>(command--1:50)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw.<init>(command--1:52)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read.<init>(command--1:54)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$.<init>(command--1:58)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$.<clinit>(command--1)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval$.$print$lzycompute(<notebook>:7)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval$.$print(<notebook>:6)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval.$print(<notebook>)\\n\\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\\n\\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\\n\\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\\n\\tat java.lang.reflect.Method.invoke(Method.java:498)\\n\\tat scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)\\n\\tat scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)\\n\\tat scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)\\n\\tat scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)\\n\\tat scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)\\n\\tat scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)\\n\\tat com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:197)\\n\\tat 
com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:679)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:632)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:368)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:345)\\n\\tat com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)\\n\\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)\\n\\tat com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)\\n\\tat com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:271)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:345)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)\\n\\tat scala.util.Try$.apply(Try.scala:192)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)\\n\\tat java.lang.Thread.run(Thread.java:748)\\n\",\"Details\":\"Job '2f1ef016-9a2c-414f-b94f-cf7977b94eaa failed due to reason: DF-SYS-01 at Source 'SqlStagingData': com.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'table_name'.\\ncom.microsoft.sqlserver.jdbc.SQLServerException: Invalid object name 'table_name'.\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerException.makeFromDatabaseError(SQLServerException.java:258)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.getNextResult(SQLServerStatement.java:1535)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.doExecutePreparedStatement(SQLServerPreparedStatement.java:467)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement$PrepStmtExecCmd.doExecute(SQLServerPreparedStatement.java:409)\\n\\tat com.microsoft.sqlserver.jdbc.TDSCommand.execute(IOBuffer.java:7151)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerConnection.executeCommand(SQLServerConnection.java:2478)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.executeCommand(SQLServerStatement.java:219)\\n\\tat com.microsoft.sqlserver.jdbc.SQLServerStatement.executeStatement(SQLServerStatement.java:199)\\n\\tat 
com.microsoft.sqlserver.jdbc.SQLServerPreparedStatement.executeQuery(SQLServerPreparedStatement.java:331)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:61)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation$.getSchema(JDBCRelation.scala:210)\\n\\tat org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:35)\\n\\tat org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:343)\\n\\tat org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:307)\\n\\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:293)\\n\\tat org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:203)\\n\\tat org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:322)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader$$anonfun$read$5.apply(JDBCStore.scala:59)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader$$anonfun$read$5.apply(JDBCStore.scala:55)\\n\\tat scala.util.Try$.apply(Try.scala:192)\\n\\tat com.microsoft.dataflow.transformers.store.JDBCReader.read(JDBCStore.scala:55)\\n\\tat com.microsoft.dataflow.transformers.TableLikeReader.read(TableStore.scala:82)\\n\\tat com.microsoft.dataflow.transformers.SourcePlanner$.physicalPlanInternal(SourceV2.scala:277)\\n\\tat com.microsoft.dataflow.transformers.SourcePlanner$.physicalPlan(SourceV2.scala:260)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$16.apply(FlowRunner.scala:230)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$16.apply(FlowRunner.scala:216)\\n\\tat scala.collection.Iterator$$anon$11.next(Iterator.scala:410)\\n\\tat scala.collection.TraversableOnce$class.collectFirst(TraversableOnce.scala:145)\\n\\tat scala.collection.SeqViewLike$AbstractTransformed.collectFirst(SeqViewLike.scala:37)\\n\\tat com.microsoft.dataflow.FlowRunner$.com$microsoft$dataflow$FlowRunner$$runner(FlowRunner.scala:309)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$runner$2.apply(FlowRunner.scala:178)\\n\\tat com.microsoft.dataflow.FlowRunner$$anonfun$runner$2.apply(FlowRunner.scala:173)\\n\\tat scala.util.Success.flatMap(Try.scala:231)\\n\\tat com.microsoft.dataflow.FlowRunner$.runner(FlowRunner.scala:173)\\n\\tat com.microsoft.dataflow.DataflowExecutor$$anonfun$6$$anonfun$apply$3$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6$$anonfun$apply$9$$anonfun$apply$10$$anonfun$apply$11$$anonfun$7.apply(DataflowExecutor.scala:119)\\n\\tat com.microsoft.dataflow.DataflowExecutor$$anonfun$6$$anonfun$apply$3$$anonfun$apply$4$$anonfun$apply$5$$anonfun$apply$6$$anonfun$apply$9$$anonfun$apply$10$$anonfun$apply$11$$anonfun$7.apply(DataflowExecutor.scala:106)\\n\\tat com.microsoft.dataflow.DataflowJobFuture$$anonfun$flowCode$1.apply(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture$$anonfun$flowCode$1.apply(DataflowJobFuture.scala:71)\\n\\tat scala.Option.map(Option.scala:146)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.flowCode$lzycompute(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.flowCode(DataflowJobFuture.scala:71)\\n\\tat com.microsoft.dataflow.DataflowJobFuture.addListener(DataflowJobFuture.scala:420)\\n\\tat com.microsoft.datafactory.dataflow.AppManager$.processActivityRunRequest(AppManager.scala:99)\\n\\tat com.microsoft.datafactory.dataflow.AppManager$.main(AppManager.scala:52)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw$$iw$$iw.<init>(command--1:1)\\n\\tat 
line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw$$iw.<init>(command--1:44)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw$$iw.<init>(command--1:46)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw$$iw.<init>(command--1:48)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw$$iw.<init>(command--1:50)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$$iw.<init>(command--1:52)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read.<init>(command--1:54)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$.<init>(command--1:58)\\n\\tat line6177713df9ab462ab679252689ebba0525.$read$.<clinit>(command--1)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval$.$print$lzycompute(<notebook>:7)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval$.$print(<notebook>:6)\\n\\tat line6177713df9ab462ab679252689ebba0525.$eval.$print(<notebook>)\\n\\tat sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)\\n\\tat sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)\\n\\tat sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)\\n\\tat java.lang.reflect.Method.invoke(Method.java:498)\\n\\tat scala.tools.nsc.interpreter.IMain$ReadEvalPrint.call(IMain.scala:793)\\n\\tat scala.tools.nsc.interpreter.IMain$Request.loadAndRun(IMain.scala:1054)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:645)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest$$anonfun$loadAndRunReq$1.apply(IMain.scala:644)\\n\\tat scala.reflect.internal.util.ScalaClassLoader$class.asContext(ScalaClassLoader.scala:31)\\n\\tat scala.reflect.internal.util.AbstractFileClassLoader.asContext(AbstractFileClassLoader.scala:19)\\n\\tat scala.tools.nsc.interpreter.IMain$WrappedRequest.loadAndRunReq(IMain.scala:644)\\n\\tat scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:576)\\n\\tat scala.tools.nsc.interpreter.IMain.interpret(IMain.scala:572)\\n\\tat com.databricks.backend.daemon.driver.DriverILoop.execute(DriverILoop.scala:215)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply$mcV$sp(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal$$anonfun$repl$1.apply(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$TrapExitInternal$.trapExit(DriverLocal.scala:679)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$TrapExit$.apply(DriverLocal.scala:632)\\n\\tat com.databricks.backend.daemon.driver.ScalaDriverLocal.repl(ScalaDriverLocal.scala:197)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:368)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal$$anonfun$execute$8.apply(DriverLocal.scala:345)\\n\\tat com.databricks.logging.UsageLogging$$anonfun$withAttributionContext$1.apply(UsageLogging.scala:238)\\n\\tat scala.util.DynamicVariable.withValue(DynamicVariable.scala:58)\\n\\tat com.databricks.logging.UsageLogging$class.withAttributionContext(UsageLogging.scala:233)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionContext(DriverLocal.scala:48)\\n\\tat com.databricks.logging.UsageLogging$class.withAttributionTags(UsageLogging.scala:271)\\n\\tat com.databricks.backend.daemon.driver.DriverLocal.withAttributionTags(DriverLocal.scala:48)\\n\\tat 
com.databricks.backend.daemon.driver.DriverLocal.execute(DriverLocal.scala:345)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper$$anonfun$tryExecutingCommand$2.apply(DriverWrapper.scala:644)\\n\\tat scala.util.Try$.apply(Try.scala:192)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.tryExecutingCommand(DriverWrapper.scala:639)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.getCommandOutputAndError(DriverWrapper.scala:485)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.executeCommand(DriverWrapper.scala:597)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.runInnerLoop(DriverWrapper.scala:390)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.runInner(DriverWrapper.scala:337)\\n\\tat com.databricks.backend.daemon.driver.DriverWrapper.run(DriverWrapper.scala:219)\\n\\tat java.lang.Thread.run(Thread.java:748)\\n\"}", "failureType": "UserError", "target": "dataflow1" }

    zzzSharePoint

    Wednesday, January 8, 2020 6:52 PM
  • Good that we at least have one less error to worry about :)

    It is complaining about the "invalid object name 'table_name'" .

    A few things to look for:

    1. Do you have that table name at the Source/Sink?

    2. Is there any place where you are passing that name to any of the variables? (See the example below.)
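
    For example, if the Data Flow's tablename parameter ends up bound to a literal string, the source will query a table literally named table_name. Inside a ForEach, the parameter would normally be populated with a pipeline expression along these lines (assuming, as the error suggests, that each ForEach item exposes a table_name property):

        @item().table_name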



    Thanks Himanshu

    Wednesday, January 8, 2020 7:48 PM
  • Here is where my table_name comes into the picture:


    My ForEach activity has the following output:

    Inside the ForEach I have my Data Flow activity defined with the following parameters and settings:


    Source and Sink:

    Now my Derived Column activity:

    And here are my Join and Filter activities:

    And then I have Select and Sink defined, where I don't do anything. Any suggestion on what I should modify?


    zzzSharePoint

    Wednesday, January 8, 2020 11:03 PM
  • Finally, I changed mytablename to item().table_name and it worked :). Thanks for all the help.

    zzzSharePoint

    Wednesday, January 8, 2020 11:39 PM