Databricks pipelines give an error in ADF, but the jobs on Databricks succeed


  • Hi all,

I have set up a few pipelines on ADF that call Databricks notebooks. They generally run well; however, sometimes I see something like this on ADF:

"errorCode": "3202",

    "message": "Failed to construct cluster settings due to service unavailability. Please retry later.",

    "failureType": "UserError",

If I log onto the Databricks page and go to Clusters => Job Clusters, I can see that the notebook's Status is Succeeded. Therefore, I don't understand why ADF reports this error while Databricks shows the run as successful. Is this an error on the ADF side? Any insight would be appreciated.



    Monday, August 20, 2018 2:15 PM

All replies

  • To isolate the error, can you please let us know how the failed jobs differ from the successful ones? Were you able to derive any insights?
    Thursday, August 23, 2018 9:10 PM
  • Hi,

    Maybe I did not explain myself clearly. The jobs appear as failed in Data Factory, but when I check in Databricks these jobs are marked as Successful, and all the cells in the notebook ran fine. Therefore, my guess is that at some point Data Factory disconnects from Databricks, and in the end Data Factory does not know that the Databricks job started/finished.
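
    Since the error message itself says "Please retry later", one common mitigation for this kind of transient failure is to configure a retry policy on the Databricks Notebook activity in the ADF pipeline JSON. A minimal sketch (the activity name, notebook path, and linked service name below are placeholders, not from the original thread):

    ```json
    {
      "name": "RunDatabricksNotebook",
      "type": "DatabricksNotebook",
      "policy": {
        "retry": 3,
        "retryIntervalInSeconds": 120,
        "timeout": "0.02:00:00"
      },
      "typeProperties": {
        "notebookPath": "/Shared/example-notebook"
      },
      "linkedServiceName": {
        "referenceName": "AzureDatabricksLinkedService",
        "type": "LinkedServiceReference"
      }
    }
    ```

    Note that retrying has a side effect in this scenario: if the notebook actually succeeded on the Databricks side, a retry would run it again, so the notebook should be idempotent before relying on this.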



    Tuesday, September 4, 2018 2:20 PM