Pass through exception in PySpark


I am using a Python notebook in Databricks that calls another notebook from a different folder.

The first notebook should throw an exception if the second notebook fails during execution, but this does not happen. It only throws an exception if the first notebook itself fails (e.g. a wrong path).

How can I pass the exception from the second notebook to the first one?

First notebook:

try:
  dbutils.notebook.run("../01_load/main_exec", 0)
except Exception as e:
  print("01 failed")

Second notebook:

try:
  dbutils.notebook.run("../02_wbs/step_exec", 0)
except Exception as e:
  print("02 failed")

If the first task fails, I want my first notebook to return "01 failed". But if the second notebook fails, I want my first notebook to return "02 failed".

CodePudding user response:

If you raise an exception in notebook 2, then dbutils.notebook.run("notebook_2", 0) will throw an error in the calling notebook as well.

# notebook_2
raise Exception("Error")

# notebook_1
try:
  dbutils.notebook.run("notebook_2", 0)
except Exception as e:
  # dbutils.notebook.run raises because notebook_2 raised, so we land here
  raise Exception("Notebook 2 failed", e)