Azure Databricks Multi-task jobs and workflows. Simulate on completion status


We are currently investigating how to effectively adopt Databricks' latest feature for task orchestration: Multi-task Jobs.

The default behaviour is that a downstream task is not executed if any of its upstream tasks has failed.

So the question is: is it currently possible to have an "on completion" dependency (similar to those in Azure Data Factory or SQL Server Integration Services, SSIS) so that, regardless of whether a task succeeds or fails, the workflow continues and the next tasks are executed?

CodePudding user response:

Right now it's not possible natively, but you can surround your task's code with try/catch. If an error is thrown, the catch block stops the exception from propagating further, so the task is not marked as failed and downstream tasks still run.
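A minimal sketch of this workaround in a Python notebook task. The function names (`risky_step`, `run_task`) are illustrative, not part of any Databricks API; the idea is simply that the task body swallows its own exceptions so the task finishes with a success status and the job continues.

```python
def risky_step():
    # Placeholder for the real task logic; assumed here to fail.
    raise ValueError("simulated task failure")

def run_task(step):
    """Run a task step; swallow any exception so the job continues.

    The broad `except Exception` is deliberate: the whole point is that
    nothing propagates out of the task, so Databricks never marks it failed.
    """
    try:
        step()
        return "success"
    except Exception as exc:
        # Log the failure so it is still visible in the task's output.
        print(f"Step failed internally, continuing workflow: {exc}")
        return "failed_but_continued"

status = run_task(risky_step)
print(status)
```

The trade-off is that the job UI shows the task as succeeded even when its logic failed, so you have to record the real outcome yourself (for example in the logs, a table, or a status value) if downstream tasks need to react to it.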
