Spark SQL How to Allow Some Tasks To Fail but Overall Job Still Succeed?


I have a Spark job where a small minority of the tasks keep failing, which causes the whole job to fail, so nothing is written to the table where the results are supposed to go. Is there a way to get Spark to tolerate a few failed tasks and still write the output from the successful ones? I don't actually need 100% of the data to get through, so I'm fine with a few tasks failing.

CodePudding user response:

No, that is not possible; it is not part of Spark's design. Spark's fault tolerance is retry-based: a failed task is re-attempted, and if it exhausts its retries the stage fails, which fails the job. There is no partial-success mode that drops the failed tasks and keeps the rest. No is also an answer.
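
A minimal Scala sketch, assuming a standard SparkSession setup, to illustrate the point: the only related knob is the retry budget (`spark.task.maxFailures`), which raises how many attempts a task gets before the job fails, not whether the job can succeed with some tasks permanently failed.

```scala
import org.apache.spark.sql.SparkSession

// Sketch only: Spark retries a failed task up to spark.task.maxFailures
// times (default 4). Raising it gives flaky tasks more attempts, but a
// task that still fails after all retries fails its stage, and a failed
// stage fails the job. There is no setting that lets the job commit the
// output of the successful tasks while discarding the failed ones.
object RetryBudgetSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("retry-budget-sketch")              // hypothetical app name
      .config("spark.task.maxFailures", "8")       // more retries, not partial success
      .getOrCreate()

    // ... your Spark SQL job here ...

    spark.stop()
  }
}
```
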
