How can I get the jobId of current SparkContext?


All the other questions seem to address getting the Spark applicationId. I want to cancel a Spark job programmatically, which requires the jobId:

spark.sparkContext.cancelJob(jobId)
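
For reference, here is a minimal sketch of the setup I have in mind, assuming the job IDs can be looked up through the SparkContext status tracker and that tagging the work with a job group is acceptable (the group name, the background workload, and the sleep are illustrative placeholders):

import threading
import time
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("cancel-demo").getOrCreate()
sc = spark.sparkContext

def run_workload():
    # Tag this thread's jobs with a group so their job IDs can be found later
    sc.setJobGroup("my-job-group", "long running demo job")
    sc.parallelize(range(10 ** 8)).count()

worker = threading.Thread(target=run_workload)
worker.start()

time.sleep(2)  # give the job a moment to get scheduled (illustrative only)

# Job IDs belonging to the group, via the status tracker
print(sc.statusTracker().getJobIdsForGroup("my-job-group"))

# Cancelling by group avoids needing the individual job IDs
sc.cancelJobGroup("my-job-group")
worker.join()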

CodePudding user response:

You can get it in a way similar to the following:

sc.applicationId
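
A minimal usage sketch (the app name is arbitrary; note that this returns the application ID, which is different from a job ID):

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("app-id-demo").getOrCreate()
sc = spark.sparkContext
print(sc.applicationId)  # e.g. 'local-1699876543210' when running locally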

CodePudding user response:

You can use the code logic below for this use case.

Step-01: get the job details.

import requests
import json

# Simple auth helper that attaches a Databricks bearer token to each request
class BearerAuth(requests.auth.AuthBase):
    def __init__(self, token):
        self.token = token

    def __call__(self, r):
        r.headers["authorization"] = "Bearer " + self.token
        return r

# List the jobs in the workspace (Jobs API 2.0)
response = requests.get('https://databricksinstance/api/2.0/jobs/list',
                        auth=BearerAuth('token')).json()
print(response)

Step-02: cancel the job with a REST API call.

Use the same code, just change the URL to:

https://<databricks-instance>/api/2.1/jobs/runs/cancel
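
A sketch of that call, reusing the BearerAuth helper from Step-01; the instance URL, token, and run_id are placeholders, and the run_id of the run to cancel would typically come from the jobs/runs/list endpoint:

# Cancel a specific run (Jobs API 2.1 runs/cancel expects a run_id in the body)
cancel_response = requests.post(
    'https://databricksinstance/api/2.1/jobs/runs/cancel',
    auth=BearerAuth('token'),
    json={'run_id': 123456},
)
print(cancel_response.status_code)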

ref: link
