How to kill a Spark application

Time: 09-21

Our project needs to monitor some Spark tasks; tasks with too much backlog should be killed.
Could someone please tell me how to kill a submitted application?

CodePudding user response:

The Spark REST client has a function to stop a job.

CodePudding user response:

I've looked at some of the official documentation; there is no function to stop a job. Spark does have a hidden REST API, is that what you're referring to?




================================>>
The Spark REST client has a function to stop a job.
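The "hidden" REST API mentioned here is the standalone cluster's REST submission gateway, which exposes a kill endpoint for drivers. Below is a minimal sketch; the host name, port 6066 (the standalone REST gateway default), and the submission ID are placeholder assumptions, and the gateway must be enabled on your master for this to work.

```python
# Sketch: kill a driver via the standalone REST submission API.
# Host, port, and submission ID below are placeholders (assumptions).
import json
import urllib.request

def kill_url(master_host: str, submission_id: str, port: int = 6066) -> str:
    """Build the kill endpoint URL of the standalone REST submission API."""
    return f"http://{master_host}:{port}/v1/submissions/kill/{submission_id}"

def kill_driver(master_host: str, submission_id: str) -> dict:
    """POST a kill request and return the decoded JSON response."""
    req = urllib.request.Request(kill_url(master_host, submission_id),
                                 method="POST")
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

# Example call (placeholder IDs, needs a reachable master):
# kill_driver("spark-master", "driver-20200921123456-0001")
```

Note this only works for drivers that were submitted through the same REST gateway, which matches the limitation discussed later in this thread.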

CodePudding user response:

Spark has a command that can kill an application directly; you'll need to look up the exact command yourself, but it does exist.

CodePudding user response:

Quoting shadon178's reply (3rd floor):
Spark has a command that can kill an application directly; you'll need to look up the exact command yourself, but it does exist.

The method I've seen for killing a Spark application requires knowing the driver ID, which only applies to tasks submitted via REST. In our project, we submit everything with scripts.

CodePudding user response:

Use yarn application -list to list all tasks;
then use yarn application -kill <AppId>.
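For script-submitted jobs running on YARN, the two commands above can be wrapped in a small monitor script. A sketch, assuming yarn is on the PATH and the default tab-separated -list output; which applications count as "backlogged" is left to your own selection logic:

```python
# Sketch: extract AppIds from `yarn application -list` output and kill them.
import re
import subprocess

# YARN application IDs look like application_<clusterTimestamp>_<sequence>.
APP_ID_RE = re.compile(r"application_\d+_\d+")

def parse_app_ids(listing: str) -> list:
    """Extract application IDs from `yarn application -list` output."""
    return APP_ID_RE.findall(listing)

def kill_app(app_id: str) -> None:
    """Invoke `yarn application -kill <AppId>` (requires yarn on PATH)."""
    subprocess.run(["yarn", "application", "-kill", app_id], check=True)

# Usage on a real cluster (commented out; add your own backlog filter):
# out = subprocess.run(["yarn", "application", "-list"],
#                      capture_output=True, text=True).stdout
# for app_id in parse_app_ids(out):
#     kill_app(app_id)
```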

CodePudding user response:

One idea: watch the request the UI sends when you click "kill", find the API behind that call, and then call that API yourself.
I'm not sure whether it works, though.

CodePudding user response:

/stages/stage/kill/?id=1701
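That path is the endpoint the Spark UI's own "kill" link calls to kill a stage. A sketch of building and issuing the request; the host, port 4040 (the driver UI default), and stage id 1701 are placeholders, and spark.ui.killEnabled must be true on the driver:

```python
# Sketch: call the Spark UI's stage-kill endpoint directly.
# Host, port (4040 = driver UI default), and stage id are placeholders.
import urllib.request

def stage_kill_url(ui_host: str, stage_id: int, port: int = 4040) -> str:
    """Build the UI stage-kill URL, matching the path seen in the thread."""
    return f"http://{ui_host}:{port}/stages/stage/kill/?id={stage_id}"

# Issue the request against a live driver UI (commented out here):
# urllib.request.urlopen(stage_kill_url("driver-host", 1701))
```

Note this kills a single stage, not the whole application; for the latter, the YARN or cluster-UI approaches in the other replies are more appropriate.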

CodePudding user response:

You can go directly to the Spark cluster's web UI, find the corresponding Application, and click "kill".

CodePudding user response:

Quoting the 2nd-floor reply from Forrestleo:
I've looked at some of the official documentation; there is no function to stop a job. Spark does have a hidden REST API, is that what you're referring to?




================================>>
The Spark REST client has a function to stop a job.



If Spark has a hidden REST API, why can't it be found anywhere on the Apache website?