The resources applied by YARN are not active, Will other jobs use them?

Time:09-22

I run a Spark job on YARN that allocates a lot of resources, and the job runs for a long time.

Toward the end of the job, only one or two cores are actually active.

I want to know whether the idle resources can be used by other Spark or MapReduce jobs.

Or do the resources only become usable by other jobs once the first Spark job completes?

CodePudding user response:

It depends on your queueing policy and the scheduler specified for each queue.

I assume you just have a single default queue (root) to which all your jobs are submitted. In that case, the default scheduler is a FIFO scheduler, which will schedule a new job only once the earlier submitted job completes.

If that's not the case, you can check your queues and their specified schedulers in the etc/hadoop/capacity-scheduler.xml file.
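As a sketch, a minimal capacity-scheduler.xml with a single queue might look like the following. The queue name and capacity value here are illustrative, not taken from the question; the property names are standard CapacityScheduler keys.

```xml
<configuration>
  <!-- Child queues under root; here a single "default" queue (illustrative). -->
  <property>
    <name>yarn.scheduler.capacity.root.queues</name>
    <value>default</value>
  </property>
  <!-- The default queue gets 100% of the cluster capacity. -->
  <property>
    <name>yarn.scheduler.capacity.root.default.capacity</name>
    <value>100</value>
  </property>
  <!-- How applications within this queue are ordered: fifo (the default) or fair. -->
  <property>
    <name>yarn.scheduler.capacity.root.default.ordering-policy</name>
    <value>fifo</value>
  </property>
</configuration>
```

With `ordering-policy` set to `fair` instead of `fifo`, applications in the same queue share the queue's resources rather than running strictly one after another.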

More information on the two types of schedulers:

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/CapacityScheduler.html

https://hadoop.apache.org/docs/current/hadoop-yarn/hadoop-yarn-site/FairScheduler.html
