Why would you want to use multiple executors in Spark?

Time:10-08

As the title says: in YARN mode, wouldn't it be more convenient to put all the tasks in a single executor? Why split them across multiple executors, and what does that gain?
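(For reference, the executor count is an explicit setting in YARN mode. A minimal sketch below; the sizes are placeholder assumptions, not recommendations.)

```scala
import org.apache.spark.sql.SparkSession

// Hypothetical sizing: 4 executors, each a separate JVM with 2 task slots.
// Setting spark.executor.instances to 1 would force every task through one JVM.
val spark = SparkSession.builder()
  .appName("multi-executor-example")
  .master("yarn")
  .config("spark.executor.instances", "4")   // number of executor JVMs
  .config("spark.executor.cores", "2")       // concurrent task threads per executor
  .config("spark.executor.memory", "4g")     // heap per executor
  .getOrCreate()
```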

CodePudding user response:

Distribution gives you high concurrency and avoids a single point of failure (high availability), and it works around the performance bottleneck of a single machine; the list of reasons goes on.

CodePudding user response:

A single executor may simply not be able to hold everything...

CodePudding user response:

"Executor" literally means "the one who executes": each executor runs the tasks assigned to it, and the results are then sent back to the driver to be summarized.
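A small sketch of that division of labor, assuming a trivial job (names and sizes are illustrative):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("who-runs-what").getOrCreate()

// The map function runs as tasks on the executors, one task per partition.
val upper = spark.sparkContext
  .parallelize(Seq("spark", "yarn", "executor"), numSlices = 3)
  .map(_.toUpperCase)

// collect() pulls the per-task results back to the driver for the "summary".
val onDriver: Array[String] = upper.collect()
println(onDriver.mkString(", "))
```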

CodePudding user response:

An executor is a JVM process, and each task runs as a thread inside the executor, so tasks share the executor's resources; Spark uses a multithreaded model. Compare this with MapReduce, where each task writes to disk and releases its resources when it finishes. In Spark, the memory resources of an executor are not released after a task completes and can be used directly by the next task threads in the same executor; resources are also divided up per executor. As for why you would want multiple executors, ask yourself why you would want multiple processes at all.
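To make the in-executor reuse concrete, a hedged sketch (the data and partition count are assumptions for illustration):

```scala
import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("in-executor-reuse").getOrCreate()

// Cached partitions live inside each executor's JVM. The second action below
// is served from that in-memory data by task threads in the same executors,
// rather than re-reading and re-writing disk the way a chained MapReduce job would.
val squares = spark.sparkContext
  .parallelize(1 to 100000, numSlices = 8)
  .map(x => x.toLong * x)
  .cache()

val total = squares.reduce(_ + _)   // first action: computes and caches
val count = squares.count()         // second action: reuses the cached blocks
println(s"sum=$total count=$count")
```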