Spark 1.2.0 master-worker communication problem

Time: 09-30

I deployed a Spark 1.2.0 cluster with one master and three workers. Starting the cluster with start-all.sh works fine, and I can see the workers' state on the web UI.
But whenever I submit a task to the cluster or start spark-shell, the master keeps logging errors like the following:
[ERROR] [Logging.scala:75] logError: Asked to remove non-existent executor 0
[ERROR] [Logging.scala:75] logError: Asked to remove non-existent executor 1
[ERROR] [Logging.scala:75] logError: Asked to remove non-existent executor 2
[ERROR] [Logging.scala:75] logError: Asked to remove non-existent executor 3
...
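For reference, these are the kinds of commands I run against the cluster (the master URL, application class, and jar name here are placeholders):

    # submit a job to the standalone master
    ./bin/spark-submit --master spark://master-host:7077 --class MyApp myapp.jar
    # or start an interactive shell against the cluster
    ./bin/spark-shell --master spark://master-host:7077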

The error log on the worker nodes shows:
[ERROR] [Logging.scala:96] logError: Error running executor
java.io.IOException: Cannot run program "/bin/java" (in directory "/usr/local/spark-1.2.0/work/app-20150113194629-0001/9"): error=2, No such file or directory
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1048)
        at org.apache.spark.deploy.worker.ExecutorRunner.fetchAndRunExecutor(ExecutorRunner.scala:135)
        at org.apache.spark.deploy.worker.ExecutorRunner$$anon$1.run(ExecutorRunner.scala:65)
Caused by: java.io.IOException: error=2, No such file or directory
        at java.lang.UNIXProcess.forkAndExec(Native Method)
        at java.lang.UNIXProcess.<init>(UNIXProcess.java:187)
        at java.lang.ProcessImpl.start(ProcessImpl.java:134)
        at java.lang.ProcessBuilder.start(ProcessBuilder.java:1029)
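A quick sanity check, run on a worker node, confirms what the error is complaining about (note that an interactive shell may show a different JAVA_HOME than the one the worker daemon was started with):

    # the exact path from the error message; it does not exist on most Linux systems
    ls -l /bin/java
    # what JAVA_HOME looks like in this shell; empty brackets mean it is unset
    echo "JAVA_HOME=[$JAVA_HOME]"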

I have Googled for a long time without finding a similar problem. I hope someone can help me.
Thank you.

CodePudding user response:

Resolved
It turned out not to be a communication problem after all. Looking at the log, it says that /bin/java cannot be executed in the specified location (SPARK_HOME/work/app_xxxxxxx/0), which immediately felt like an environment-configuration problem. I went back and checked SPARK_HOME/conf/spark-env.sh: the first line of the file configures JAVA_HOME=${JAVA_HOME}. I had assumed this would pick up JAVA_HOME from the system settings, but once I printed it out I found that it does not (I had installed the JDK myself on both CentOS and Mac OS); for some reason this configuration is invalid in Spark.
After commenting out the JAVA_HOME line in spark-env.sh, Spark starts normally and Spark tasks can be submitted.
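For anyone who hits the same thing, a plausible mechanism: spark-env.sh is sourced in a non-interactive shell where JAVA_HOME is not yet set, so JAVA_HOME=${JAVA_HOME} assigns an empty string, and the worker then tries to launch executors with "/bin/java", exactly the path in the error above. A minimal spark-env.sh sketch of the two workarounds (the JDK path below is only an example; substitute the real install location, which must exist on the master and on every worker):

    # SPARK_HOME/conf/spark-env.sh

    # Workaround 1 (what worked above): comment out the self-referencing line
    # export JAVA_HOME=${JAVA_HOME}

    # Workaround 2: set an explicit, absolute JDK path instead
    export JAVA_HOME=/usr/java/jdk1.7.0_71

After editing spark-env.sh, restart the cluster (sbin/stop-all.sh, then sbin/start-all.sh) so the workers pick up the new environment.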

CodePudding user response:

Bumping this; bookmarking it for future reference.