The spark installation problem

Time:09-27

Problem: I installed Spark 1.5.1, but it throws an error on startup. Could someone help me figure out what is wrong?

Error message:
[root@Master sbin]# ./start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
failed to launch org.apache.spark.deploy.master.Master:
/usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
Error: Could not find or load main class org.apache.spark.deploy.master.Master
full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
Slave2.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
Slave2.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave2.Hadoop: Error: Could not find or load main class org.apache.spark.launcher.Main
Slave2.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Master.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Master.Hadoop: /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../bin/spark-class: line 78: cygpath: command not found
Master.Hadoop: Error: Could not find or load main class org.apache.spark.deploy.worker.Worker
Master.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Master.Hadoop.out
Slave1.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave1.Hadoop: Error: Could not find or load main class org.apache.spark.launcher.Main
Slave1.Hadoop: full log in /usr/local/spark/spark-1.5.1-bin-hadoop2.6/sbin/../logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out

Environment:
export JAVA_HOME=/usr/jdk1.7.0_05
export JRE_HOME=$JAVA_HOME/jre
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib:$CLASSPATH
export JAVA_OPTS="-Xms512M -Xmx1024M"
export CATALINA_OPTS="-Djava.awt.headless=true"
HADOOP_HOME=/opt/spark/hadoop-2.7.1
SCALA_HOME=/usr/local/scala/scala-2.11.7
SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$HADOOP_HOME/bin:$HADOOP_HOME/sbin:$SCALA_HOME/bin:$SPARK_HOME/bin:$PATH

export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"

CodePudding user response:

Did you find a solution? I am hitting the same problem; any guidance would be appreciated.

CodePudding user response:

"Failed to launch org.apache.spark.deploy.master.Master" means your master never came up, so everything after it fails as well.

CodePudding user response:

Try adding the path of the Spark jar package to your CLASSPATH.
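As a sketch of that suggestion (the assembly jar name below assumes the stock spark-1.5.1-bin-hadoop2.6 download; check `ls $SPARK_HOME/lib` for the exact file name in your installation):

```shell
# Sketch: put the Spark assembly jar on CLASSPATH.
# The jar name is an assumption based on the standard 1.5.1 binary layout.
export SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export CLASSPATH=$CLASSPATH:$SPARK_HOME/lib/spark-assembly-1.5.1-hadoop2.6.0.jar
```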

CodePudding user response:

OP, did you ever solve this problem? If so, how? Thanks in advance.

CodePudding user response:

Add $SPARK_HOME/sbin to your PATH.
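That suggestion as a sketch for ~/.bashrc, reusing the SPARK_HOME path from the environment above:

```shell
# Sketch: put Spark's sbin directory on PATH so start-all.sh and the
# other daemon scripts resolve from any working directory.
export SPARK_HOME=/usr/local/spark/spark-1.5.1-bin-hadoop2.6
export PATH=$SPARK_HOME/sbin:$PATH
```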