Spark fails to start

Time: 09-23

Spark 1.6.0 fails to start. I ran into the problem below; please help.

[root@Master spark-1.6.0-bin-hadoop2.6]# sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
failed to launch org.apache.spark.deploy.master.Master:
  /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.master.Master-1-Master.Hadoop.out
Slave1.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
Slave2.Hadoop: starting org.apache.spark.deploy.worker.Worker, logging to /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Slave2.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
Slave1.Hadoop: failed to launch org.apache.spark.deploy.worker.Worker:
  /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
  /home/softwares/spark-1.6.0-bin-hadoop2.6/bin/spark-class: line 86: /home/softwares/jdk1.7.0_79/bin/java: No such file or directory
Slave2.Hadoop: full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave2.Hadoop.out
Slave1.Hadoop: full log in /home/softwares/spark-1.6.0-bin-hadoop2.6/logs/spark-root-org.apache.spark.deploy.worker.Worker-1-Slave1.Hadoop.out
[root@Master spark-1.6.0-bin-hadoop2.6]#

The configuration file (spark-env.sh):

export JAVA_HOME=/home/softwares/jdk1.7.0_79
export SCALA_HOME=/home/softwares/scala-2.10.6
export HADOOP_HOME=/home/softwares/hadoop-server
export HADOOP_CONF_DIR=${HADOOP_HOME}/etc/hadoop
export SPARK_MASTER_IP=59.67.152.31
export SPARK_WORKER_MEMORY=2g
export MASTER=spark://59.67.152.31
export SPARK_EXECUTOR_MEMORY=2g
export SPARK_DRIVER_MEMORY=2g
export SPARK_WORKER_CORES=2
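The "No such file or directory" error above means spark-class could not resolve the java binary under the configured JAVA_HOME. As a sketch (not part of the original post; the helper name and demo file paths are made up), one way to catch this class of problem is to flag every exported absolute path in spark-env.sh that does not actually exist on disk:

```shell
# check_env_paths FILE: print every `export VAR=/abs/path` line in FILE
# whose path does not exist on disk. A stray or invisible character in
# the path (as in this thread) will make that path fail the check.
check_env_paths() {
  grep -E '^export [A-Za-z_]+=/' "$1" | while IFS='=' read -r var path; do
    [ -e "$path" ] || echo "missing: ${var#export } -> $path"
  done
}

# Demo against a throwaway file with one good and one bad path:
cat > /tmp/spark-env-demo.sh <<'EOF'
export JAVA_HOME=/tmp
export SCALA_HOME=/no/such/dir
EOF
check_env_paths /tmp/spark-env-demo.sh   # reports only SCALA_HOME
```

Running this against the real spark-env.sh would have pointed at JAVA_HOME immediately.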

The environment variables:
export JAVA_HOME=/home/softwares/jdk1.7.0_79
export HADOOP_HOME=/home/softwares/hadoop-server
export SCALA_HOME=/home/softwares/scala-2.10.6
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=$HADOOP_HOME/lib"
export HIVE_HOME=/home/softwares/apache-hive-1.2.1-bin
export HIVE_CONF_DIR=${HIVE_HOME}/conf
export SPARK_HOME=/home/softwares/spark-1.6.0-bin-hadoop2.6
export FLUME_HOME=/home/softwares/apache-flume-1.6.0-bin
export FLUME_CONF_DIR=$FLUME_HOME/conf
export PATH=$PATH:$JAVA_HOME/bin:$HIVE_HOME/bin:$FLUME_HOME/bin:$SCALA_HOME/bin:${SPARK_HOME}/bin:${SPARK_HOME}/sbin
export CLASSPATH=.:$CLASSPATH:$HIVE_HOME/lib:$SPARK_HOME/lib



CodePudding user response:

Opening the configuration file with vi (vi spark-env.sh) revealed unknown characters in it that were not visible when the file was opened normally. I don't know where those stray characters came from, but after deleting them and restarting, Spark started successfully.
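A common source of such invisible characters is Windows-style CRLF line endings picked up when the file was edited on another machine. A small sketch of how to spot and strip them (assuming GNU cat and sed; the demo file name is made up):

```shell
# Simulate a spark-env.sh line saved with a Windows CRLF line ending.
printf 'export JAVA_HOME=/home/softwares/jdk1.7.0_79\r\n' > /tmp/spark-env-crlf.sh

# `cat -A` makes non-printing characters visible: a CRLF line ends in "^M$".
cat -A /tmp/spark-env-crlf.sh

# Strip the carriage returns in place (what dos2unix would do).
sed -i 's/\r$//' /tmp/spark-env-crlf.sh
cat -A /tmp/spark-env-crlf.sh   # now ends in a plain "$"
```

In vi itself, such a file typically shows `^M` at the end of each line, or `[dos]` in the status line; `:set ff=unix` followed by `:wq` is another way to fix it.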