Spark cluster startup throws TimeoutException: what is going on?

Time:10-01

I built a Spark cluster and started it. Right after startup, jps shows the Master and Worker processes on the nodes, but about one minute later both processes are gone. The log is:
Spark assembly has been built with Hive, including Datanucleus jars on classpath
Spark Command:/usr/lib/java/jdk1.7.0_71/bin/java -cp ::/usr/local/spark/spark-1.0.0-bin-hadoop1/conf:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/spark-assembly-1.0.0-hadoop1.0.4.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-rdbms-3.2.1.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-api-jdo-3.2.1.jar:/usr/local/spark/spark-1.0.0-bin-hadoop1/lib/datanucleus-core-3.2.2.jar:/usr/local/hadoop/hadoop-1.2.1/conf -XX:MaxPermSize=128m -Dspark.akka.logLifecycleEvents=true -Xms512m -Xmx512m org.apache.spark.deploy.master.Master --ip 192.168.81.128 --port 7077 --webui-port 8080
========================================

14/12/03 11:34:55 INFO spark.SecurityManager: Changing view acls to: root
14/12/03 11:34:55 INFO spark.SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root)
14/12/03 11:35:44 INFO slf4j.Slf4jLogger: Slf4jLogger started
14/12/03 11:35:48 INFO Remoting: Starting remoting
Exception in thread "main" java.util.concurrent.TimeoutException: Futures timed out after [10000 milliseconds]
	at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:219)
	at scala.concurrent.impl.Promise$DefaultPromise.result(Promise.scala:223)
	at scala.concurrent.Await$$anonfun$result$1.apply(package.scala:107)
	at scala.concurrent.BlockContext$DefaultBlockContext$.blockOn(BlockContext.scala:53)
	at scala.concurrent.Await$.result(package.scala:107)
	at akka.remote.Remoting.start(Remoting.scala:173)
	at akka.remote.RemoteActorRefProvider.init(RemoteActorRefProvider.scala:184)
	at akka.actor.ActorSystemImpl._start$lzycompute(ActorSystem.scala:579)
	at akka.actor.ActorSystemImpl._start(ActorSystem.scala:577)
	at akka.actor.ActorSystemImpl.start(ActorSystem.scala:588)
	at akka.actor.ActorSystem$.apply(ActorSystem.scala:111)
	at akka.actor.ActorSystem$.apply(ActorSystem.scala:104)
	at org.apache.spark.util.AkkaUtils$.createActorSystem(AkkaUtils.scala:104)
	at org.apache.spark.deploy.master.Master$.startSystemAndActor(Master.scala:785)
	at org.apache.spark.deploy.master.Master$.main(Master.scala:765)
	at org.apache.spark.deploy.master.Master.main(Master.scala)
Hoping someone can help me solve this.
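One hedged observation, not from the thread itself: the long gap between the SecurityManager and Slf4jLogger log lines, followed by Remoting timing out, is a common signature of slow or failing hostname resolution on the master. A minimal sketch of the kind of check worth running before digging into Spark (the `check_resolves` helper is invented for illustration, not part of Spark):

```shell
#!/bin/sh
# Sketch: verify that names resolve locally before blaming Spark itself.
# check_resolves is a made-up helper; it prints "ok" if the name resolves
# via the system resolver (/etc/hosts or DNS) and "missing" otherwise.
check_resolves() {
  if getent hosts "$1" > /dev/null 2>&1; then
    echo "ok"
  else
    echo "missing"
  fi
}

check_resolves localhost       # on a healthy box this prints: ok
check_resolves "$(hostname)"   # "missing" here points at an /etc/hosts problem
```

If the machine's own hostname does not resolve, Akka's remoting layer can stall on startup until its internal timeout fires, which matches the `Futures timed out after [10000 milliseconds]` line above.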

CodePudding user response:

What is your Hadoop version?

CodePudding user response:

Quoting wulinshishen's reply on the 1st floor:
What is your Hadoop version?
Hadoop is 1.2.1, Spark is 1.0.0.
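A sketch of settings that are commonly adjusted for this class of startup timeout, not a confirmed fix for this thread. The IP and port are taken from the launch command in the question; the hostname `master` in the /etc/hosts comment is an assumption:

```shell
# conf/spark-env.sh on the master node (Spark 1.x style configuration).
# Pin the bind address explicitly so Akka does not wait on name lookups.
export SPARK_MASTER_IP=192.168.81.128
export SPARK_MASTER_PORT=7077

# Also make sure /etc/hosts maps the machine's hostname to its real
# address rather than only to 127.0.0.1, e.g. (hostname is hypothetical):
#   192.168.81.128  master
```

`SPARK_MASTER_IP` and `SPARK_MASTER_PORT` are standard Spark 1.x environment variables; after changing them, restart the cluster with `sbin/stop-all.sh` and `sbin/start-all.sh` and re-check jps.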