The slaves file is set to 10.0.15.105 and 10.0.15.106.
spark-env.sh is set to:

export SCALA_HOME=/app/spark/scala-2.10.3
export HADOOP_HOME=/app/hadoop-2.2.0
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
SPARK_WORKER_INSTANCES=3
SPARK_MASTER_PORT=8081
SPARK_MASTER_WEBUI_PORT=8090
SPARK_WORKER_PORT=8091
SPARK_MASTER_IP=10.0.15.104
SPARK_WORKER_DIR=/app/spark/spark-0.9.1-bin-hadoop2/worker
But when I create a JavaSparkContext as follows, it always fails:

String master = "spark://10.0.15.104:8081";
String sparkHome = "/app/spark/spark-0.9.1-bin-hadoop2";
String appName = "JavaWordCount";
String[] jarArray = JavaSparkContext.jarOfClass(WordCount.class);
JavaSparkContext ctx = new JavaSparkContext(master, appName, sparkHome, jarArray);
When I submit it, I get the error below. I have no idea where the address /220.250.64.18:0 comes from; I never configured that IP anywhere.
Hoping an expert can point me in the right direction.
Exception in thread "main" org.jboss.netty.channel.ChannelException: Failed to bind to: /220.250.64.18:0
	at org.jboss.netty.bootstrap.ServerBootstrap.bind(ServerBootstrap.java:272)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:391)
	at akka.remote.transport.netty.NettyTransport$$anonfun$listen$1.apply(NettyTransport.scala:388)
	at scala.util.Success$$anonfun$map$1.apply(Try.scala:206)
	at scala.util.Try$.apply(Try.scala:161)
	at scala.util.Success.map(Try.scala:206)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.Future$$anonfun$map$1.apply(Future.scala:235)
	at scala.concurrent.impl.CallbackRunnable.run(Promise.scala:32)
	at akka.dispatch.BatchingExecutor$$anonfun$runBatch$1.processBatch$1(BatchingExecutor.scala:67)
	at akka.dispatch.BatchingExecutor$$anonfun$runBatch$1.apply$mcV$sp(BatchingExecutor.scala:82)
	at akka.dispatch.BatchingExecutor$$anonfun$runBatch$1.apply(BatchingExecutor.scala:59)
	at akka.dispatch.BatchingExecutor$$anonfun$runBatch$1.apply(BatchingExecutor.scala:59)
	at scala.concurrent.BlockContext$.withBlockContext(BlockContext.scala:72)
	at akka.dispatch.BatchingExecutor$Batch.run(BatchingExecutor.scala:58)
	at akka.dispatch.TaskInvocation.run(AbstractDispatcher.scala:42)
	at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
	at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
	at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
	at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
	at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.net.BindException: Cannot assign requested address
	at sun.nio.ch.Net.bind(Native Method)
	at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:124)
	at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:59)
	at org.jboss.netty.channel.socket.nio.NioServerBoss$RegisterTask.run(NioServerBoss.java:193)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.processTaskQueue(AbstractNioSelector.java:366)
	at org.jboss.netty.channel.socket.nio.AbstractNioSelector.run(AbstractNioSelector.java:290)
	at org.jboss.netty.channel.socket.nio.NioServerBoss.run(NioServerBoss.java:42)
	at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:895)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:918)
	at java.lang.Thread.run(Thread.java:662)
CodePudding user response:
export SPARK_MASTER_IP=localhost
export SPARK_LOCAL_IP=localhost
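A sketch of how those variables could go into conf/spark-env.sh. Note the localhost values only make sense for a single-machine test; on your cluster the equivalent would presumably be the address the machine actually owns (10.0.15.104 in your setup), which is an assumption on my part:

```shell
# In conf/spark-env.sh on the driver/master machine.
# SPARK_LOCAL_IP controls which interface the driver binds to;
# the bind fails when it resolves to an address this machine
# does not own (such as /220.250.64.18 in your error).
export SPARK_MASTER_IP=localhost   # or 10.0.15.104 on the real cluster
export SPARK_LOCAL_IP=localhost    # or 10.0.15.104 on the real cluster
```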
CodePudding user response:
My advice: when you connect to the master, use its hostname rather than a raw IP whenever possible, or you may not be able to connect at all.
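As for where /220.250.64.18:0 likely comes from: the driver binds to whatever address the machine's own hostname resolves to. A quick check like the sketch below (the class name `CheckHostname` is mine, not from Spark) shows that resolution; if it prints an address the machine does not actually own, fix the hostname-to-IP mapping in /etc/hosts:

```java
import java.net.InetAddress;

// Prints the local hostname and the address it resolves to.
// If the printed address is not bound to any local interface
// (e.g. a public IP like 220.250.64.18 left over in DNS or
// /etc/hosts), the driver will fail with "Failed to bind to"
// exactly that address.
public class CheckHostname {
    public static void main(String[] args) throws Exception {
        InetAddress local = InetAddress.getLocalHost();
        System.out.println(local.getHostName() + " -> " + local.getHostAddress());
    }
}
```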