val spark = new SparkContext("spark://miluo1:7077", "Spark Pi", "/usr/spark-1.3.1")
spark.addJar("C:\\Users\\root\\Desktop\\IO.jar")
val sc = spark.textFile("file:/root/2.txt")
var sss = sc.first()
println(sss)
spark.stop()
That is the code above. I run it directly from Eclipse on Windows (with the Scala plugin), submitting to the remote cluster, but it cannot read the file. If I change spark://miluo1:7077 to local, there is no problem.
Below are the errors:
1. The Eclipse error:
15/04/29 10:45:59 INFO SparkContext: Created broadcast 0 from textFile at SparkJava.java:21
Exception in thread "main" org.apache.hadoop.mapred.InvalidInputException: Input path does not exist: file:/root/2.txt
	at org.apache.hadoop.mapred.FileInputFormat.singleThreadedListStatus(FileInputFormat.java:285)
	at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:228)
	at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
	at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:203)
2. The error logged on the worker node:
15/04/29 10:23:49 ERROR FileAppender: Error writing stream to file /usr/spark-1.3.1/work/app-20150429102347-0046/0/stderr
java.io.IOException: Stream closed
	at java.io.BufferedInputStream.getBufIfOpen(BufferedInputStream.java:162)
	at java.io.BufferedInputStream.read1(BufferedInputStream.java:272)
	at java.io.BufferedInputStream.read(BufferedInputStream.java:334)
	at java.io.FilterInputStream.read(FilterInputStream.java:107)
	at org.apache.spark.util.logging.FileAppender.appendStreamToFile(FileAppender.scala:70)
	at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply$mcV$sp(FileAppender.scala:39)
	at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
	at org.apache.spark.util.logging.FileAppender$$anon$1$$anonfun$run$1.apply(FileAppender.scala:39)
	at org.apache.spark.util.Utils$.logUncaughtExceptions(Utils.scala:1618)
	at org.apache.spark.util.logging.FileAppender$$anon$1.run(FileAppender.scala:38)
15/04/29 10:23:49 INFO Worker: Executor app-20150429102347-0046/0 finished with state KILLED exitStatus 143
15/04/29 10:23:49 INFO Worker: Cleaning up local directories for application app-20150429102347-0046
I have never done a remote submit before, and I am also preparing to integrate Hadoop remote job submission into a web project. I hope people here can give me some pointers.
CodePudding user response:
Check your Spark port. The 7077 should match the port in your Spark configuration file, so look at the port set there. I ran into this same problem because I had written the wrong Spark port.
CodePudding user response:
Of course it can't read it. You need to put the file /root/2.txt on every worker node.
CodePudding user response:
Did you ever solve this problem? I have run into it too.
CodePudding user response:
I later discovered that textFile is supposed to read the local file, that is, the file on your Windows machine, which then gets shipped to the cluster along with the submission. It is not the case that you submit to the cluster and each node then reads the file from its own local path.
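Editor's note: the thread above hinges on where a file:/ path is resolved. A file: URI carries no way to reach the machine that wrote it; it is always dereferenced against the local disk of whichever JVM opens it, which for Spark's textFile means each executor's own filesystem. A minimal plain-Scala sketch of that semantics (no Spark required; the path /root/2.txt is taken from the question and will typically not exist on a Windows driver machine):

val spark = {
  import java.net.URI
  import java.nio.file.{Files, Paths}

  // A file: URI names a path on whichever machine dereferences it.
  val path = Paths.get(new URI("file:/root/2.txt"))
  // False on the submitting Windows machine, and false on any worker
  // that lacks the file -- which is exactly the InvalidInputException
  // in the Eclipse log above.
  println(s"$path exists on this machine: ${Files.exists(path)}")
}

So the two standard fixes are the ones the responses suggest: either copy /root/2.txt to the same path on every worker node, or put the file on storage that every node can reach (for example HDFS) and pass an hdfs:// URL to textFile instead.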