Spark jar read HDFS file error

Time:09-26

Exception in thread "main" org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 0.0 failed 4 times, most recent failure: Lost task 0.3 in stage 0.0 (TID 3, 10.10.10.154): org.apache.hadoop.HadoopIllegalArgumentException: The short-circuit local reads feature is enabled but dfs.domain.socket.path is not set.


Caused by: org.apache.hadoop.HadoopIllegalArgumentException: The short-circuit local reads feature is enabled but dfs.domain.socket.path is not set.

Looking for a solution.

CodePudding user response:

It's an ordinary WordCount example, written in IntelliJ IDEA and packaged into a jar, submitted with:

bin/spark-submit \
--class WCount \
--master spark://is-nn-01:8888 \
--name wordcountByScala \
--executor-memory 1g \
--total-executor-cores 2 \
/spark_test/WordCount.jar \
hdfs://is-nn-01:8020/spark_test/tst.txt

After submitting, the error above is reported.

CodePudding user response:

In hdfs-site.xml,
dfs.domain.socket.path
is set to /var/run/hdfs-sockets/dn.
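For reference, short-circuit local reads are normally declared in hdfs-site.xml as a pair of properties; the socket path below matches the one quoted above:

```xml
<!-- enable short-circuit local reads and point clients at the
     Unix domain socket created by the DataNode -->
<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value>
</property>
<property>
  <name>dfs.domain.socket.path</name>
  <value>/var/run/hdfs-sockets/dn</value>
</property>
```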

It is already set,
but the dn file at that path cannot be opened. Could that be related?
If it is, what should a normal dn file be, and what should it look like?
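On the question of what the dn file should look like: dfs.domain.socket.path points at a Unix domain socket that the DataNode creates, so it is not a regular file you can open or read. A small demo (hypothetical, not the actual DataNode socket) that creates one the same way and checks its file type:

```python
import os
import socket
import stat
import tempfile

# Create a Unix domain socket the way a daemon such as the DataNode
# would, then stat the resulting file on disk.
path = os.path.join(tempfile.mkdtemp(), "dn")
srv = socket.socket(socket.AF_UNIX, socket.SOCK_STREAM)
srv.bind(path)  # binding creates the socket file at `path`

# A socket shows up with file type 's' in `ls -l`; S_ISSOCK confirms it.
print(stat.S_ISSOCK(os.stat(path).st_mode))  # True

srv.close()
os.unlink(path)
```

So a healthy dn entry under /var/run/hdfs-sockets/ should list with type `s` in `ls -l` and be owned by the user the DataNode runs as.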

CodePudding user response:

Have the worker nodes also been configured with dfs.domain.socket.path?

CodePudding user response:

All nodes are configured

CodePudding user response:

spark-submit has a parameter --properties-file FILE: "Path to a file from which to load extra properties. If not specified, this will look for conf/spark-defaults.conf." On my machine the entries in that file had been temporarily commented out, so the system could not see the parameters, which led to the problem above.
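One way to make sure executors see the Hadoop client settings is to pass them through spark-defaults.conf (or a file given with --properties-file): Spark forwards any spark.hadoop.* entry into the Hadoop Configuration. A sketch, with the host and path values taken as examples from this thread:

```
spark.master                                spark://is-nn-01:8888
spark.hadoop.dfs.client.read.shortcircuit   true
spark.hadoop.dfs.domain.socket.path         /var/run/hdfs-sockets/dn
```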