Getting the following error:
Caused by: org.apache.hadoop.HadoopIllegalArgumentException: The short-circuit local reads feature is enabled but dfs.domain.socket.path is not set.
Looking for a solution.
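For reference, short-circuit local reads are normally enabled by two properties in hdfs-site.xml. This is only a sketch; the socket path shown is a common example value, not one taken from this cluster:

<property>
  <name>dfs.client.read.shortcircuit</name>
  <value>true</value>
</property>
<property>
  <!-- the DataNode creates a Unix domain socket at this path; its parent directory must already exist -->
  <name>dfs.domain.socket.path</name>
  <value>/var/lib/hadoop-hdfs/dn_socket</value>
</property>

The exception means the client sees the feature switched on but cannot resolve any value for dfs.domain.socket.path.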
CodePudding user response:
An ordinary WordCount example, written in IntelliJ IDEA and packaged into a jar, submitted with bin/spark-submit:
bin/spark-submit \
--class WCount \
--master spark://is-nn-01:8888 \
--name wordcountByScala \
--executor-memory 1g \
--total-executor-cores 2 \
/spark_test/WordCount.jar \
hdfs://is-nn-01:8020/spark_test/tst.txt
After submitting, it reports the warning shown above.
CodePudding user response:
It is already set in hdfs-site.xml.
But the path of the dn socket file cannot be opened. Is that related to this?
If it is related, what should a normal dn socket file be? What does it look like?
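One way to check, assuming dfs.domain.socket.path points at /var/lib/hadoop-hdfs/dn_socket (an example path; substitute the value from your hdfs-site.xml):

# list the socket file created by the DataNode
ls -l /var/lib/hadoop-hdfs/dn_socket
# a healthy entry is a Unix domain socket, shown by the leading 's' in the mode, e.g.:
# srw-rw-rw- 1 hdfs hadoop 0 Jan  1 00:00 /var/lib/hadoop-hdfs/dn_socket

The socket only exists while the DataNode is running and cannot be opened like a regular file, so "the path cannot be opened" is not by itself proof that something is wrong.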
CodePudding user response:
Is dfs.domain.socket.path configured on the worker nodes?
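A quick way to verify, with placeholder worker hostnames (is-dn-01 and is-dn-02 are assumptions):

# print the value each node resolves from its own configuration
for h in is-dn-01 is-dn-02; do
  ssh "$h" 'hdfs getconf -confKey dfs.domain.socket.path'
done

An empty result or an error on any node points to a missing or stale hdfs-site.xml on that host.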
CodePudding user response:
It is configured on all nodes.
CodePudding user response:
spark-submit has a parameter --properties-file FILE: "Path to a file from which to load extra properties. If not specified, this will look for conf/spark-defaults.conf." On this machine that file had been renamed to a temp file (effectively commenting it out), so the settings in it were never picked up, which is what caused the problem above.
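A minimal sketch of the fix described above, in shell; the .temp file name is an assumption, and the submit command simply repeats the one from the question:

# restore the default file name so spark-submit finds it automatically
mv $SPARK_HOME/conf/spark-defaults.conf.temp $SPARK_HOME/conf/spark-defaults.conf

# or point spark-submit at the properties file explicitly
bin/spark-submit \
--class WCount \
--master spark://is-nn-01:8888 \
--properties-file $SPARK_HOME/conf/spark-defaults.conf \
--name wordcountByScala \
--executor-memory 1g \
--total-executor-cores 2 \
/spark_test/WordCount.jar \
hdfs://is-nn-01:8020/spark_test/tst.txt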