Very confusing: why must the JAVA_HOME environment variable be configured in hadoop-env.sh?
Both Hadoop and Spark ship with an env configuration file, and we have to write export JAVA_HOME=/XXX/XXX/JDK into hadoop-env.sh before the cluster will start normally. What puzzles me is that JAVA_HOME already exists among our system environment variables, yet if it is not exported in hadoop-env.sh, Hadoop still complains. Why can't Hadoop read the default system environment variable? Why does it have to be set again in hadoop-env.sh? When is hadoop-env.sh executed, and what role does it play?
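For example, the line I end up adding looks something like the sketch below. The JDK path shown is only a placeholder for wherever the JDK happens to be installed on a given node, and in recent Hadoop releases the file usually sits under $HADOOP_HOME/etc/hadoop:

    # hadoop-env.sh
    # Placeholder path -- substitute the actual JDK location on your nodes.
    export JAVA_HOME=/usr/lib/jvm/java-8-openjdk-amd64

Without this line the start-up scripts refuse to launch the daemons, even though echo $JAVA_HOME in my login shell prints the correct path.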