Cannot find or unable to load the main class org.apache.spark.launcher.Main

Time:10-21

Windows environment, Spark 2.4.3 installed. spark-shell and spark-submit both work fine, but starting start-master fails. The log shows:

Error: Could not find or load main class org.apache.spark.launcher.Main
Caused by: java.lang.ClassNotFoundException: org.apache.spark.launcher.Main

Looking under the jars folder, the spark-launcher jar does contain this class, so where is the problem?

Java version: 12.0.4
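As a quick sanity check (a jar is just a zip archive), a small script can confirm the class entry really is in the jar; the jar path in the comment below is an assumption, adjust it to whatever spark-launcher jar sits under jars\.

```python
import zipfile

def jar_contains_class(jar_path: str, class_name: str) -> bool:
    """Return True if the jar (a zip archive) contains the given class.

    class_name is in dotted form, e.g. 'org.apache.spark.launcher.Main',
    which maps to the zip entry 'org/apache/spark/launcher/Main.class'.
    """
    entry = class_name.replace(".", "/") + ".class"
    with zipfile.ZipFile(jar_path) as jar:
        return entry in jar.namelist()

# Hypothetical path; use the actual spark-launcher jar from your jars folder:
# jar_contains_class(r"C:\spark-2.4.3\jars\spark-launcher_2.11-2.4.3.jar",
#                    "org.apache.spark.launcher.Main")
```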

CodePudding user response:

After I created a new system environment variable SPARK_HOME, that problem disappeared, but startup still does not succeed: the launch window closes immediately. The log reads:

Spark Command: C:/Java/jdk-12.0.2\bin\java -cp C:\spark-2.4.3/conf\;C:\spark-2.4.3\jars\* -Xmx1g org.apache.spark.deploy.master.Master --host --port 7077 --webui-port 8080
========================================
C:/Java/jdk-12.0.2\bin\java -cp "C:\spark-2.4.3/conf\;C:\spark-2.4.3\jars\*" -Xmx1g org.apache.spark.deploy.master.Master --host --port 7077 --webui-port 8080
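One thing worth double-checking in that command is the classpath: on Windows, entries are separated by ';' (and the whole argument must be quoted so the shell does not split it), whereas Unix uses ':'. A tiny sketch of how such a classpath string is assembled, using the paths from the log above:

```python
# Windows separates classpath entries with ';'; Unix uses ':'.
# (Python exposes the platform's separator as os.pathsep.)
entries = [r"C:\spark-2.4.3\conf", r"C:\spark-2.4.3\jars\*"]
classpath = ";".join(entries)
print(classpath)  # C:\spark-2.4.3\conf;C:\spark-2.4.3\jars\*
```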

I don't know why the start failed; I'll open it in IntelliJ IDEA and take a look.

CodePudding user response:

The console prompts: hostname: unknown option -- f
So I added a line SPARK_MASTER_HOST=159.226.177.27 to spark-env under conf.
That error is gone.
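For reference, this is the line that was added. Spark reads this file as conf/spark-env.sh on Unix (on Windows the counterpart is conf/spark-env.cmd), so the exact file name here is an assumption:

```shell
# conf/spark-env.sh (file name assumed; fixes "hostname: unknown option -- f"
# by setting the master host explicitly instead of detecting it)
SPARK_MASTER_HOST=159.226.177.27
```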

But there is another error, ps: unknown option -- o. It seems the daemon start script executes a ps command, and the -o option is not recognized.


CodePudding user response:

I did not install Cygwin; could it be that the ps program on Windows does not recognize the -o option when the script executes it?

The problem should be the following line:

if [[ $(ps -p "$TARGET_ID" -o comm=) =~ "java" ]]; then
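That line asks ps for the command name (comm=) of the process whose PID is $TARGET_ID and checks whether it matches "java", i.e. whether the launched JVM is actually alive. A rough Python equivalent of the same check (it assumes a POSIX ps on PATH, which a plain Windows shell does not have — hence the error above):

```python
import subprocess

def pid_runs_java(pid: int) -> bool:
    """Rough equivalent of: [[ $(ps -p "$TARGET_ID" -o comm=) =~ "java" ]]

    Asks ps for the command name of the given PID and checks for "java".
    Requires a POSIX ps; stock Windows lacks one, which is why the daemon
    script fails there without Cygwin/MSYS.
    """
    result = subprocess.run(["ps", "-p", str(pid), "-o", "comm="],
                            capture_output=True, text=True)
    return "java" in result.stdout
```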

Does anyone see what needs to be changed?