NoClassDefFoundError: org/apache/spark/SparkConf

Time:09-18

 

/opt/jdk1.8.0_121/bin/java -javaagent:/opt/idea-IC-171.4249.39/lib/idea_rt.jar=41331:/opt/idea-IC-171.4249.39/bin -Dfile.encoding=UTF-8 -classpath /opt/jdk1.8.0_121/jre/lib/charsets.jar:/opt/jdk1.8.0_121/jre/lib/deploy.jar:/opt/jdk1.8.0_121/jre/lib/ext/cldrdata.jar:/opt/jdk1.8.0_121/jre/lib/ext/dnsns.jar:/opt/jdk1.8.0_121/jre/lib/ext/jaccess.jar:/opt/jdk1.8.0_121/jre/lib/ext/jfxrt.jar:/opt/jdk1.8.0_121/jre/lib/ext/localedata.jar:/opt/jdk1.8.0_121/jre/lib/ext/mysql-connector-java-5.1.40-bin.jar:/opt/jdk1.8.0_121/jre/lib/ext/nashorn.jar:/opt/jdk1.8.0_121/jre/lib/ext/sunec.jar:/opt/jdk1.8.0_121/jre/lib/ext/sunjce_provider.jar:/opt/jdk1.8.0_121/jre/lib/ext/sunpkcs11.jar:/opt/jdk1.8.0_121/jre/lib/ext/zipfs.jar:/opt/jdk1.8.0_121/jre/lib/javaws.jar:/opt/jdk1.8.0_121/jre/lib/jce.jar:/opt/jdk1.8.0_121/jre/lib/jfr.jar:/opt/jdk1.8.0_121/jre/lib/jfxswt.jar:/opt/jdk1.8.0_121/jre/lib/jsse.jar:/opt/jdk1.8.0_121/jre/lib/management-agent.jar:/opt/jdk1.8.0_121/jre/lib/plugin.jar:/opt/jdk1.8.0_121/jre/lib/resources.jar:/opt/jdk1.8.0_121/jre/lib/rt.jar:/home/hadoop/IdeaProjects/Spark_19/target/scala-2.11/classes:/home/hadoop/.ivy2/cache/org.scala-lang/scala-library/jars/scala-library-2.11.0.jar SparkPi
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/spark/SparkConf
	at SparkPi$.main(SparkPi.scala:15)
	at SparkPi.main(SparkPi.scala)
Caused by: java.lang.ClassNotFoundException: org.apache.spark.SparkConf
	at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
	at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:331)
	at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
	... 2 more

Process finished with exit code 1
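Note that the `-classpath` in the launch command above contains only the JDK jars, the project's compiled classes, and `scala-library-2.11.0.jar`; no Spark jars appear anywhere on it, which is exactly what produces the `NoClassDefFoundError` for `org/apache/spark/SparkConf`. One common way around this, sketched below, is to package the project and launch it through `spark-submit`, which puts the Spark distribution's own jars on the classpath automatically (the jar filename here is an illustrative assumption; use whatever name sbt actually produces):

```shell
# Build the application jar with sbt (assumes a standard sbt project layout).
sbt package

# Submit via spark-submit so Spark's jars are placed on the classpath for you.
spark-submit \
  --class SparkPi \
  --master spark://master:7077 \
  target/scala-2.11/spark_19_2.11-1.0.jar
```

Running through `spark-submit` also makes the manual `setJars(...)` call in the code unnecessary for getting the application jar to the executors.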

CodePudding user response:

Code:
 

import org.apache.spark.SparkConf
import org.apache.spark.SparkContext

import scala.collection.mutable.ArrayBuffer
import scala.math.random

object SparkPi {
  def main(args: Array[String]) {
    val jar: String = ""
    val jars = ArrayBuffer[String]()
    jars += jar
    val conf = new SparkConf().setMaster("spark://master:7077").setAppName("spark Pi").setJars(jars)
    val spark = new SparkContext(conf)
    val slices = if (args.length > 0) args(0).toInt else 2
    val n = math.min(100000L * slices, Int.MaxValue).toInt
    val count = spark.parallelize(1 until n, slices).map { i =>
      val x = random * 2 - 1
      val y = random * 2 - 1
      if (x * x + y * y < 1) 1 else 0
    }.reduce(_ + _)
    println("Pi is roughly " + 4.0 * count / n)
    spark.stop()
  }
}

CodePudding user response:

What is causing this?

CodePudding user response:

https://blog.csdn.net/qq_29269907/article/details/83746542

Have a look at this article; it may solve your problem.
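If you want to keep launching the class directly from the IDE, the usual fix is to declare `spark-core` as a project dependency so the Spark jars end up on the run classpath. A minimal `build.sbt` sketch is below; the version numbers are assumptions and should be matched to the Scala and Spark versions your cluster actually runs:

```scala
// build.sbt -- versions are illustrative; match your cluster's Spark/Scala versions
name := "Spark_19"

version := "1.0"

scalaVersion := "2.11.8"

// Pulls in org.apache.spark.SparkConf, SparkContext, etc.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
```

After reimporting the sbt project in IDEA, the `NoClassDefFoundError: org/apache/spark/SparkConf` should go away because `spark-core` and its transitive dependencies are now on the classpath.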