Apache Spark - Quick Start "java.lang.NoClassDefFoundError: scala/Serializable"

I am trying to follow this guide https://spark.apache.org/docs/latest/quick-start.html (Scala). However, I can't complete the last step, where I'm supposed to submit the JAR file to Spark.

# Use spark-submit to run your application
$ YOUR_SPARK_HOME/bin/spark-submit \
  --class "SimpleApp" \
  --master local[4] \
  target/scala-2.12/simple-project_2.12-1.0.jar

I get the following exception:


Exception in thread "main" java.lang.BootstrapMethodError: java.lang.NoClassDefFoundError: scala/Serializable
        at SimpleApp$.main(SimpleApp.scala:9)
        at SimpleApp.main(SimpleApp.scala)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
        at java.lang.reflect.Method.invoke(Unknown Source)
        at org.apache.spark.deploy.JavaMainApplication.start(SparkApplication.scala:52)
        at org.apache.spark.deploy.SparkSubmit.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:958)
        at org.apache.spark.deploy.SparkSubmit.doRunMain$1(SparkSubmit.scala:180)
        at org.apache.spark.deploy.SparkSubmit.submit(SparkSubmit.scala:203)
        at org.apache.spark.deploy.SparkSubmit.doSubmit(SparkSubmit.scala:90)
        at org.apache.spark.deploy.SparkSubmit$$anon$2.doSubmit(SparkSubmit.scala:1046)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:1055)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.NoClassDefFoundError: scala/Serializable
        ... 14 more
Caused by: java.lang.ClassNotFoundException: scala.Serializable
        at java.net.URLClassLoader.findClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        at java.lang.ClassLoader.loadClass(Unknown Source)
        ... 14 more

Any idea what is causing this?

CodePudding user response:

You need to upgrade your dependencies to versions that are binary compatible with your Scala version, which looks to be 2.12 in this case (judging by the artifact name `simple-project_2.12-1.0.jar`).
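For reference, here is a minimal `build.sbt` sketch for the quick-start project (the project name and exact version numbers are illustrative). The key point is the `%%` operator, which appends the Scala binary suffix (`_2.12`) to the artifact name so the Spark dependency stays in sync with `scalaVersion`:

```
// build.sbt -- minimal sketch; name and versions are illustrative.
name := "simple-project"
version := "1.0"

// Must match the Scala binary version your Spark distribution was built with.
scalaVersion := "2.12.18"

// "%%" resolves to spark-sql_2.12, matching scalaVersion above.
libraryDependencies += "org.apache.spark" %% "spark-sql" % "3.5.1" % "provided"
```

If `scalaVersion` and the Scala build of your Spark installation disagree (e.g. a 2.12 JAR submitted to a Spark built against 2.13), you get exactly this kind of `NoClassDefFoundError` for classes that moved or disappeared between Scala versions.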

CodePudding user response:

I found the problem: I had the wrong Spark version installed. I had downloaded the "Pre-built for Apache Hadoop 3.3 and later (Scala 2.13)" package, while my project was built against Scala 2.12. Installing the default "Pre-built for Apache Hadoop 3.3 and later" package (built with Scala 2.12) solved the problem.
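As a quick sanity check, you can see which Scala version a Spark distribution was built against from its version banner (output wording may vary slightly between Spark releases):

```
# Prints Spark's version banner, which includes a line like
# "Using Scala version 2.12.x ..." -- compare this against the
# _2.12 / _2.13 suffix in your JAR's file name.
$YOUR_SPARK_HOME/bin/spark-submit --version
```

If the banner says Scala 2.13 but your JAR is named `simple-project_2.12-1.0.jar`, the two are binary incompatible and you should change one side to match the other.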
