sbt run gets an error when compiling after adding dependencies in Ubuntu

Time:09-17

I have added the following dependencies to build.sbt. After running sbt run in the terminal, I got the error below:

$ sbt run
[info] welcome to sbt 1.5.5 (Private Build Java 1.8.0_292)
[info] loading global plugins from /home/hayat/.sbt/1.0/plugins
[info] loading project definition from /home/hayat/myproject/project
[info] loading settings for project root from build.sbt ...
[info] set current project to scala3-simple (in build file:/home/hayat/myproject/)
[info] Updating 
[info] Resolved  dependencies
[warn] 
[warn]  Note: Unresolved dependencies path:
[error] sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-streaming:3.1.2
[error]   Not found
[error]   Not found
[error]   not found: /home/hayat/.ivy2/local/org.apache.spark/spark-streaming/3.1.2/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/spark/spark-streaming/3.1.2/spark-streaming-3.1.2.pom
[error]     at lmcoursier.CoursierDependencyResolution.unresolvedWarningOrThrow(CoursierDependencyResolution.scala:258)
[error]     at lmcoursier.CoursierDependencyResolution.$anonfun$update$38(CoursierDependencyResolution.scala:227)
[error]     at scala.util.Either$LeftProjection.map(Either.scala:573)
[error]     at lmcoursier.CoursierDependencyResolution.update(CoursierDependencyResolution.scala:227)
[error]     at sbt.librarymanagement.DependencyResolution.update(DependencyResolution.scala:60)
[error]     at sbt.internal.LibraryManagement$.resolve$1(LibraryManagement.scala:59)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$12(LibraryManagement.scala:133)
[error]     at sbt.util.Tracked$.$anonfun$lastOutput$1(Tracked.scala:73)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$20(LibraryManagement.scala:146)
[error]     at scala.util.control.Exception$Catch.apply(Exception.scala:228)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11(LibraryManagement.scala:146)
[error]     at sbt.internal.LibraryManagement$.$anonfun$cachedUpdate$11$adapted(LibraryManagement.scala:127)
[error]     at sbt.util.Tracked$.$anonfun$inputChangedW$1(Tracked.scala:219)
[error]     at sbt.internal.LibraryManagement$.cachedUpdate(LibraryManagement.scala:160)
[error]     at sbt.Classpaths$.$anonfun$updateTask0$1(Defaults.scala:3678)
[error]     at scala.Function1.$anonfun$compose$1(Function1.scala:49)
[error]     at sbt.internal.util.$tilde$greater.$anonfun$$u2219$1(TypeFunctions.scala:62)
[error]     at sbt.std.Transform$$anon$4.work(Transform.scala:68)
[error]     at sbt.Execute.$anonfun$submit$2(Execute.scala:282)
[error]     at sbt.internal.util.ErrorHandling$.wideConvert(ErrorHandling.scala:23)
[error]     at sbt.Execute.work(Execute.scala:291)
[error]     at sbt.Execute.$anonfun$submit$1(Execute.scala:282)
[error]     at sbt.ConcurrentRestrictions$$anon$4.$anonfun$submitValid$1(ConcurrentRestrictions.scala:265)
[error]     at sbt.CompletionService$$anon$2.call(CompletionService.scala:64)
[error]     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]     at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
[error]     at java.util.concurrent.FutureTask.run(FutureTask.java:266)
[error]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
[error]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
[error]     at java.lang.Thread.run(Thread.java:748)
[error] (update) sbt.librarymanagement.ResolveException: Error downloading org.apache.spark:spark-streaming:3.1.2
[error]   Not found
[error]   Not found
[error]   not found: /home/hayat/.ivy2/local/org.apache.spark/spark-streaming/3.1.2/ivys/ivy.xml
[error]   not found: https://repo1.maven.org/maven2/org/apache/spark/spark-streaming/3.1.2/spark-streaming-3.1.2.pom
[error] Total time: 7 s, completed Sep 16, 2021 11:21:30 AM

Here is build.sbt:

val scala3Version = "3.0.2"

lazy val root = project
  .in(file("."))
  .settings(
    name := "scala3-simple",
    version := "0.1.0",

    scalaVersion := scala3Version,

    libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test",
    libraryDependencies += "org.apache.spark" % "spark-streaming" % "3.1.2",
    libraryDependencies += "org.apache.spark" % "spark-core" % "3.1.2"
  )
  1. Scala version: 3.0.2

  2. sbt version: 1.5.5

CodePudding user response:

The libraries spark-streaming and spark-core don't exist; they are spark-streaming_2.12 and spark-core_2.12, where 2.12 is the Scala binary version. Currently there are no Spark artifacts published for Scala 3.

So to solve your issue, you need to:

  • downgrade your Scala version from 3.0.2 to 2.12.x (the latest is currently 2.12.15), as there is no Spark release for Scala 3
  • use the spark-streaming_2.12 library instead of spark-streaming
  • use the spark-core_2.12 library instead of spark-core

To use the _2.12 version of the libraries, you can either add _2.12 to the library name yourself:

libraryDependencies += "org.apache.spark" % "spark-streaming_2.12" % "3.1.2",
libraryDependencies += "org.apache.spark" % "spark-core_2.12" % "3.1.2"

or, better, use %% between the group and the library name, which appends the Scala binary version to the library name automatically:

libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.1.2",
libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"
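Mechanically, %% does little more than append an underscore and the project's Scala binary version to the artifact name before resolution. A minimal sketch of that name mangling (CrossVersionDemo and crossArtifact are hypothetical names for illustration, not part of sbt's API):

```scala
// Sketch of what sbt's %% operator effectively does to an artifact
// name: suffix it with the Scala binary version before resolving.
object CrossVersionDemo {
  def crossArtifact(name: String, scalaBinaryVersion: String): String =
    s"${name}_$scalaBinaryVersion"

  def main(args: Array[String]): Unit = {
    // With scalaVersion := "2.12.15", the binary version is "2.12".
    println(crossArtifact("spark-streaming", "2.12")) // spark-streaming_2.12
    println(crossArtifact("spark-core", "2.12"))      // spark-core_2.12
  }
}
```

This is why plain % with "spark-streaming" fails on Scala 2.12 too: no suffix is added, and no artifact named spark-streaming (without a suffix) is published.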

So your build.sbt should become:

val scala2Version = "2.12.15"

lazy val root = project
  .in(file("."))
  .settings(
    name := "scala2-simple",
    version := "0.1.0",

    scalaVersion := scala2Version,

    libraryDependencies += "com.novocode" % "junit-interface" % "0.11" % "test",
    libraryDependencies += "org.apache.spark" %% "spark-streaming" % "3.1.2",
    libraryDependencies += "org.apache.spark" %% "spark-core" % "3.1.2"
  )
