SBT gives error when importing Spark's dependencies


I'm new to Spark and this is my first test project. I followed a tutorial where everything worked, but when I tried to reproduce it on my machine it failed: I get errors while building the project. These are my dependencies:

name := "spark"

version := "0.1"

scalaVersion := "2.12.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)

While importing the dependencies I get this error:

[error] sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error]     at sbt.internal.librarymanagement.IvyActions$.resolveAndRetrieve(IvyActions.scala:332)
[error]     at sbt.internal.librarymanagement.IvyActions$.$anonfun$updateEither$1(IvyActions.scala:208)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.$anonfun$withModule$1(Ivy.scala:239)
[error]     at sbt.internal.librarymanagement.IvySbt.$anonfun$withIvy$1(Ivy.scala:204)
[error]     at sbt.internal.librarymanagement.IvySbt.sbt$internal$librarymanagement$IvySbt$$action$1(Ivy.scala:70)
[error]     at sbt.internal.librarymanagement.IvySbt$$anon$3.call(Ivy.scala:77)
[error]     at xsbt.boot.Locks$GlobalLock.withChannel$1(Locks.scala:113)
[error]     at xsbt.boot.Locks$GlobalLock.withChannelRetries$1(Locks.scala:91)
[error]     at xsbt.boot.Locks$GlobalLock.$anonfun$withFileLock$1(Locks.scala:119)
[error]     at xsbt.boot.Using$.withResource(Using.scala:12)
[error]     at xsbt.boot.Using$.apply(Using.scala:9)
[error]     at xsbt.boot.Locks$GlobalLock.withFileLock(Locks.scala:119)
[error]     at xsbt.boot.Locks$GlobalLock.ignoringDeadlockAvoided(Locks.scala:71)
[error]     at xsbt.boot.Locks$GlobalLock.withLock(Locks.scala:59)
[error]     at xsbt.boot.Locks$.apply0(Locks.scala:47)
[error]     at xsbt.boot.Locks$.apply(Locks.scala:36)
[error]     at sbt.internal.librarymanagement.IvySbt.withDefaultLogger(Ivy.scala:77)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:199)
[error]     at sbt.internal.librarymanagement.IvySbt.withIvy(Ivy.scala:196)
[error]     at sbt.internal.librarymanagement.IvySbt$Module.withModule(Ivy.scala:238)
[error]     at sbt.internal.librarymanagement.IvyActions$.updateEither(IvyActions.scala:193)
[error]     at 

...
...
...

[error] (update) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] (ssExtractDependencies) sbt.librarymanagement.ResolveException: unresolved dependency: org.apache.spark#spark-core_2.12;2.3.3: not found
[error] unresolved dependency: org.apache.spark#spark-sql_2.12;2.3.3: not found
[error] Total time: 1 s, completed 18-Sep-2021 10:33:42
[info] shutting down server

CodePudding user response:

It looks like Spark 2.3 needs a compatible Scala version; try using 2.11.x as the Scala version (see the corrected build.sbt sketch below).

Source: [Spark docs](https://spark.apache.org/docs/2.3.0/)

"Spark runs on Java 8+, Python 2.7+/3.4+ and R 3.1+. For the Scala API, Spark 2.3.0 uses Scala 2.11. You will need to use a compatible Scala version (2.11.x).

Note that support for Java 7, Python 2.6 and old Hadoop versions before 2.6.5 were removed as of Spark 2.2.0. Support for Scala 2.10 was removed as of 2.3.0."
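
For reference, here is a minimal build.sbt sketch that should resolve if you stay on Spark 2.3.3. The only change from your file is the Scala version; 2.11.12 is one 2.11.x release that works, any other 2.11.x should too:

name := "spark"

version := "0.1"

// Spark 2.3.x publishes artifacts only for Scala 2.11,
// so the project must be pinned to a 2.11.x release.
scalaVersion := "2.11.12"

libraryDependencies ++= Seq(
  // %% appends the Scala binary suffix automatically, so these
  // resolve to spark-core_2.11 and spark-sql_2.11 on Maven Central.
  "org.apache.spark" %% "spark-core" % "2.3.3",
  "org.apache.spark" %% "spark-sql" % "2.3.3"
)

Alternatively, if you want to keep Scala 2.12, bump Spark instead of downgrading Scala: Spark starts publishing _2.12 artifacts with the 2.4.x line.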
