While trying to connect to MongoDB got exception Class ConnectionString not found


I am trying to connect to MongoDB to write a collection. The Spark session is created correctly, but when I try to insert the data into Mongo I get an error on this line:

MongoSpark.write(stringPeriodosDataframe).option("collection", "periodos").mode("overwrite").save();
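
For context, the write happens in code roughly like this (a minimal, self-contained sketch; the class name, master setting, connection URI and the placeholder DataFrame are illustrative, not my real code):

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;
    import com.mongodb.spark.MongoSpark;

    public class CargarPeriodosSketch {
        public static void main(String[] args) {
            // Spark session configured with the MongoDB output URI
            // (URI, database and collection names here are placeholders)
            SparkSession spark = SparkSession.builder()
                    .appName("CargarPeriodos")
                    .master("local[*]")
                    .config("spark.mongodb.output.uri", "mongodb://127.0.0.1/mydb.periodos")
                    .getOrCreate();

            // Placeholder DataFrame standing in for stringPeriodosDataframe
            Dataset<Row> stringPeriodosDataframe = spark.sql("SELECT '201901' AS periodo");

            // The write that throws NoClassDefFoundError: com/mongodb/ConnectionString
            MongoSpark.write(stringPeriodosDataframe)
                    .option("collection", "periodos")
                    .mode("overwrite")
                    .save();

            spark.stop();
        }
    }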

The exception I get:

Exception in thread "main" java.lang.NoClassDefFoundError: com/mongodb/ConnectionString
    at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
    at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
    at scala.util.Try$.apply(Try.scala:192)
    at com.mongodb.spark.config.MongoCompanionConfig$class.connectionString(MongoCompanionConfig.scala:278)
    at com.mongodb.spark.config.WriteConfig$.connectionString(WriteConfig.scala:37)
    at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:209)
    at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:37)
    at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:124)
    at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:37)
    at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:113)
    at com.mongodb.spark.config.WriteConfig$.apply(WriteConfig.scala:37)
    at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:81)
    at org.apache.spark.sql.execution.datasources.SaveIntoDataSourceCommand.run(SaveIntoDataSourceCommand.scala:46)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
    at org.apache.spark.sql.execution.command.ExecutedCommandExec.doExecute(commands.scala:86)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:131)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$execute$1.apply(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.SparkPlan$$anonfun$executeQuery$1.apply(SparkPlan.scala:155)
    at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
    at org.apache.spark.sql.execution.SparkPlan.executeQuery(SparkPlan.scala:152)
    at org.apache.spark.sql.execution.SparkPlan.execute(SparkPlan.scala:127)
    at org.apache.spark.sql.execution.QueryExecution.toRdd$lzycompute(QueryExecution.scala:80)
    at org.apache.spark.sql.execution.QueryExecution.toRdd(QueryExecution.scala:80)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter$$anonfun$runCommand$1.apply(DataFrameWriter.scala:654)
    at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
    at org.apache.spark.sql.DataFrameWriter.runCommand(DataFrameWriter.scala:654)
    at org.apache.spark.sql.DataFrameWriter.saveToV1Source(DataFrameWriter.scala:273)
    at org.apache.spark.sql.DataFrameWriter.save(DataFrameWriter.scala:267)
    at com.santander.espana.CargarPeriodos.insertCollectionPeriodoDataframe(CargarPeriodos.java:117)
    at com.santander.espana.CargarPeriodos.build(CargarPeriodos.java:71)
    at com.santander.espana.MainCargaPeriodos.init(MainCargaPeriodos.java:38)
    at com.santander.espana.Prueba.main(Prueba.java:50)
Caused by: java.lang.ClassNotFoundException: com.mongodb.ConnectionString
    at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
    ... 34 more
Disconnected from the target VM, address: '127.0.0.1:60130', transport: 'socket'

Process finished with exit code 1

Versions:

mongo-spark-connector_2.11-2.3.0
Java 1.8
IntelliJ IDEA 2021.1.2 Community Edition
Spark libraries built for Scala 2.11

Other dependency versions I am using:

Hadoop 2.7
Spark 2.3.0
MongoDB Java driver 2.7
spark-catalyst, spark-core, spark-hive, spark-sql (all 2.11:2.3.0)
scala-library 2.11.12

MongoDB 4.0.16

I'm stuck on this; any help is more than welcome.

Thanks!

CodePudding user response:

In the end, the solution provided here works: mongodb spark connector issue

I used the latest version: mongo-java-driver-3.12.10
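
For reference, assuming a Maven build (adjust the syntax if you use Gradle or sbt), the dependency that pulls in that driver looks like this:

    <dependency>
        <groupId>org.mongodb</groupId>
        <artifactId>mongo-java-driver</artifactId>
        <version>3.12.10</version>
    </dependency>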
