Apache Spark Pool Mongodb connector


I have been trying to read from and write to a MongoDB Atlas server with Synapse Spark pools. I have tried PyMongo, but I am more interested in using the MongoDB Spark connector. However, the install procedure uses this command:

./bin/pyspark --conf "spark.mongodb.input.uri=mongodb://127.0.0.1/test.myCollection?readPreference=primaryPreferred" \
              --conf "spark.mongodb.output.uri=mongodb://127.0.0.1/test.myCollection" \
              --packages org.mongodb.spark:mongo-spark-connector_2.12:3.0.1

The problem I'm facing is that Synapse Spark pools allow for Spark session configuration, but not for the --packages option or use of the Spark shell. How can I accomplish this installation inside a Spark pool?

CodePudding user response:

This can be solved by installing the jar directly as a workspace package. Download the connector jar, upload it to the Synapse workspace as a workspace package, and then attach that package to the Spark pool.
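Once the jar is attached to the pool, you can pass the connection URI per read/write instead of relying on --conf flags. The following is a minimal sketch for a Synapse notebook, assuming the mongo-spark-connector 3.0.x jar is installed as described above; the user, password, cluster, database, and collection names are placeholders for your own Atlas details.

# Minimal sketch, assuming the mongo-spark-connector 3.0.x jar is attached
# to the Spark pool as a workspace package. The URI, database, and collection
# names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Read a collection into a DataFrame; the connector infers the schema.
df = (spark.read
      .format("mongo")  # alias for com.mongodb.spark.sql.DefaultSource
      .option("uri", "mongodb+srv://<user>:<password>@<cluster>/test.myCollection")
      .load())

df.show()

# Write the DataFrame back to another collection.
(df.write
   .format("mongo")
   .mode("append")
   .option("uri", "mongodb+srv://<user>:<password>@<cluster>/test.outCollection")
   .save())

Passing the uri as an option on each read/write avoids having to set spark.mongodb.input.uri and spark.mongodb.output.uri at session startup, which is harder to do in a managed Spark pool.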
