The spark mongo problem

Time:10-08

SparkSession spark = sparkMongoDao.getSession();
JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());
Map<String, String> readOverrides = new HashMap<>();
readOverrides.put("collection", "stock_basic_info1");
readOverrides.put("readPreference.name", "secondaryPreferred");
ReadConfig readConfig = ReadConfig.create(jsc).withOptions(readOverrides);
Dataset<Row> implicitDS = MongoSpark.load(jsc, readConfig).toDF();
implicitDS.printSchema();
implicitDS.show();


I wrote the code above, but the line `ReadConfig readConfig = ReadConfig.create(jsc).withOptions(readOverrides);` is flagged with the error: "Ambiguous method call. Both create(JavaSparkContext) in ReadConfig and create(JavaSparkContext) in ReadConfig match." Has anyone run into this?
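This "ambiguous method call" is commonly an IDE inspection artifact of Scala-Java interop: the MongoDB Spark connector's ReadConfig companion object inherits its create overloads from a Scala trait with a generic return type, so IntelliJ can see two apparently identical create(JavaSparkContext) signatures even though the code may compile fine under Maven or Gradle. One possible workaround, sketched below under the assumption that you are on the com.mongodb.spark connector (mongo-spark-connector) with ReadConfig.create(SparkContext) available, is to pass the underlying Scala SparkContext via jsc.sc() instead of the JavaSparkContext, which avoids the overload the IDE flags. This is a sketch, not a confirmed fix; it also assumes spark.mongodb.input.uri is already set on the session.

```java
import java.util.HashMap;
import java.util.Map;

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import com.mongodb.spark.MongoSpark;
import com.mongodb.spark.config.ReadConfig;

public class MongoReadSketch {
    public static void main(String[] args) {
        // Assumption: spark.mongodb.input.uri is configured on this session
        // (e.g. via spark-submit --conf or SparkSession.builder().config(...)).
        SparkSession spark = SparkSession.builder()
                .appName("mongo-read-sketch")
                .getOrCreate();
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        Map<String, String> readOverrides = new HashMap<>();
        readOverrides.put("collection", "stock_basic_info1");
        readOverrides.put("readPreference.name", "secondaryPreferred");

        // Workaround sketch: call create(...) with the underlying Scala
        // SparkContext (jsc.sc()) rather than the JavaSparkContext, which
        // sidesteps the overload IntelliJ reports as ambiguous.
        ReadConfig readConfig = ReadConfig.create(jsc.sc()).withOptions(readOverrides);

        Dataset<Row> implicitDS = MongoSpark.load(jsc, readConfig).toDF();
        implicitDS.printSchema();
        implicitDS.show();
    }
}
```

If the project builds cleanly from the command line (mvn compile or gradle build) despite the red underline, another option is to leave the original create(jsc) call as-is and treat the warning as an IDE false positive.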