Cannot modify the value of a Spark config: spark.executor.instances


I am using Spark 3.0 and I am setting configuration parameters at runtime.

My parameters:

spark.conf.set("fs.s3a.impl", "org.apache.hadoop.fs.s3a.S3AFileSystem")
spark.conf.set("fs.s3a.fast.upload.buffer", "bytebuffer")
spark.conf.set("spark.sql.files.maxPartitionBytes",134217728)
spark.conf.set("spark.executor.instances", 4)
spark.conf.set("spark.executor.memory", 3) 

Error:

pyspark.sql.utils.AnalysisException: Cannot modify the value of a Spark config: spark.executor.instances

I do NOT want to pass these through spark-submit, as this is a pytest case that I am writing.

How do I get around this?

CodePudding user response:

You can try adding those options to the PYSPARK_SUBMIT_ARGS environment variable before initializing the SparkContext. Its syntax is similar to spark-submit's.
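
For example, a minimal sketch of that idea in a test setup (the master URL, app name, and the "3g" memory value are illustrative assumptions, not from the question):

import os

# Must be set before any SparkContext is created; PySpark's launcher
# requires the trailing "pyspark-shell" token.
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--conf spark.executor.instances=4 "
    "--conf spark.executor.memory=3g "
    "pyspark-shell"
)

from pyspark.sql import SparkSession

spark = (
    SparkSession.builder
    .master("local[*]")        # illustrative; use your test master URL
    .appName("pytest-spark")   # hypothetical app name
    .getOrCreate()
)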

CodePudding user response:

According to the official Spark documentation, the spark.executor.instances property may not take effect when set programmatically through SparkConf at runtime, so it is recommended to set it through a configuration file or spark-submit command-line options.

Spark properties can mainly be divided into two kinds:

  • Deploy-related properties, like "spark.driver.memory" and "spark.executor.instances". These may not take effect when set programmatically through SparkConf at runtime, or their behavior depends on which cluster manager and deploy mode you choose, so it is suggested to set them through a configuration file or spark-submit command-line options.
  • Runtime-control properties, like "spark.task.maxFailures". These can be set either way, including via spark.conf.set on a live session.
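
For a pytest case, this usually means supplying the deploy-related properties when the session is first built, before any SparkContext exists, instead of calling spark.conf.set on a live session. A minimal sketch (the master URL is an assumption, "3g" is an assumed unit for the memory value, and spark.executor.instances only matters on cluster managers that honor it, not in local mode):

from pyspark.sql import SparkSession

# Deploy-related properties must be provided before the context starts.
spark = (
    SparkSession.builder
    .master("local[*]")                                   # illustrative
    .config("spark.executor.instances", "4")
    .config("spark.executor.memory", "3g")                # assumed unit
    .config("spark.hadoop.fs.s3a.fast.upload.buffer", "bytebuffer")
    .getOrCreate()
)

# Runtime-control properties can still be changed afterwards:
spark.conf.set("spark.sql.files.maxPartitionBytes", 134217728)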
