Where to modify spark-defaults.conf if I installed pyspark via pip install pyspark


I installed pyspark 3.2.0 with pip install pyspark, inside a conda environment named pyspark. I cannot find spark-defaults.conf anywhere. I have been looking under ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark, since my understanding is that this is what SPARK_HOME should be.

  1. Where can I find spark-defaults.conf? I want to modify it
  2. Am I right in setting SPARK_HOME to the installation location of pyspark, ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark? (See the check below for how I found that path.)
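
For reference, this is a quick way to print the package directory from inside the active conda environment; it should match the site-packages path above:

    import os
    import pyspark

    # Print the directory pyspark was installed into; for a pip install this
    # is the path typically used as SPARK_HOME.
    print(os.path.dirname(pyspark.__file__))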

CodePudding user response:

1. With a pip install, the $SPARK_HOME/conf directory does not exist by default. Create it manually, copy the configuration file templates into it, and edit each file as needed (a sketch of this is shown below).

2. Yes, the SPARK_HOME environment variable is configured correctly; for a pip install it points at the pyspark package directory under site-packages, as in your path.
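
A minimal sketch of those two steps, assuming SPARK_HOME points at the pip-installed pyspark package; the property values written here are illustrative placeholders, not recommendations:

    import os
    import pyspark

    # Assumption: SPARK_HOME points at the pip-installed pyspark package,
    # e.g. ~/miniconda3/envs/pyspark/lib/python3.9/site-packages/pyspark.
    # Fall back to the package directory if the variable is not set.
    spark_home = os.environ.get("SPARK_HOME", os.path.dirname(pyspark.__file__))

    # Step 1: create $SPARK_HOME/conf, which pip does not create for you.
    conf_dir = os.path.join(spark_home, "conf")
    os.makedirs(conf_dir, exist_ok=True)

    # Step 2: write (or copy a template into) spark-defaults.conf.
    defaults_path = os.path.join(conf_dir, "spark-defaults.conf")
    with open(defaults_path, "w") as f:
        f.write("spark.driver.memory 2g\n")
        f.write("spark.sql.shuffle.partitions 64\n")

    print("Wrote", defaults_path)

These defaults should then be picked up the next time a Spark session is launched from this environment.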
