Passing arguments to Python file that is run by PySpark

This thread showed how to run a Python script file with pyspark. Specifically, this is the command I am using:

% pyspark < script.py

I want to pass an argument (a config file) to this script.py. Normally, running with Python alone, this would work:

% python script.py conf.ini

But with pyspark:

% pyspark < script.py conf.ini

I get the following error message:

Error: pyspark does not support any application options.

Is it possible to do this with pyspark?

CodePudding user response:

Answering this to get it off the unanswered queue. Use spark-submit in combination with sys.argv to get the input:

spark-submit script.py conf.ini
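
For completeness, here is a minimal sketch of what script.py could look like, assuming the config file is an INI file read with Python's standard configparser (the section and key names shown in the comments are hypothetical):

import sys
import configparser

from pyspark.sql import SparkSession

if __name__ == "__main__":
    # The first positional argument after the script name is the config path
    # passed on the spark-submit command line.
    conf_path = sys.argv[1]

    # Parse the INI file with the standard library parser.
    config = configparser.ConfigParser()
    config.read(conf_path)

    spark = SparkSession.builder.appName("script").getOrCreate()

    # Hypothetical example: use a value from the config to locate input data.
    # input_path = config["paths"]["input"]
    # df = spark.read.csv(input_path, header=True)

    spark.stop()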