In Apache Spark 3.3.0 I want to install Prophet to use it with pyspark.
$ pyspark --version
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/
Using Scala version 2.13.8, OpenJDK 64-Bit Server VM, 1.8.0_332
Branch HEAD
Compiled by user ubuntu on 2022-06-09T18:15:33Z
Revision f74867bddfbcdd4d08076db36851e88b15e66556
Url https://github.com/apache/spark
Type --help for more information.
and I have
$ python --version
Python 2.7.18
I'm very new to Python. I have tried:
1)
>>> from fbprophet import Prophet
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'fbprophet'
>>> pip install Prophet
File "<stdin>", line 1
pip install Prophet
^
SyntaxError: invalid syntax
>>>
>>> from prophet import Prophet
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
ModuleNotFoundError: No module named 'prophet'
>>>
$ pip install prophet
-bash: pip: command not found
So the question is: how do I install prophet in Apache Spark so I can use it with pyspark?
CodePudding user response:
You have to run
pip install prophet
in a terminal (outside Apache Spark), not inside the Python/pyspark shell.
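If the pip executable itself is not on your PATH (as the "-bash: pip: command not found" error above suggests), a common workaround is to invoke pip as a module of the interpreter. Note that recent prophet releases require Python 3, so the python3 executable is assumed here:
$ python3 -m pip install prophet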
Once prophet is installed, you can import it in your Python code:
from prophet import Prophet
fbprophet is the old name of the package; unless you specifically want an old release, use prophet instead.
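To actually use it from pyspark, the package has to be importable by the Python interpreter Spark runs, on the driver and on every worker node. Below is a minimal sketch (not from the question) that fits one Prophet model per group with applyInPandas; the store/ds/y column names and the toy data are made-up placeholders, and prophet is assumed to be installed everywhere:

from pyspark.sql import SparkSession
from prophet import Prophet
import pandas as pd

spark = SparkSession.builder.getOrCreate()

# Hypothetical input: one time series per store, with columns ds (date) and y (value)
sdf = spark.createDataFrame(
    pd.DataFrame({
        "store": ["A"] * 30,
        "ds": pd.date_range("2022-01-01", periods=30),
        "y": range(30),
    })
)

def forecast(pdf: pd.DataFrame) -> pd.DataFrame:
    # Fit one Prophet model per group and return a 10-day forecast
    m = Prophet()
    m.fit(pdf[["ds", "y"]])
    future = m.make_future_dataframe(periods=10)
    out = m.predict(future)[["ds", "yhat"]]
    out["store"] = pdf["store"].iloc[0]
    return out

result = sdf.groupBy("store").applyInPandas(
    forecast, schema="ds timestamp, yhat double, store string"
)
result.show()

Each group is handed to the forecast function as a pandas DataFrame, so Prophet runs as ordinary single-node Python inside each task while Spark parallelizes across groups.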