How to install pyspark.pandas in Apache Spark?

Time:12-02

I downloaded the Apache Spark 3.3.0 bundle, which contains pyspark:

$ pyspark

Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /__ / .__/\_,_/_/ /_/\_\   version 3.3.0
      /_/

Using Python version 3.7.10 (default, Jun  3 2021 00:02:01)
Spark context Web UI available at http://XXX-XXX-XXX-XXXX.compute.internal:4041
Spark context available as 'sc' (master = local[*], app id = local-1669908157343).
SparkSession available as 'spark'.
>>> import pyspark.pandas as ps
Traceback (most recent call last):
  File "/home/ec2-user/bin/spark/latest/python/pyspark/sql/pandas/utils.py", line 27, in require_minimum_pandas_version
    import pandas
ModuleNotFoundError: No module named 'pandas'

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/home/ec2-user/bin/spark/latest/python/pyspark/pandas/__init__.py", line 31, in <module>
    require_minimum_pandas_version()
  File "/home/ec2-user/bin/spark/latest/python/pyspark/sql/pandas/utils.py", line 36, in require_minimum_pandas_version
    ) from raised_error
ImportError: Pandas >= 1.0.5 must be installed; however, it was not found.

How do I import Python packages inside Apache Spark when it is installed in a custom directory like /home/ec2-user/bin/spark/latest/python/pyspark?

I also tried:

$ pip install pandas
-bash: pip: command not found

If I try to install pip, how can I ensure the libraries are compatible with the Python version 3.7.10 that Spark uses?

CodePudding user response:

Have you tried installing Pandas in the following way:

pip install pyspark[pandas_on_spark]
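That extra is meant to pull in the optional dependencies pyspark.pandas needs (pandas, pyarrow, numpy). Once it finishes, a quick sanity check from the shell, assuming pip installed into the same Python 3.7 interpreter that pyspark launches:

$ python3 -c "import pandas, pyarrow; print(pandas.__version__, pyarrow.__version__)"

If that prints two version numbers, the import pyspark.pandas as ps line from the question should work.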

If pip is not discoverable by bash, try activating your Python environment first (whether virtualenv, conda, or anything else).
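If there is no environment to activate and pip is missing entirely, another option (a sketch, assuming python3 resolves to the same Python 3.7 interpreter shown in the pyspark banner) is to bootstrap pip with the standard-library ensurepip module and install the packages for that exact interpreter, which keeps them compatible with its Python version:

$ python3 -m ensurepip --upgrade --user
$ python3 -m pip install --user "pandas>=1.0.5" pyarrow
$ export PYSPARK_PYTHON=python3

The last line is only needed if several Python installations exist on the machine; PYSPARK_PYTHON tells Spark which interpreter its Python workers should use.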
