Running a single spark for all pytest cases

Time: 06-18

I have a folder structure as below:

/test
----test_abc.py
----test_bcd.py
----test_cde.py
----conftest.py

conftest.py contains all the Spark initialization items, and each pytest file uses these fixtures.
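For reference, a conftest.py along these lines gives every test file access to one shared session. This is a minimal sketch, assuming pyspark is installed; the fixture name `spark` and the builder options (`local[2]`, the app name) are illustrative assumptions, not the asker's actual code:

```python
# conftest.py -- hypothetical sketch of a shared Spark fixture
import pytest


@pytest.fixture(scope="session")
def spark():
    # Import inside the fixture so test collection still works on
    # machines where pyspark is not installed.
    from pyspark.sql import SparkSession

    # scope="session" means this runs once per pytest invocation,
    # so every test that requests `spark` reuses the same session.
    session = (
        SparkSession.builder
        .master("local[2]")
        .appName("pytest-spark")
        .getOrCreate()
    )
    yield session
    session.stop()
```

A test file would then simply declare the fixture as a parameter, e.g. `def test_count(spark): ...`, and pytest injects the shared session.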

To execute all the pytest files, I am writing a shell script that calls each test as shown below. Is this the correct way? I think that if I call these files individually like this, a new Spark session will be initiated on every invocation. Is my assumption right? Can I use the same Spark session for all pytests?

bashscript.sh

pytest ./tests/test_abc.py --emr 
pytest ./tests/test_bcd.py --emr 
pytest ./tests/test_cde.py --emr 

CodePudding user response:

If you want to create a single pytest session but only call a few files, you can pass them in as positional arguments to pytest:

pytest ./tests/test_abc.py ./tests/test_bcd.py ./tests/test_cde.py --emr

This way, pytest runs a single session over all three files, and session-scoped fixtures will only be created once.
