I'm confused about how best to manage the following situation, where multiple modules within a Python package have the same dependencies. Say I have a package `foo` whose file structure is

```
foo/
    __init__.py
    bar.py
    baz.py
```

and both `bar.py` and `baz.py` require some other package, say `external_pkg`, so that they read
```python
# foo/bar.py
import external_pkg

def bar_fn(x):
    ...  # do something with external_pkg
```

and

```python
# foo/baz.py
import external_pkg

def baz_fn(x):
    ...  # do something with external_pkg
```
Question: Is there a way to refactor the package so that the `import external_pkg` line appears once, in `__init__.py`, rather than being repeated in each module? Naively removing `import external_pkg` from the modules and placing it in `__init__.py` leads to `NameError: name 'external_pkg' is not defined` errors.
This is obviously not a big problem in the example above, but I'm trying to write a small package with more modules and many more common imports. I'm aware this might also just be a package-design problem on my end. I have tried to Google this, but can't seem to find the right combination of terms to get a helpful answer.
CodePudding user response:
`foo/__init__.py`:

```python
import numpy as np
```

`foo/bar.py`:

```python
from . import *  # or: from foo import *

print(np.zeros(10))
```
Running (you need to be outside the `foo` directory for this to work):

```
$ python -m foo.bar
```

Output:

```
[0. 0. 0. 0. 0. 0. 0. 0. 0. 0.]
```
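As a minimal sketch of an alternative that avoids star imports: collect the shared imports in one helper module (here called `foo/_common.py` — that filename is my choice, not from the question) and pull the needed names into each submodule with a single `from ._common import ...` line. The script below builds the package in a temporary directory so it is runnable end to end, and uses `math` as a stand-in for the real `external_pkg`.

```python
import os
import sys
import tempfile
import textwrap

# Build the package on disk so the example is self-contained.
root = tempfile.mkdtemp()
pkg = os.path.join(root, "foo")
os.makedirs(pkg)

files = {
    "__init__.py": "",
    # One place for the imports every module needs.
    "_common.py": "import math as external_pkg  # stand-in for the real external_pkg\n",
    # Each submodule pulls in the shared names with one explicit line.
    "bar.py": textwrap.dedent("""\
        from ._common import external_pkg

        def bar_fn(x):
            return external_pkg.sqrt(x)
    """),
    "baz.py": textwrap.dedent("""\
        from ._common import external_pkg

        def baz_fn(x):
            return external_pkg.floor(x)
    """),
}
for name, text in files.items():
    with open(os.path.join(pkg, name), "w") as f:
        f.write(text)

# Make the temporary directory importable, then use the package.
sys.path.insert(0, root)
from foo.bar import bar_fn
from foo.baz import baz_fn

print(bar_fn(9.0))  # 3.0
print(baz_fn(2.7))  # 2
```

Compared to `from . import *`, this keeps the imports in one place while staying explicit about which names each module actually uses, so linters and readers can still see where `external_pkg` comes from.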