Docker FileNotFoundError: [Errno 2] No such file or directory

Time: 12-08

I'm very new to Docker. I'm using Docker and pandas to read a CSV file and insert it into a database.

Dockerfile:

FROM python:3.8

ADD main.py .

RUN pip install requests pandas sqlalchemy

CMD ["python","./main.py"]

main.py:

# Imports

import pandas as pd
from sqlalchemy import create_engine

# This CSV doesn't have a header so pass
# column names as an argument
columns = [
    "open",
    "high",
    "low",
    "close",
    "volume",
    "datetime",
]

# Load in the data
df = pd.read_csv(r"/Users/myname/Downloads/file511282022.csv",names=columns)

# Instantiate sqlalchemy.create_engine object
engine = create_engine('postgresql://postgres:mypassword@localhost:5432/postgres')

# Save the data from dataframe to
# postgres table "iris_dataset"
df.to_sql(
    'file5', 
    engine,
#     index=False # Not copying over the index
)

error:

FileNotFoundError: [Errno 2] No such file or directory: '/Users/myname/Downloads/file511282022.csv'

The pd.read_csv call works outside Docker, in a Jupyter notebook, with the same code:

pd.read_csv(r"/Users/myname/Downloads/file511282022.csv",names=columns)

CodePudding user response:

You need to add the CSV file to the Docker image as well. A container has its own filesystem, so it cannot access files on your local machine (unless you mount them in the docker run command). Add the file to the image the same way as main.py:

FROM python:3.8

ADD main.py .
ADD file511282022.csv .

RUN pip install requests pandas sqlalchemy

CMD ["python","./main.py"]
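With the Dockerfile above, you build and run the image as usual. Alternatively, instead of baking the CSV into the image, you can bind-mount the host folder at run time, as the answer hints. A minimal sketch, assuming a made-up image tag csv-loader (main.py would then read the file from /data):

```shell
# Build the image and run it (CSV baked in via ADD)
docker build -t csv-loader .
docker run --rm csv-loader

# Alternative: bind-mount the host's Downloads folder instead of
# ADDing the CSV; main.py would read /data/file511282022.csv
docker run --rm -v /Users/myname/Downloads:/data csv-loader
```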

and

import pandas as pd
from sqlalchemy import create_engine

# This CSV doesn't have a header so pass
# column names as an argument
columns = [
    "open",
    "high",
    "low",
    "close",
    "volume",
    "datetime",
]

# Load in the data
df = pd.read_csv("file511282022.csv",names=columns)

# Instantiate sqlalchemy.create_engine object
engine = create_engine('postgresql://postgres:mypassword@localhost:5432/postgres')

# Save the data from dataframe to
# postgres table "iris_dataset"
df.to_sql(
    'file5', 
    engine,
#     index=False # Not copying over the index
)
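As a quick sanity check of the to_sql call itself, outside Docker and without Postgres, a minimal sketch using an in-memory SQLite engine and a hypothetical sample row (not the OP's actual CSV data):

```python
import pandas as pd
from sqlalchemy import create_engine, text

# Hypothetical sample row matching the CSV's columns
columns = ["open", "high", "low", "close", "volume", "datetime"]
df = pd.DataFrame(
    [[1.0, 2.0, 0.5, 1.5, 100, "2022-11-28"]],
    columns=columns,
)

# In-memory SQLite stands in for Postgres here
engine = create_engine("sqlite://")
df.to_sql("file5", engine, index=False)

# Read the row count back to confirm the insert worked
with engine.connect() as conn:
    count = conn.execute(text("SELECT COUNT(*) FROM file5")).scalar()
print(count)  # 1
```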

You also need to change engine = create_engine('postgresql://postgres:mypassword@localhost:5432/postgres'), because localhost inside a container refers to the container itself, not your machine. Use a Docker network (or Docker Desktop's host.docker.internal alias) to reach Postgres.
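A hedged sketch of the corrected connection URL, assuming Docker Desktop's host.docker.internal alias for the host machine (with a user-defined Docker network, you would use the Postgres container's name, e.g. db, as the host instead):

```python
from sqlalchemy.engine import URL

# host.docker.internal resolves to the host machine from inside a
# Docker Desktop container; on a shared Docker network, replace it
# with the Postgres container's name.
url = URL.create(
    drivername="postgresql",
    username="postgres",
    password="mypassword",
    host="host.docker.internal",
    port=5432,
    database="postgres",
)
print(url.render_as_string(hide_password=False))
# postgresql://postgres:mypassword@host.docker.internal:5432/postgres
```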
