I need to import a large CSV file into Postgres using COPY.
I find the processing time quite long, so I looked at docker stats:
CONTAINER ID NAME CPU % MEM USAGE / LIMIT MEM % NET I/O BLOCK I/O PIDS
682795abfc17 postgres 28.83% 1.034GiB / 15.63GiB 6.62% 23.7kB / 17.7kB 15.5GB / 14.4GB 11
I want it to be able to use more RAM than 1 GiB, so I added the following, but there was no improvement:
database:
  container_name: postgres_container
  image: postgis/postgis:14-3.3
  cpus: 4
How do I make the container use more RAM? Thank you.
CodePudding user response:
To increase the amount of memory that the Postgres container can use, you can use the --memory flag when starting the container. For example:
docker run --name postgres_container --memory 8g -d postgis/postgis:14-3.3
This will start the Postgres container with an 8 GB memory limit. You can adjust the amount of memory as needed.
It's also possible that the import is slow simply because of the size of the CSV file or the number of rows it contains. In that case, consider a different import strategy, such as splitting the file and running several COPY commands in parallel (sketched below), or using a tool like pg_bulkload to load the data more efficiently.
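As a rough illustration of the parallel-COPY idea, here is a minimal shell sketch. The database name mydb, table my_table and file data.csv are placeholders, not taken from your setup, and the file is assumed to have no header row:

# split the CSV into 1M-row chunks, then load each chunk in its own psql session
split -l 1000000 data.csv chunk_
for f in chunk_*; do
  psql -h localhost -U postgres -d mydb \
       -c "\copy my_table FROM '$f' WITH (FORMAT csv)" &
done
wait

Each background psql process runs its own COPY, so several chunks stream into the table at the same time instead of one long single-threaded load.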
Finally, you may want to tune the Postgres configuration itself to improve performance. You can do this by adjusting parameters in the postgresql.conf file, such as increasing shared_buffers or enabling parallel query execution (see the sketch below). You may also want to add CPU and memory resources to the host machine so the database server has more to work with.
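A minimal sketch of the kind of settings people raise for bulk loads. The values are illustrative for a host with roughly 16 GB of RAM, not a recommendation for your hardware, and shared_buffers only takes effect after a restart:

# postgresql.conf excerpt for a bulk load
shared_buffers = 4GB                  # default is only 128MB
maintenance_work_mem = 1GB            # speeds up index/constraint rebuilds after the load
max_wal_size = 8GB                    # fewer checkpoints during the import
checkpoint_timeout = 30min
max_parallel_workers_per_gather = 4   # parallel query execution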
Now for docker-compose:
To set the amount of memory the Postgres container can use when starting it with docker-compose, add a memory limit in your docker-compose.yml. With the version 3.x file format this goes under deploy.resources.limits (honoured by recent Docker Compose releases and by swarm mode). For example:
version: '3.7'
services:
  postgres:
    image: postgis/postgis:14-3.3
    deploy:
      resources:
        limits:
          memory: 8g
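If you are still on the version 2.x compose file format (or prefer the shorthand), the equivalent is mem_limit set directly on the service rather than under deploy; the service name here just mirrors the example above:

version: '2.4'
services:
  postgres:
    image: postgis/postgis:14-3.3
    mem_limit: 8g

Either way, docker stats should then report the new value in the MEM USAGE / LIMIT column.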