I have the following project structure:

client folder
    package.json
    Dockerfile
server folder
    package.json
    Dockerfile
The client needs an npm run devserver command and the server needs an npm run develop command. Running both commands simultaneously in two different terminals starts the application on my local machine.
The client folder uses some files that are present in the server folder when running the devserver command. Now, if I create separate Dockerfiles in the client and server folders, the devserver command won't be able to access the files in the server folder, and hence I'm unable to start my application.
Is there any way I can access those files when dockerising? Maybe using docker-compose too? I'm not able to figure it out.
CodePudding user response:
You can put docker-compose.yaml in the same parent folder as server and client, specify the build context as ., and add the dockerfile option to point at each service's Dockerfile. For example:
structure:
root@pie:~/20221015# tree
.
├── client
│   └── Dockerfile
├── docker-compose.yaml
└── server
    ├── file_in_server
    └── Dockerfile

2 directories, 4 files
docker-compose.yaml:
version: "3.7"
services:
  client:
    image: client_image
    build:
      context: .
      dockerfile: client/Dockerfile
client Dockerfile:
FROM alpine
COPY server/file_in_server /tmp
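The alpine image and file_in_server above are just placeholders to show that the root context makes server/ reachable. For a Node-based client like yours, the same idea might look roughly like the sketch below; the node:18 base image and the copied paths are assumptions, so adjust them to whatever your project actually needs:

client Dockerfile (sketch):
FROM node:18
WORKDIR /app/client
# paths on the left are relative to the build context (the repo root)
COPY client/ /app/client/
# copy the server files the devserver command reads, keeping the sibling layout
COPY server/ /app/server/
RUN npm install
CMD ["npm", "run", "devserver"]

(The execution output below builds the minimal alpine example, not this sketch.)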
execution:
root@pie:~/20221015# docker-compose build --no-cache
Building client
Step 1/2 : FROM alpine
---> 9c6f07244728
Step 2/2 : COPY server/file_in_server /tmp
---> c5cc162bad75
Successfully built c5cc162bad75
Successfully tagged client_image:latest
root@pie:~/20221015# docker run --rm -it client_image ls /tmp/file_in_server
/tmp/file_in_server
You can see that the client Dockerfile successfully accesses the file in the server folder. You can look at the Compose build definition if you want to dig deeper.
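If you want docker-compose to replace the two separate terminals as well, you could define a server service against the same root context. A rough sketch, assuming server/Dockerfile exists as in your layout and that devserver / develop are the commands you use in development:

docker-compose.yaml (sketch):
version: "3.7"
services:
  client:
    image: client_image
    build:
      context: .                     # repo root, so server/ files are visible at build time
      dockerfile: client/Dockerfile
    command: npm run devserver
  server:
    image: server_image
    build:
      context: .
      dockerfile: server/Dockerfile
    command: npm run develop

docker-compose up --build would then build both images and start both containers together.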