I currently have a React application with an AWS CodePipeline set up that does the following:
- Detect changes in GitHub repository
- Build the "build" files (with CodeBuild) using buildspec.yaml file
- Push "build" files to S3 bucket
The S3 bucket is configured to serve the static files to my domain.
This setup is great because it's cheap: I don't need an EC2 server always up and running to serve these static files, which would be completely unnecessary.
Recently, however, I've Dockerized this application, which is fantastic when I'm working on it from different machines.
Now that it's Dockerized, it seems like a better idea to have a Docker container build the "build" files and push them to the S3 bucket, to ensure that the files built on my machine are identical to the ones pushed to S3.
Ideally I would like to have this all be automated when I push to the repo like it currently is.
I've seen a lot of tutorials about automating the creation of Docker images, pushing them to AWS ECR, and then using ECS (Fargate) to run the container. But to me this is just the same as running my app on an EC2 server... why would I want to do all that and then have a container continuously running on a server? Now it would just be an ECS server...
So what I'm asking is: how can I create an automated CI/CD pipeline that builds the static files using a Docker container and then pushes them to S3, as I currently have it?
Here is my current CodeBuild buildspec.yaml file for reference:
version: 0.2
phases:
  install:
    runtime-versions:
      nodejs: 12
    commands:
      # install yarn globally so it is on the PATH
      - npm install -g yarn
      # install dependencies
      - yarn
      # so that build commands work
      - yarn add eslint-config-react-app
  build:
    commands:
      # run build script
      - yarn build
artifacts:
  # include only the static build files
  files:
    - '**/*'
  base-directory: 'build'
CodePudding user response:
To automate the build process for your Dockerized React application, you can keep the same AWS CodePipeline you currently have, but you will need to change the CodeBuild project to use a build environment that supports Docker instead of the default one.
To do this, you will need to first create a Dockerfile for your React application that will be used to build the Docker image for your application. This Dockerfile should include all the necessary instructions for building your application, such as copying the source code into the image and installing any dependencies.
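For example, a minimal Dockerfile for this setup might look like the following. The `node:12-alpine` base image and the `/app` working directory are assumptions; adjust them to match your project:

```dockerfile
# assumed base image; match the Node version from your buildspec
FROM node:12-alpine

WORKDIR /app

# copy manifests first so the dependency layer is cached between builds
COPY package.json yarn.lock ./
RUN yarn install --frozen-lockfile

# copy the rest of the source and build the static files into /app/build
COPY . .
RUN yarn build
```

Because `yarn build` runs during `docker build`, the static files already exist inside the image once it is built; the buildspec then only needs to copy them out.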
Once you have created the Dockerfile, you can update your buildspec.yaml file to use a Docker build environment and specify the Dockerfile to be used. The updated buildspec.yaml file might look something like this:
version: 0.2
phases:
  install:
    runtime-versions:
      docker: 19
  build:
    commands:
      # build the Docker image; the Dockerfile itself runs `yarn build`,
      # so the static files end up inside the image
      - docker build -t my-react-app:latest -f Dockerfile .
      # create a container from the image and copy the build output
      # back to the host so CodeBuild can collect it as artifacts
      # (assumes the Dockerfile uses /app as its working directory)
      - docker create --name extract my-react-app:latest
      - docker cp extract:/app/build ./build
      - docker rm extract
artifacts:
  files:
    - '**/*'
  base-directory: 'build'
In this example, the build phase builds the Docker image using the instructions in the Dockerfile (which runs `yarn build`), then copies the resulting `build` directory out of a container created from that image, so CodeBuild can package the static files as artifacts. Note that a plain `docker run ... yarn build` would not work here: the output would be written inside the container's filesystem and never reach the host, so the artifacts step would find nothing.
Once you have updated your buildspec.yaml file, you can commit the changes to your GitHub repository and the CodePipeline will automatically detect the changes and run the updated build process using the Docker build environment. This will ensure that the static files built by the Docker container are identical to the ones pushed to the S3 bucket.
It is important to note that in order to run Docker commands in CodeBuild, you will need a build image that includes the Docker runtime, such as the aws/codebuild/standard:4.0 image, and you must enable privileged mode in the CodeBuild project's environment settings so the Docker daemon can start.
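Privileged mode can be enabled in the console, or, as a sketch, with the AWS CLI; the project name and compute type below are placeholders for your own values:

```
# update an existing CodeBuild project to a Docker-capable image
# with privileged mode enabled (required to run the Docker daemon)
aws codebuild update-project \
  --name my-react-app-build \
  --environment "type=LINUX_CONTAINER,image=aws/codebuild/standard:4.0,computeType=BUILD_GENERAL1_SMALL,privilegedMode=true"
```

With that in place, the rest of your pipeline (GitHub source stage, S3 deploy stage) stays exactly as it is today.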