I would like to pass my Google Cloud Platform service account's JSON credentials file to a Docker container so that the container can access a Cloud Storage bucket. So far I have tried to pass the file as an environment variable on the run command, in two ways:
- Using the --env flag:
docker run -p 8501:8501 --env GOOGLE_APPLICATION_CREDENTIALS="/Users/gcp_credentials.json" -t -i image_name
- Using the -e flag, and even exporting the same environment variable in the shell first:
docker run -p 8501:8501 -e GOOGLE_APPLICATION_CREDENTIALS="/Users/gcp_credentials.json" -t -i image_name
Neither worked, and I always get the following error when running the Docker container:
W external/org_tensorflow/tensorflow/core/platform/cloud/google_auth_provider.cc:184] All attempts to get a Google authentication bearer token failed, returning an empty token. Retrieving token from files failed with "Not found: Could not locate the credentials file.".
How can I pass the Google credentials file to a container running locally on my personal laptop?
CodePudding user response:
You cannot "pass" a path on the host machine: the environment variable only carries a string, and the container has its own filesystem, so the JSON file itself has to be made available inside the container (for example by bind-mounting it).
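A minimal sketch of that approach, bind-mounting the credentials file from the question into the container (the container-side path /tmp/gcp_credentials.json is just an illustrative choice):

# bind-mount the host file read-only, then point the env variable
# at the path *inside* the container, not the host path
docker run -p 8501:8501 \
  -v /Users/gcp_credentials.json:/tmp/gcp_credentials.json:ro \
  -e GOOGLE_APPLICATION_CREDENTIALS=/tmp/gcp_credentials.json \
  -t -i image_name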
CodePudding user response:
Two ways to do it:
- Volumes: https://docs.docker.com/storage/volumes/
- Secrets: https://docs.docker.com/engine/swarm/secrets/
Secrets work with Docker swarm mode:
- create a Docker secret from the credentials file
- attach the secret to a service with the --secret flag (secrets are a feature of docker service, not plain docker run)
The advantage is that secrets are encrypted at rest and are only decrypted when mounted into containers; a sketch follows below.
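A hedged sketch of the secrets route, assuming swarm mode is available locally (the secret name gcp_credentials and the service name my_app are illustrative):

# secrets require swarm mode
docker swarm init

# create a secret from the credentials file on the host
docker secret create gcp_credentials /Users/gcp_credentials.json

# run as a service; the secret is mounted at /run/secrets/<name>
docker service create \
  --name my_app \
  --publish 8501:8501 \
  --secret gcp_credentials \
  --env GOOGLE_APPLICATION_CREDENTIALS=/run/secrets/gcp_credentials \
  image_name

For a single container on a personal laptop, the bind-mount approach above is simpler; secrets mainly pay off when deploying services across a swarm.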