I'm trying to set up a Step Function with an EKS RunJob state. The job kicks off a pod in an EKS cluster and executes a command. As a start, I want the command to be echo $S3_BUCKET $S3_KEY, where both $S3_BUCKET and $S3_KEY are environment variables passed in from the Step Function input. Here is the container spec:
"containers": [
{
"name": "my-container-spec",
"image": "****.dkr.ecr.****.amazonaws.com/****:latest",
"command": [
"echo"
],
"args": [
"$S3_BUCKET", "$S3_KEY"
],
"env": [
{
"name": "S3_BUCKET",
"value.$": "$.s3_bucket"
},
{
"name": "S3_KEY",
"value.$": "$.s3_key"
}
]
}
],
"restartPolicy": "Never"
Unfortunately, after the job runs, the command only echoes the raw text $S3_BUCKET $S3_KEY instead of the values passed in.
So the question is: how should I pass an environment variable as an args entry? The variable doesn't have to come from the Step Function input; it could be any other inherited variable.
CodePudding user response:
This will do the trick: args: ["$(S3_BUCKET)", "$(S3_KEY)"]

Kubernetes only expands variable references written as $(VAR_NAME) in command and args; a plain $VAR is passed through literally because no shell is involved when the container runs the command directly.
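For context, here is a minimal sketch of the container spec with that fix applied, reusing the same names from the question (my-container-spec, S3_BUCKET, S3_KEY); only the args line changes:

"containers": [
  {
    "name": "my-container-spec",
    "image": "****.dkr.ecr.****.amazonaws.com/****:latest",
    "command": ["echo"],
    "args": ["$(S3_BUCKET)", "$(S3_KEY)"],
    "env": [
      {
        "name": "S3_BUCKET",
        "value.$": "$.s3_bucket"
      },
      {
        "name": "S3_KEY",
        "value.$": "$.s3_key"
      }
    ]
  }
],
"restartPolicy": "Never"

Alternatively, if you need full shell expansion (the $VAR syntax, pipes, etc.) and a shell is available in the image, you could run the command through sh instead:

"command": ["sh", "-c"],
"args": ["echo $S3_BUCKET $S3_KEY"]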