I'm trying to keep one of our website's assets in GitLab. On every merge request, I push the changes to S3 and invalidate the CloudFront cache. This is the story of the problem. Note: I tried this method on static websites and it works perfectly.
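For reference, the invalidation step is a single CLI call. A minimal sketch, assuming a $DISTRIBUTION_ID variable is set in the pipeline (the variable name is my choice, not from my actual setup):
# Invalidate all cached paths after uploading
aws cloudfront create-invalidation --distribution-id $DISTRIBUTION_ID --paths '/*'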
First Try: I tried to sync changes with --exclude and --include filters.
aws s3 sync . s3://$BUCKET_NAME --exclude '*' --include 'assets/*' --include 'css/*' --include 'files/*'
Result: It works perfectly in my local environment, but in the pipeline it tried to sync everything and eventually failed with an error. (I tried the timeout and debug flags as well, but nothing changed; I could not get past the error.) Finally I noticed that the CLI compares the files' modification dates, which differ on every fresh CI checkout, so I decided to try the --size-only flag.
Second Try:
aws s3 sync . s3://$BUCKET_NAME --size-only
Result:
fatal error: An error occurred (AccessDenied) when calling the ListObjectsV2 operation: Access Denied
The user has S3 full access (for testing purposes). The bucket also has this policy:
{
    "Sid": "2",
    "Effect": "Allow",
    "Principal": {
        "AWS": "arn:aws:iam::**58**439**:user/user"
    },
    "Action": "s3:*",
    "Resource": [
        "arn:aws:s3:::bucket",
        "arn:aws:s3:::bucket/*"
    ]
}
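Since the policy looks correct, one way to narrow this down (a debugging sketch I'd suggest, not something from my original pipeline) is to confirm which identity the pipeline's credentials actually resolve to and whether that identity can list the bucket at all:
# Show the IAM user/role behind the current credentials
aws sts get-caller-identity
# Try the same ListObjects permission the sync needs
aws s3 ls s3://$BUCKET_NAME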
So, I'm stuck at this point.
Any help is appreciated.
CodePudding user response:
I finally found the source of the problem: the file names contained Unicode characters.
This issue page helped me a lot: https://github.com/aws/aws-cli/issues/1437
In my .gitlab-ci.yml file, I just added these lines:
variables:
  LC_ALL: en_US.UTF-8
And everything is back to normal.
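For context, here is a minimal .gitlab-ci.yml sketch that combines the pieces above into one deploy job. The stage name, the image choice, and the $BUCKET_NAME / $DISTRIBUTION_ID variables are assumptions for illustration, not my exact setup:
variables:
  LC_ALL: en_US.UTF-8  # make the AWS CLI handle Unicode file names

deploy:
  stage: deploy  # assumed stage name
  image:
    name: amazon/aws-cli  # assumed image; entrypoint cleared so GitLab can run the script
    entrypoint: [""]
  script:
    # Sync only the asset folders; compare by size since CI checkouts reset modification times
    - aws s3 sync . s3://$BUCKET_NAME --size-only --exclude '*' --include 'assets/*' --include 'css/*' --include 'files/*'
    # Invalidate the CloudFront cache so the new assets are served immediately
    - aws cloudfront create-invalidation --distribution-id $DISTRIBUTION_ID --paths '/*'
  rules:
    - if: $CI_PIPELINE_SOURCE == "merge_request_event"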