Problem
Hi, I want to add metadata (`CacheControl: 'max-age=3600;s-maxage=3600'`) whenever a new file is uploaded to S3, so I wrote a Lambda function triggered by the S3 PUT event. However, the CacheControl metadata is not added to the uploaded file, even though the code runs without errors. Could you help me? :(
[enter image description here][1] [1]: https://i.stack.imgur.com/8MfbF.png
My Lambda code is here:
async function retrieveNewFile(event){
    const bucket = event.Records[0].s3.bucket.name;
    // S3 event keys are URL-encoded and use '+' for spaces
    const key = decodeURIComponent(event.Records[0].s3.object.key.replace(/\+/g, ' '));
    const params = {
        Bucket: bucket,
        Key: key
    };
    console.log('* OriginData: ' + JSON.stringify(params));
    return params;
}
async function addCacheControl(existedData){
    existedData.CopySource = existedData.Bucket + '/' + existedData.Key;
    existedData.CacheControl = 'max-age=3600;s-maxage=3600';
    //existedData.MetadataDirective = 'REPLACE';
    console.log(existedData);
    await s3.copyObject(existedData).promise();
    return existedData;
}
** Note: I tried using `putObject` instead of `copyObject`, but with `putObject` the code loops, because the PUT it performs triggers my Lambda again. (I cannot use a separate directory for this, so I want to use `copyObject` or something similar...)
CodePudding user response:
I modified my function so it does not unconditionally call `s3.copyObject`, but first checks the object's headers with `s3.headObject` :)
async function checkHeaderExist(file){
    const header = await s3.headObject(file).promise();
    console.log(header);
    if(header.CacheControl){
        return 'exist';
    }else return 'no exist';
}
CodePudding user response:
Looking around I found a similar post [1]. The solution could be to check whether the metadata has already been updated and only update it if it has not. This way, the first time the object is uploaded it triggers the Lambda, which puts a new object; the second time the Lambda is triggered it sees the object has already been updated and exits.
[1] AWS Lambda function and S3 - change metadata on an object in S3 only if the object changed