Lambda trigger dynamic specific path s3 upload


I am trying to create a Lambda function that gets triggered when a folder is uploaded to an S3 bucket. But the Lambda performs an operation that saves files back into the same folder. How can I do this without ending up with a self-invoking function?

I want to upload the following folder structure to the bucket:

Project_0001/input/inputs.csv

The outputs will create and be saved on:

Project_0001/output/outputs.csv

But my project number will change, so I can't simply assign a static prefix. Is there a way to set the prefix dynamically, something like:

Project_*/input/

CodePudding user response:

From Shubham's comment I drafted my solution using the prefix and suffix filters.

For my case, I set the prefix to 'Project_', and for the suffix I chose one specific file to act as the trigger, so my suffix is '/input/myFile.csv'.

So every time I upload a Project_*/input/ folder whose files include myFile.csv, the function is triggered once. I then save my output in the same project folder, under the output folder, which does not match the suffix filter and therefore does not trigger the function again.
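The filter semantics can be sketched as a small helper: S3 fires the notification only when the object key starts with the configured prefix and ends with the configured suffix. This is an illustrative function, not an AWS API; the prefix and suffix values are the ones from my setup.

```python
# Hypothetical helper mirroring S3 event-notification filter matching:
# a key matches only if it has BOTH the configured prefix and suffix.
PREFIX = "Project_"
SUFFIX = "/input/myFile.csv"

def matches_trigger(key: str, prefix: str = PREFIX, suffix: str = SUFFIX) -> bool:
    """Return True if an upload of `key` would fire the notification."""
    return key.startswith(prefix) and key.endswith(suffix)
```

With these rules, `Project_0001/input/myFile.csv` matches, while anything written under `Project_0001/output/` does not, so the output write cannot retrigger the Lambda.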

I get the project name with the following code:

key = event['Records'][0]['s3']['object']['key']  # e.g. "Project_0001/input/myFile.csv"
project_id = key.split("/")[0]                    # e.g. "Project_0001"
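Putting it together, a minimal handler sketch might look like this. Note that S3 URL-encodes keys in event records, so it is safer to decode them first; the `outputs.csv` output key follows the folder layout described above, and the processing step is elided.

```python
import urllib.parse

def handler(event, context):
    # Key of the object that fired the trigger; S3 URL-encodes keys
    # in event records, so decode before using it.
    key = urllib.parse.unquote_plus(event['Records'][0]['s3']['object']['key'])
    project_id = key.split("/")[0]  # e.g. "Project_0001"

    # Output goes under the same project, in output/ instead of input/,
    # so it never matches the '/input/myFile.csv' suffix filter and
    # cannot retrigger this function.
    output_key = f"{project_id}/output/outputs.csv"

    # ... process the inputs, then upload the result with boto3, e.g.
    # s3.put_object(Bucket=bucket, Key=output_key, Body=data)
    return output_key
```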