I am looking to create a Lambda function which will constantly monitor a directory in my S3 bucket and notify if any object resides in the bucket for more than 1 hour after upload. We already have a Lambda to archive the object after processing, which is invoked by S3 events. But I need a function that calculates the age of each object and notifies.
I tried something like this, but it doesn't help:
import boto3
import datetime

s3 = boto3.resource('s3')
my_bucket = s3.Bucket('mybucketname')

def obj_age(s3):
    modified = s3.ObjectSummary('last_modified')
    return modified

modified = obj_age(s3)
if len(obj_age) == modified - datetime.timedelta(hours=1):
    return ["\nFound %i files more than %s hours old in %s." % (len(old_age), 1, bucket.name)]
CodePudding user response:
To monitor your bucket you could:
- Create an AWS Lambda function that lists the contents of the bucket, looks for any object that is more than an hour old and sends a message to an Amazon SNS Topic with details of the object(s) found
- Subscribe to the SNS Topic to receive an email when a message is sent to the Topic
- Create an Amazon CloudWatch Events rule to trigger the Lambda function once per hour
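Before wiring those pieces together, it helps to get the age comparison right in isolation, since the original attempt compared a list length to a timedelta. What is actually needed is a timezone-aware datetime comparison (the `last_modified` values boto3 returns are UTC-aware). Here is a minimal sketch; `is_older_than` is a hypothetical helper, not part of any AWS API:

```python
from datetime import datetime, timedelta, timezone

def is_older_than(last_modified, hours=1, now=None):
    """Return True if an object's last_modified timestamp is more than
    `hours` old. `last_modified` must be timezone-aware, as boto3's
    S3 object summaries are. `now` is injectable for testing."""
    if now is None:
        now = datetime.now(timezone.utc)
    return last_modified < now - timedelta(hours=hours)

# Example with fixed timestamps:
now = datetime(2023, 1, 1, 12, 0, tzinfo=timezone.utc)
print(is_older_than(datetime(2023, 1, 1, 10, 0, tzinfo=timezone.utc), now=now))   # True
print(is_older_than(datetime(2023, 1, 1, 11, 30, tzinfo=timezone.utc), now=now))  # False
```

Mixing a naive `datetime.now()` with S3's aware timestamps would raise a `TypeError`, which is a common stumbling block here.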
Here is an example Lambda function that will check for any objects more than an hour old, and send a message to an Amazon SNS Topic for any found:
import boto3
from datetime import datetime, timedelta
from dateutil.tz import tzutc

BUCKET = 'my-bucket'
TOPIC_ARN = 'arn:aws:sns:ap-southeast-2:123456789012:Old-File-Warning'

s3_resource = boto3.resource('s3')
sns_resource = boto3.resource('sns')
sns_topic = sns_resource.Topic(TOPIC_ARN)

def lambda_handler(event, context):
    # Anything last modified before this cutoff is more than an hour old
    cutoff = datetime.now(tzutc()) - timedelta(hours=1)
    for obj in s3_resource.Bucket(BUCKET).objects.all():
        # last_modified is timezone-aware (UTC), so compare against an aware cutoff
        if obj.last_modified < cutoff:
            message = f"Object {obj.key} is more than 1 hour old!"
            sns_topic.publish(Message=message)
Of course, you should actually investigate the reason why your initial Lambda function is failing, which would be much simpler than writing and maintaining this 'check' function.