I have a Python 3.7 Lambda function that invalidates the CloudFront cache:
from __future__ import print_function
import boto3
import time


def lambda_handler(event, context):
    print(event)
    for items in event["Records"]:
        path = "/" + items["s3"]["object"]["key"]
        print(path)
        client = boto3.client('cloudfront')
        invalidation = client.create_invalidation(
            DistributionId='distributionid__ID',
            InvalidationBatch={
                'Paths': {
                    'Quantity': 1,
                    'Items': [path]
                },
                'CallerReference': str(time.time())
            })
This is the attached IAM policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents"
            ],
            "Resource": "arn:aws:logs:*:*:*"
        },
        {
            "Effect": "Allow",
            "Action": [
                "cloudfront:CreateInvalidation"
            ],
            "Resource": [
                "*"
            ]
        }
    ]
}
This is my test JSON:
{
    "key1": "*"
}
Whenever I invoke my Lambda function, I get this error:
{'key1': '*'}
[ERROR] KeyError: 'Records'
Traceback (most recent call last):
  File "/var/task/lambda_function.py", line 8, in lambda_handler
    for items in event["Records"]:
CodePudding user response:
The issue here is your input JSON: the code expects an Amazon S3 event notification, and your test event does not have the structure documented for S3 event messages.

The line for items in event["Records"]: requires a Records property containing an array. Your JSON has no such property at all, which is what produces the KeyError in the logs.

The barebones test event below should work, keeping in mind that you would never receive * as an object key in a real S3 event notification.
{"Records":[{"s3":{"object":{"key":"*"}}}]}