I have a simple setup. Let's say I send 50 messages to my SQS queue, but the number of processed messages is random. Sometimes it processes all of them, but sometimes it's around 20, and there was even one run with 0. I don't know why it's happening. No errors are logged. It looks like the Lambda for processing the queue is not getting triggered for some reason.
This is how I send the messages:
for (const record of records) {
  await sqs.sendMessage(record).promise().then(response => {
    console.log(JSON.stringify(response));
  }, err => {
    console.log('SQS ERROR: ', err);
  });
}
I can see the console log for the success response for all of them (50). No errors here.
Any tips you can give me? It's the same whether the queue is FIFO or not. The global timeout is set to 60 seconds.
I also want to add that the same data is being sent each time, and with the same data it's sometimes 20 and sometimes 50, so it's not related to the data itself.
Cheers.
CodePudding user response:
One means of investigation would be to attach a dead-letter queue to the source queue. It would let you capture the messages that fail processing and gather more information, which would help you (and others on the forum) troubleshoot the issue, since at this point we don't know whether the Lambda function is the problem or the queue is having delivery issues.
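As a minimal sketch of attaching a dead-letter queue with the AWS SDK for JavaScript v2 (the same SDK the question uses); the queue URLs and the maxReceiveCount of 5 here are placeholders, not values from the question:

const AWS = require('aws-sdk');
const sqs = new AWS.SQS();

// Placeholder URLs -- replace with your actual queue URLs.
const mainQueueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/main-queue';
const dlqUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/main-queue-dlq';

async function attachDeadLetterQueue() {
  // Look up the ARN of the dead-letter queue.
  const { Attributes } = await sqs.getQueueAttributes({
    QueueUrl: dlqUrl,
    AttributeNames: ['QueueArn']
  }).promise();

  // Point the main queue's redrive policy at the DLQ so that messages
  // that fail processing 5 times land there instead of being dropped.
  await sqs.setQueueAttributes({
    QueueUrl: mainQueueUrl,
    Attributes: {
      RedrivePolicy: JSON.stringify({
        deadLetterTargetArn: Attributes.QueueArn,
        maxReceiveCount: '5'
      })
    }
  }).promise();
}

attachDeadLetterQueue().catch(console.error);

Anything that ends up in the dead-letter queue can then be inspected in the console or re-driven once the cause is found.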
CodePudding user response:
The AWS Lambda function can be triggered with multiple messages from the Amazon SQS queue (unless the Batch Size is set to 1 in the trigger configuration).
The Lambda function should loop through each record that is passed in the event parameter, for example:
exports.handler = async function(event, context) {
  event.Records.forEach(record => {
    const { body } = record;
    console.log(body);
  });
  return {};
};
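If you want to check how the invocations line up with the 50 messages you sent, a variant like the sketch below (my suggestion, not part of the original answer) logs the batch size and message IDs so you can compare them against the IDs returned by sendMessage:

exports.handler = async function(event, context) {
  // Each invocation receives a batch of records; logging the batch size
  // shows why 50 messages do not necessarily mean 50 invocations.
  console.log(`Received ${event.Records.length} record(s)`);
  for (const record of event.Records) {
    console.log(record.messageId, record.body);
  }
  return {};
};

If every message ID you sent shows up across the CloudWatch logs, the messages are being delivered and the problem is more likely in how the results are counted than in the trigger itself.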