Using Bref as the PHP serverless framework.
Problem:
Unable to upload a dynamically generated file to an S3 bucket from Lambda using PHP.
Code:
serverless.yml
service: lambdaToS3
provider:
  name: aws
  region: ap-southeast-2
  runtime: provided.al2
  #stage: prod
  profile: default
  iam:
    role:
      statements:
        - Effect: Allow
          Action:
            - 's3:GetObject'
            - 's3:PutObject'
            - 's3:GetObjectAcl'
            - 's3:PutObjectAcl'
          Resource:
            - 'arn:aws:s3:::*/*'
plugins:
  - ./vendor/bref/bref
functions:
  lambdaToS3:
    handler: index.php
    description: ''
    layers:
      - ${bref:layer.php-74}
    events:
      - httpApi: '*'
# Exclude files from deployment
package:
  patterns:
    - '!tests/**'
    - '!tmp/**'
index.php
(the comments show the output when the file is executed through the HTTP gateway)
In this file I generate a simple text file in the /tmp/ folder and try to upload it to the S3 bucket using two different attributes, SourceFile or Body. Neither of them works.
<?php declare(strict_types=1);

ini_set('display_errors', "1");
ini_set('display_startup_errors', "1");
error_reporting(E_ALL);

require __DIR__ . '/vendor/autoload.php';

use Aws\S3\S3Client;
use Bref\Logger\StderrLogger;

$logger = new StderrLogger();

$s3client = new S3Client([
    'scheme'  => 'http',
    'version' => '2006-03-01',
    'region'  => 'ap-southeast-2',
    'output'  => 'JSON',
]);

$bucket = 'bucketname';
$file   = "/tmp/newfile.txt";

// Write two lines to a file in /tmp/ (the only writable directory on Lambda)
$myfile = fopen($file, "w") or die("Unable to open file!");
$txt = "John Doe\n";
fwrite($myfile, $txt);
$txt = "Jane Doe\n";
fwrite($myfile, $txt);
fclose($myfile);

echo "<pre>"; print_r("File Exist: " . file_exists($file)); echo "</pre>"; // 1
echo "<pre>"; print_r(" ================ "); echo "</pre>";
echo "<pre>"; print_r(file_get_contents($file)); echo "</pre>"; // we can get the content using this method
echo "<pre>"; print_r(" ================ "); echo "</pre>";

// Put on S3 using SourceFile
$result = $s3client->putObject([
    'Bucket'     => $bucket,
    'Key'        => "AWS_LAMBDA_S3.txt",
    'SourceFile' => $file,
    'ACL'        => 'public-read',
]);
echo "<pre>"; print_r("File Upload: " . json_encode($result)); echo "</pre>";

// Put on S3 using Body
$myfile = fopen($file, "rb");
$result = $s3client->putObject([
    'Bucket' => $bucket,
    'Key'    => "AWS_LAMBDA_S3_BODY.txt",
    'Body'   => $myfile,
    'ACL'    => 'public-read',
]);
echo "<pre>"; print_r("File Upload: " . json_encode($result)); echo "</pre>";
exit;
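For context on the two attributes: `SourceFile` expects a filesystem path (the SDK opens and reads the file itself), while `Body` accepts a string or an already-open stream. A minimal plain-PHP sketch of the two input shapes, with no SDK involved:

```php
<?php
// Plain-PHP sketch (no AWS SDK): the two input shapes putObject accepts.
// 'SourceFile' => $path lets the SDK open and read the file itself;
// 'Body' => $stream (or a raw string) hands it the bytes directly.
$file = sys_get_temp_dir() . '/newfile.txt';
file_put_contents($file, "John Doe\nJane Doe\n");

// What 'SourceFile' receives: just the path.
$sourceFile = $file;

// What 'Body' receives: an open read stream (or the string itself).
$stream = fopen($file, 'rb');
$body   = stream_get_contents($stream);
fclose($stream);

// Both routes carry the same bytes.
var_dump(file_get_contents($sourceFile) === $body); // bool(true)
```

Since both shapes resolve to the same bytes, a failure with both attributes points away from the request payload and toward credentials, policy, or networking.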
Regarding the policy: once the function is deployed to AWS Lambda, the role ends up with the following policy:
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogStream",
                "logs:CreateLogGroup"
            ],
            "Resource": "arn:aws:logs:ap-southeast-2:xxxx:log-group:/aws/lambda/lambdaToS3-dev*:*"
        },
        {
            "Sid": "VisualEditor1",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObjectAcl",
                "s3:PutObjectAcl"
            ],
            "Resource": "*"
        },
        {
            "Sid": "VisualEditor2",
            "Effect": "Allow",
            "Action": [
                "sqs:DeleteMessage",
                "sqs:ReceiveMessage",
                "sqs:GetQueueAttributes",
                "logs:PutLogEvents"
            ],
            "Resource": [
                "arn:aws:logs:ap-southeast-2:xxxx:log-group:/aws/lambda/lambdaToS3-dev*:*:*",
                "arn:aws:sqs:ap-southeast-2:xxxx:xxxx"
            ]
        }
    ]
}
CodePudding user response:
The code above works as-is.
The problem was the VPC: if your Lambda function runs inside a VPC, it has no route to S3 by default, so you need to create an S3 gateway endpoint from the VPC dashboard.
Any tutorial on deploying a Lambda function in a VPC and accessing S3, in any programming language, covers the same steps.
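The same fix can also be declared in serverless.yml instead of clicking through the console. This is a minimal sketch, not tested against this stack; `vpc-xxxx` and `rtb-xxxx` are placeholders for your VPC ID and the route table(s) used by the Lambda subnets, and `S3GatewayEndpoint` is an illustrative resource name:

```yaml
# Sketch: an S3 gateway endpoint as a raw CloudFormation resource
# appended to serverless.yml. Replace the placeholder IDs.
resources:
  Resources:
    S3GatewayEndpoint:
      Type: AWS::EC2::VPCEndpoint
      Properties:
        VpcId: vpc-xxxx                # the VPC the Lambda runs in
        VpcEndpointType: Gateway
        ServiceName: com.amazonaws.ap-southeast-2.s3
        RouteTableIds:
          - rtb-xxxx                   # route table(s) of the Lambda subnets
```

Equivalently, from the AWS CLI: `aws ec2 create-vpc-endpoint --vpc-id vpc-xxxx --service-name com.amazonaws.ap-southeast-2.s3 --route-table-ids rtb-xxxx`. A gateway endpoint adds a route to S3, so it incurs no extra cost, unlike an interface endpoint or a NAT gateway.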