I have an Excel file generated by a Lambda function and stored in the /tmp/ folder, and I want to store it in an S3 bucket. I have set up the permissions and the bucket, but when the function completes it creates a damaged Excel file in the bucket, which cannot be opened.
The code I used:
import boto3

def uploadtoS3(filename=str):
    s3 = boto3.client('s3')
    bucket = 'aws-day-ahead-estimations'
    DirName = '/tmp/' + filename
    s3.put_object(Bucket=bucket, Body=DirName, Key=filename)
    print('put complete')
CodePudding user response:
When you use the put_object() method, the Body parameter expects the actual content of the file, not the file path. Passing the path means the bytes of the path string itself are written to the S3 object, which is why the result cannot be opened as an Excel file.
You can fix this by opening the file and passing the file object:
import boto3

def uploadtoS3(filename: str):
    s3 = boto3.client('s3')
    bucket = 'aws-day-ahead-estimations'
    file_path = '/tmp/' + filename
    try:
        # Open the file in binary mode and pass the file object as Body
        with open(file_path, 'rb') as f:
            s3.put_object(Bucket=bucket, Body=f, Key=filename)
        print('put complete')
    except Exception as e:
        print(f"An error occurred: {e}")
Another approach is to use the upload_file() method of the S3 client instead of put_object(); it takes a file path directly and handles opening and streaming the file for you.
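A minimal sketch of that approach (the helper name upload_to_s3, the key derivation, and the error handling are illustrative, not from the original post):

```python
import os

def upload_to_s3(file_path, bucket):
    """Upload a local file to S3 via upload_file(); returns True on success.

    boto3 is imported inside the function so the helper can be defined
    even in environments where boto3 is not installed.
    """
    import boto3
    from botocore.exceptions import ClientError

    # Use the file's base name (e.g. 'report.xlsx' for '/tmp/report.xlsx')
    # as the S3 object key.
    key = os.path.basename(file_path)
    s3 = boto3.client('s3')
    try:
        # upload_file() opens and streams the file itself, so there is no
        # risk of accidentally uploading the path string as the body.
        s3.upload_file(file_path, bucket, key)
    except ClientError as e:
        print(f"Upload failed: {e}")
        return False
    return True
```

Called as upload_to_s3('/tmp/report.xlsx', 'aws-day-ahead-estimations'), this would store the file under the key report.xlsx.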
CodePudding user response:
You can use the AWS SDK for Python (Boto3) to interact with S3 from within a Lambda function. Here's an example of how you can upload a file to S3 using a Lambda function:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    filename = 'example.xlsx'
    bucket_name = 'my-bucket'
    s3.upload_file(filename, bucket_name, filename)
In this example, filename is the path of the Excel file you want to upload (in a Lambda function this should be under /tmp/, the only writable directory), and bucket_name is the name of the S3 bucket where you want to upload the file.
You can also pass the path of the file to upload through the event variable:
import boto3

def lambda_handler(event, context):
    s3 = boto3.client('s3')
    filename = event['file_path']
    bucket_name = 'my-bucket'
    s3.upload_file(filename, bucket_name, filename)
You also need to make sure that the IAM role associated with the Lambda function has the necessary S3 permissions (at minimum, s3:PutObject on the target bucket).
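A minimal identity-based policy for the Lambda execution role might look like this (the bucket name is taken from the question; adjust it to your own bucket):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "s3:PutObject",
      "Resource": "arn:aws:s3:::aws-day-ahead-estimations/*"
    }
  ]
}
```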