I am following a tutorial on Udemy where I am trying to upload a pandas dataframe to an S3 bucket as a parquet file using boto3. I used BytesIO to convert the dataframe to parquet and tried uploading it to my S3 bucket, which I made publicly accessible. On execution I encounter this error:
Parameter validation failed:
Missing required parameter in input: "Key"
Unknown parameter in input: "key", must be one of: ACL, Body, Bucket, CacheControl, ContentDisposition, ContentEncoding, ContentLanguage, ContentLength, ContentMD5, ContentType, ChecksumAlgorithm, ChecksumCRC32, ChecksumCRC32C, ChecksumSHA1, ChecksumSHA256, Expires, GrantFullControl, GrantRead, GrantReadACP, GrantWriteACP, Key, Metadata, ServerSideEncryption, StorageClass, WebsiteRedirectLocation, SSECustomerAlgorithm, SSECustomerKey, SSECustomerKeyMD5, SSEKMSKeyId, SSEKMSEncryptionContext, BucketKeyEnabled, RequestPayer, Tagging, ObjectLockMode, ObjectLockRetainUntilDate, ObjectLockLegalHoldStatus, ExpectedBucketOwner
I am currently on macOS monterey 12.6.1
Here is the code, df_all is a dataframe :
key = 'xetra_daily_report_' + datetime.today().strftime("%Y%m%d_%H%M%S") + '.parquet'
out_buffer = BytesIO()
df_all.to_parquet(out_buffer, index = False)
bucket_target = s3.Bucket('name-bucket')
bucket_target.put_object(Body = out_buffer.getvalue(), key = key)
Following is my bucket policy :
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:DeleteObject",
                "s3:GetObjectAcl",
                "s3:PutObjectAcl"
            ],
            "Resource": [
                "arn:aws:s3:::name-bucket",
                "arn:aws:s3:::name-bucket/*"
            ]
        }
    ]
}
CodePudding user response:
key should be Key (boto3 parameter names are case sensitive, as the validation error's list of allowed parameters shows):

bucket_target.put_object(Body = out_buffer.getvalue(), Key = key)