I created an S3 bucket with the policy below, set "Block all public access" to "off", and can currently download the CSV files stored in the bucket into my Jupyter Notebook. That works great, but I'm not able to upload to the bucket. What makes things more challenging is that I'm using a federated account, so I don't know my security credentials.
I have no way of determining the access key ID and secret access key that I'd need in order to upload. Thanks for any help on how to upload to the bucket!
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::<my-bucket-name>/*"
        }
    ]
}
Here is the sample code I'm using to upload to the S3 bucket.
import boto3

# Create a session with Boto3.
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>'
)

# Create an S3 resource from the session.
s3 = session.resource('s3')

# Upload a small text object to the bucket.
txt_data = b'This is the content of the file uploaded from python boto3 asdfasdf'
obj = s3.Object('<bucket_name>', 'file_uploaded_by_boto3.txt')
result = obj.put(Body=txt_data)
CodePudding user response:
You need to add "s3:PutObject" to the bucket policy so that you can upload objects.
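As a sketch, the updated policy could look roughly like the following (with <my-bucket-name> still a placeholder). Note that keeping Principal set to "*" on s3:PutObject makes the bucket publicly writable, so in practice you would likely restrict that statement to your federated role's ARN instead.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": [
                "s3:GetObject",
                "s3:PutObject"
            ],
            "Resource": "arn:aws:s3:::<my-bucket-name>/*"
        }
    ]
}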
To get credentials as a federated user, you can go to the AWS console sign-in (access) portal, choose "Command line or programmatic access", and copy the aws_access_key_id, aws_secret_access_key, and aws_session_token shown there. These are temporary credentials, so they will need to be refreshed when they expire.
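With those values, the upload code from the question only needs the extra aws_session_token argument. A minimal sketch, with placeholder credentials and bucket name:

import boto3

# Temporary credentials copied from "Command line or programmatic access".
session = boto3.Session(
    aws_access_key_id='<your_access_key_id>',
    aws_secret_access_key='<your_secret_access_key>',
    aws_session_token='<your_session_token>'
)

s3 = session.resource('s3')

# Same upload as in the question, now authenticated with the temporary credentials.
obj = s3.Object('<bucket_name>', 'file_uploaded_by_boto3.txt')
result = obj.put(Body=b'This is the content of the file uploaded from python boto3')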