I have a static website consisting of multiple HTML, CSS, and media files uploaded to a private S3 bucket. I would like to access the website using my web browser.
I'm currently using boto3 to generate a presigned URL for a single HTML file. How do I access the entire application without making the bucket public?
import boto3

s3client = boto3.client('s3')
url = s3client.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'my-bucket',
        'Key': 'my-website/index.html',
    },
)  # url shows the HTML content, but no CSS styling or media files load
CodePudding user response:
To access your website securely from a web browser, you will need an SSL certificate. SSL certificates cannot be attached directly to an S3 static website; instead, create a CloudFront distribution that uses the S3 bucket as its origin and attach the SSL certificate to the distribution. An origin access identity (OAI) lets CloudFront read the objects while the bucket itself stays private. Then create a Route 53 record for a custom domain that points to your CloudFront distribution. After these steps you can access your application securely via the browser.
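A minimal boto3 sketch of that setup might look like the following. The bucket name, domain, ACM certificate ARN (which must be issued in us-east-1 for use with CloudFront), and origin access identity ID are all placeholders for your own values, and the distribution config is trimmed to the essentials:

import uuid

import boto3

cloudfront = boto3.client('cloudfront')

# Placeholders: substitute your own bucket, domain, certificate, and OAI
BUCKET = 'my-bucket'
DOMAIN = 'www.example.com'
CERT_ARN = 'arn:aws:acm:us-east-1:123456789012:certificate/example'
OAI = 'origin-access-identity/cloudfront/EXAMPLE'

response = cloudfront.create_distribution(DistributionConfig={
    'CallerReference': str(uuid.uuid4()),  # any unique string
    'Comment': 'private static website',
    'Enabled': True,
    'DefaultRootObject': 'index.html',     # object served for requests to "/"
    'Aliases': {'Quantity': 1, 'Items': [DOMAIN]},
    'Origins': {'Quantity': 1, 'Items': [{
        'Id': 's3-origin',
        'DomainName': f'{BUCKET}.s3.amazonaws.com',
        # the OAI lets CloudFront fetch objects while the bucket stays private
        'S3OriginConfig': {'OriginAccessIdentity': OAI},
    }]},
    'DefaultCacheBehavior': {
        'TargetOriginId': 's3-origin',
        'ViewerProtocolPolicy': 'redirect-to-https',
        'ForwardedValues': {'QueryString': False, 'Cookies': {'Forward': 'none'}},
        'MinTTL': 0,
    },
    'ViewerCertificate': {
        'ACMCertificateArn': CERT_ARN,
        'SSLSupportMethod': 'sni-only',
        'MinimumProtocolVersion': 'TLSv1.2_2021',
    },
})

# point the Route 53 alias record for DOMAIN at this CloudFront domain name
print(response['Distribution']['DomainName'])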
To restrict incoming traffic with certain rule sets, you can attach AWS WAF (Web Application Firewall) to your CloudFront distribution and deny all IPs/networks other than your own.
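A sketch of that WAF setup with the boto3 wafv2 client could look like this; CLOUDFRONT-scoped WAF resources must be created in us-east-1, and the names and CIDR range below are placeholders:

import boto3

# CLOUDFRONT-scoped WAF resources must be created in us-east-1
wafv2 = boto3.client('wafv2', region_name='us-east-1')

# IP set holding the only networks allowed through (placeholder CIDR)
ip_set = wafv2.create_ip_set(
    Name='allowed-ips',
    Scope='CLOUDFRONT',
    IPAddressVersion='IPV4',
    Addresses=['203.0.113.0/24'],
)

# web ACL that blocks everything by default and allows only the IP set
acl = wafv2.create_web_acl(
    Name='my-site-acl',
    Scope='CLOUDFRONT',
    DefaultAction={'Block': {}},  # deny all traffic...
    Rules=[{
        'Name': 'allow-my-ips',   # ...except the listed networks
        'Priority': 0,
        'Statement': {'IPSetReferenceStatement': {'ARN': ip_set['Summary']['ARN']}},
        'Action': {'Allow': {}},
        'VisibilityConfig': {
            'SampledRequestsEnabled': True,
            'CloudWatchMetricsEnabled': True,
            'MetricName': 'allow-my-ips',
        },
    }],
    VisibilityConfig={
        'SampledRequestsEnabled': True,
        'CloudWatchMetricsEnabled': True,
        'MetricName': 'my-site-acl',
    },
)

# pass acl['Summary']['ARN'] as WebACLId in the CloudFront distribution config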
References:
https://aws.amazon.com/premiumsupport/knowledge-center/cloudfront-serve-static-website/
https://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/distribution-web-awswaf.html
CodePudding user response:
One possibility is to zip the website, download it to your local machine with a presigned URL, and open it in the browser:
import shutil
import webbrowser
from pathlib import Path

import boto3
import requests

# generate a presigned URL for the zipped site
s3client = boto3.client('s3')
url = s3client.generate_presigned_url(
    ClientMethod='get_object',
    Params={
        'Bucket': 'my-bucket',
        'Key': 'my-website.zip',
    },
)

# download and unzip the site, then open it in the default browser
req = requests.get(url)
req.raise_for_status()
with open('website.zip', 'wb') as file:
    file.write(req.content)
shutil.unpack_archive('website.zip', extract_dir='website')
webbrowser.open(Path('website/index.html').resolve().as_uri())
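Note that this only renders correctly if the site references its CSS and media files by relative path; anything referenced by an absolute URL back to the bucket will still fail to load without its own presigned URL.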