
This is related to Host a static site on AWS S3 - granting read only access and AWS S3 Bucket Permissions - Access Denied, but since those answers were posted, AWS appears to have changed some of the ways static website hosting is set up.

I'm curious whether I have access and permissions set up correctly and haven't left any security holes. I can access the static HTML files in the S3 bucket right now, so public web access works, and I upload files through the AWS web interface rather than via the shell.

The AWS setup is fairly straightforward for a static site, but I want to check: do I have ACLs and permissions set up correctly for a static site?

1) The Properties tab is set up for static website hosting ("Bucket Hosting").

2) On the Permissions tab:

a) Public access is:

[screenshot of the bucket's Block Public Access settings]

b) In the Access Control List, there is access for the bucket owner, but no access for other AWS accounts or the public.

c) The Bucket Policy is flagged "Public" and the JSON is standard:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPerm",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::example.com/*"
        }
    ]
}

d) I have set no CORS configuration.
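As a sanity check, the same configuration can be inspected from the command line (a sketch, assuming the AWS CLI is installed and configured, and that `example.com` is your bucket name as in the policy above):

```shell
# Show the bucket policy (should match the JSON above).
aws s3api get-bucket-policy --bucket example.com

# Show whether S3 evaluates the policy as public (expect "IsPublic": true).
aws s3api get-bucket-policy-status --bucket example.com

# Confirm the static website configuration (index and error documents).
aws s3api get-bucket-website --bucket example.com
```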

BlueDogRanch
1 Answer


Yes, your policy and public access settings are correct for a site that intentionally makes all of its objects publicly accessible (read, but not list/write/delete) via the bucket policy.

Note that as a side effect of these correct settings, S3 will not return 404 Not Found if a nonexistent object is requested. It will instead return 403 Forbidden. There is no real workaround for this, because S3 considers an anonymous requester to be unauthorized to determine why the object can't be downloaded (could be due to non-existence or could be some other reason) unless s3:ListBucket is also granted, and you don't want to do that.
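You can confirm this behavior with a quick request for a key that doesn't exist (a sketch; the bucket name, region, and key in this website-endpoint URL are placeholders to substitute with your own):

```shell
# Request a nonexistent object from the S3 website endpoint and print
# only the HTTP status code. With GetObject public but ListBucket not
# granted, S3 returns 403 (serving your custom error document, if set)
# rather than 404.
curl -s -o /dev/null -w "%{http_code}\n" \
  http://example.com.s3-website-us-east-1.amazonaws.com/no-such-file
```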

Michael - sqlbot
  • Thanks! That makes sense. FYI, I actually get the AWS server default 404 page for a non-existent object; and then I added my own 404.html file, and that was used. – BlueDogRanch Mar 10 '19 at 17:47
  • The custom error document is a generic error file, used for all 4XX errors. [*"You can optionally provide a custom error document that contains a user-friendly error message and additional help. You provide this custom error document as part of adding website configuration to your bucket. Amazon S3 returns your custom error document for only the HTTP 4XX class of error codes."*](https://docs.aws.amazon.com/AmazonS3/latest/dev/CustomErrorDocSupport.html) You *should* find that the actual error code returned is 403 rather than 404. – Michael - sqlbot Mar 10 '19 at 19:30
  • ...If not, then I have a concern that you have, somehow, allowed a permission that should not be present. Go to exactly this url: `http://example-bucket.s3.amazonaws.com` substituting only your bucket name, leaving the rest of the URL as shown here, and verify that you do NOT see an XML listing of the objects. You should see `AccessDenied`. – Michael - sqlbot Mar 10 '19 at 19:31
  • I get an AccessDenied for that URL, with a flash of the 404 error first. So it appears that works as intended. – BlueDogRanch Mar 11 '19 at 14:17