
I'm trying to get Django to upload static files to S3, but instead I'm getting a 403 Forbidden error, and I'm not sure why.

Full stack trace:

Traceback (most recent call last):
  File "manage.py", line 14, in <module>
    execute_manager(settings)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 438, in execute_manager
    utility.execute()
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/__init__.py", line 379, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 191, in run_from_argv
    self.execute(*args, **options.__dict__)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 220, in execute
    output = self.handle(*args, **options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/core/management/base.py", line 351, in handle
    return self.handle_noargs(**options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 89, in handle_noargs
    self.copy_file(path, prefixed_path, storage, **options)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 184, in copy_file
    if not self.delete_file(path, prefixed_path, source_storage, **options):
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/django/contrib/staticfiles/management/commands/collectstatic.py", line 115, in delete_file
    if self.storage.exists(prefixed_path):
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/storages/backends/s3boto.py", line 209, in exists
    return k.exists()
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/key.py", line 391, in exists
    return bool(self.bucket.lookup(self.name))
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/bucket.py", line 143, in lookup
    return self.get_key(key_name, headers=headers)
  File "/home/levi/Projects/DoneBox/.virtualenv/local/lib/python2.7/site-packages/boto/s3/bucket.py", line 208, in get_key
    response.status, response.reason, '')
boto.exception.S3ResponseError: S3ResponseError: 403 Forbidden

Contents of settings.py:

import os
DIRNAME = os.path.dirname(__file__)
# Django settings for DoneBox project.

DEBUG = True
TEMPLATE_DEBUG = DEBUG

ADMINS = (
    # ('Your Name', 'your_email@example.com'),
)

MANAGERS = ADMINS

DATABASES = {
    'default': {
        'ENGINE': 'django.db.backends.sqlite3', # Add 'postgresql_psycopg2', 'postgresql', 'mysql', 'sqlite3' or 'oracle'.
        'NAME': os.path.join(DIRNAME, "box.sqlite"),                      # Or path to database file if using sqlite3.
        'USER': '',                      # Not used with sqlite3.
        'PASSWORD': '',                  # Not used with sqlite3.
        'HOST': '',                      # Set to empty string for localhost. Not used with sqlite3.
        'PORT': '',                      # Set to empty string for default. Not used with sqlite3.
    }
}

# Local time zone for this installation. Choices can be found here:
# http://en.wikipedia.org/wiki/List_of_tz_zones_by_name
# although not all choices may be available on all operating systems.
# On Unix systems, a value of None will cause Django to use the same
# timezone as the operating system.
# If running in a Windows environment this must be set to the same as your
# system time zone.
TIME_ZONE = 'America/Denver'

# Language code for this installation. All choices can be found here:
# http://www.i18nguy.com/unicode/language-identifiers.html
LANGUAGE_CODE = 'en-us'

SITE_ID = 1

# If you set this to False, Django will make some optimizations so as not
# to load the internationalization machinery.
USE_I18N = True

# If you set this to False, Django will not format dates, numbers and
# calendars according to the current locale
USE_L10N = True

# Absolute filesystem path to the directory that will hold user-uploaded files.
# Example: "/home/media/media.lawrence.com/media/"
MEDIA_ROOT = ''

# URL that handles the media served from MEDIA_ROOT. Make sure to use a
# trailing slash.
# Examples: "http://media.lawrence.com/media/", "http://example.com/media/"
MEDIA_URL = "d1eyn4cjl5vzx0.cloudfront.net"

# Absolute path to the directory static files should be collected to.
# Don't put anything in this directory yourself; store your static files
# in apps' "static/" subdirectories and in STATICFILES_DIRS.
# Example: "/home/media/media.lawrence.com/static/"
STATIC_ROOT = os.path.join(DIRNAME, "static")

# URL prefix for static files.
# Example: "http://media.lawrence.com/static/"
STATIC_URL = "d280kzug7l5rug.cloudfront.net"

# URL prefix for admin static files -- CSS, JavaScript and images.
# Make sure to use a trailing slash.
# Examples: "http://foo.com/static/admin/", "/static/admin/".
ADMIN_MEDIA_PREFIX = '/static/admin/'

# Additional locations of static files
STATICFILES_DIRS = (
    # Put strings here, like "/home/html/static" or "C:/www/django/static".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
    os.path.join(DIRNAME, "main", "static"),
)

# List of finder classes that know how to find static files in
# various locations.
STATICFILES_FINDERS = (
    'django.contrib.staticfiles.finders.FileSystemFinder',
    'django.contrib.staticfiles.finders.AppDirectoriesFinder',
    'django.contrib.staticfiles.finders.DefaultStorageFinder',
)

# Make this unique, and don't share it with anybody.
SECRET_KEY = '<snip>'

# List of callables that know how to import templates from various sources.
TEMPLATE_LOADERS = (
    'django.template.loaders.filesystem.Loader',
    'django.template.loaders.app_directories.Loader',
    'django.template.loaders.eggs.Loader',
)

MIDDLEWARE_CLASSES = (
    'django.middleware.common.CommonMiddleware',
    'django.contrib.sessions.middleware.SessionMiddleware',
    'django.middleware.csrf.CsrfViewMiddleware',
    'django.contrib.auth.middleware.AuthenticationMiddleware',
    'django.contrib.messages.middleware.MessageMiddleware',
)

ROOT_URLCONF = 'DoneBox.urls'

TEMPLATE_DIRS = (
    # Put strings here, like "/home/html/django_templates" or "C:/www/django/templates".
    # Always use forward slashes, even on Windows.
    # Don't forget to use absolute paths, not relative paths.
    os.path.join(DIRNAME, "main", "templates"),
    os.path.join(DIRNAME, "templates"),
    os.path.join(DIRNAME, "basic", "blog", "templates"),
)

INSTALLED_APPS = (
    'django.contrib.auth',
    'django.contrib.contenttypes',
    'django.contrib.sessions',
    'django.contrib.sites',
    'django.contrib.messages',
    'django.contrib.staticfiles',
    'django.contrib.sitemaps',
    # Uncomment the next line to enable the admin:
    'django.contrib.admin',
    # Uncomment the next line to enable admin documentation:
    'storages',
    'django.contrib.admindocs',
    'main',
    'contacts',
    'piston',
    'registration',
#    'contact_form',
    'basic',
    'basic.blog',
)

# A sample logging configuration. The only tangible logging
# performed by this configuration is to send an email to
# the site admins on every HTTP 500 error.
# See http://docs.djangoproject.com/en/dev/topics/logging for
# more details on how to customize your logging configuration.
LOGGING = {
    'version': 1,
    'disable_existing_loggers': False,
    'handlers': {
        'mail_admins': {
            'level': 'ERROR',
            'class': 'django.utils.log.AdminEmailHandler'
        }
    },
    'loggers': {
        'django.request': {
            'handlers': ['mail_admins'],
            'level': 'DEBUG',
            'propagate': True,
        },
        'django.db.backends': {
            'handlers': ['mail_admins'],
            'level': 'DEBUG',
            'propagate': True,
        }
    }
}

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_ACCESS_KEY_ID = '<snip>'
AWS_SECRET_ACCESS_KEY = '<snip>'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
AWS_STORAGE_BUCKET_NAME = "donebox-static"
STATIC_FILES_BUCKET = "donebox-static"
MEDIA_FILES_BUCKET = "donebox-media"
ACCOUNT_ACTIVATION_DAYS = 7

EMAIL_HOST = "email-smtp.us-east-1.amazonaws.com"
EMAIL_HOST_USER = '<snip>'
EMAIL_HOST_PASSWORD = '<snip>'
EMAIL_PORT = 587
EMAIL_USE_TLS = True
TEMPLATE_CONTEXT_PROCESSORS = (
    "django.contrib.auth.context_processors.auth",
     "django.core.context_processors.debug",
     "django.core.context_processors.i18n",
     "django.core.context_processors.media",
     "django.core.context_processors.static",
     "django.contrib.messages.context_processors.messages",
     "DoneBox.main.context_processors_PandC",
     )

Contents of requirements.pip:

django==1.3
django-storages==1.1.4
django-registration==0.8
django-piston==0.2.3
django-tagging==0.3.1
django-extensions==0.8
BeautifulSoup==3.2.1
boto==2.4.1
mysql-python==1.2.3
tweepy==1.9
feedparser==5.1.2
pycrypto==2.6

A Google search for this exception doesn't turn up anything interesting. I suspect I've misconfigured something, although I'm not sure what. Could someone point me in the right direction? Thank you for your time and consideration.

Levi Campbell

9 Answers


I'm using Amazon IAM for the particular key ID and access key and just bumped into the same 403 Forbidden. It turns out you need to grant permissions that target both the bucket root and its objects:

{
  "Statement": [
    {
      "Principal": {
          "AWS": "*"
      },
      "Effect": "Allow",
      "Action": "s3:*",
      "Resource": ["arn:aws:s3:::bucket-name/*", "arn:aws:s3:::bucket-name"]
    }
  ]
}
AKX
  • Is there a way to not give so many permissions? I'm trying to limit to the bare minimum and not finding enough information. Specifically I want to forbid `list` permissions – KVISH May 21 '14 at 09:26
  • I'd spent hours searching for this. Happy. – Andrew E Nov 09 '14 at 07:54
  • so... where to put this? – WeaselFox May 19 '15 at 09:55
  • @WeaselFox Go into the bucket, then click Properties, then click Edit bucket policy. – Prisoner Aug 25 '15 at 14:42
  • Beware! The example above is insecure! Setting such wide permissions (a wildcard `Principal` and `s3:*`) is extremely unsafe: you're allowing anyone to create, update, and even delete objects stored in your S3 bucket. http://docs.aws.amazon.com/AmazonS3/latest/dev/s3-bucket-user-policy-specifying-principal-intro.html – David M. Nov 15 '16 at 15:41
  • @DavidM. This is meant to be used in a user policy, not a bucket policy. – AKX Nov 16 '16 at 13:24

I would recommend that you try to test your AWS credentials separately to verify whether the credentials do actually have permission to read and write data to the S3 bucket. The following should work:

>>> import boto
>>> s3 = boto.connect_s3('<access_key>', '<secret_key>')
>>> bucket = s3.lookup('donebox-static')  # returns None if the bucket can't be found or read
>>> key = bucket.new_key('testkey')
>>> key.set_contents_from_string('This is a test')  # returns the number of bytes written
>>> key.exists()  # should return True
>>> key.delete()

You should try the same test with the other bucket ('donebox-media'). If this works, the permissions are correct and the problem lies in the Django storages code or configuration. If this fails with a 403 then either:

  • The access_key/secret_key strings are incorrect
  • The access_key/secret_key are correct but that account doesn't have the necessary permissions to write to the bucket

I hope that helps. Please report back your findings.

garnaat
  • So good. Made me find out that besides giving permissions on the bucket I also needed to give permission on its contents, buckets/*. – Alper Jun 06 '14 at 17:34
  • AttributeError: 'NoneType' object has no attribute 'new_key' – itsji10dra Jul 22 '16 at 10:08
  • I think that means the `s3.lookup()` call is not finding the bucket you are looking for and is returning a value of None. – garnaat Jul 22 '16 at 12:50
  • @RoNiT you can try s3.get_bucket(' – Adriano Silva Dec 31 '16 at 14:23
  • To save others a little effort, and to define further what "The following should work:" means: set_contents_from_string() returns an integer, the number of bytes written, i.e. if you get a positive integer as output, it worked. key.exists() returns True, but I didn't see a new key added in the AWS S3 console; running key.exists() again after key.delete() seems to verify it existed and was deleted. So it might be good to add a second key.exists() to the end of the procedure above, and to note what set_contents_from_string() will output. – GG2 Jun 08 '20 at 22:31
  • helped me find out that I'm using the wrong aws profile – Yitzchak Jul 27 '21 at 05:22

I had the same problem and finally discovered that the real cause was the SERVER TIME. It was misconfigured, and AWS responds with a 403 Forbidden when the request timestamp is too far off its own clock.

On Debian you can sync the clock via NTP:

ntpdate 0.pool.ntp.org
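If you want to check for skew programmatically first, one option is to compare your local clock against the `Date` header that S3 (or any web server) returns on every response. A minimal Python 3 sketch, not from the original answer; the 15-minute figure is the threshold AWS enforces before returning `RequestTimeTooSkewed`:

```python
import email.utils
from datetime import datetime, timezone

# AWS rejects requests whose timestamp is more than ~15 minutes off
# its own clock (the RequestTimeTooSkewed error).
MAX_SKEW_SECONDS = 15 * 60

def clock_skew_seconds(http_date_header, local_now=None):
    """Return (local clock - server clock) in seconds, given an
    RFC 1123 Date header such as every S3 response carries."""
    server_time = email.utils.parsedate_to_datetime(http_date_header)
    if local_now is None:
        local_now = datetime.now(timezone.utc)
    return (local_now - server_time).total_seconds()
```

To use it against S3, issue any request with `urllib.request`, read `response.headers['Date']`, and compare `abs(clock_skew_seconds(...))` against `MAX_SKEW_SECONDS`.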

Alan Wagner
  • THANK YOU. Exact same thing happened to me. Time was off because I was running inside a VM that I had kept suspending, and hadn't set up ntp – wjin Jun 11 '14 at 13:29
  • Just now saw this after I came back to post the same thing. Followed @garnaat's test tutorial, and when I launched the command `s3.get_all_buckets()` I received a 403 with some XML stating `RequestTimeTooSkewed: The difference between the request time and the current time is too large.` Wish I'd seen this answer earlier. Cheers anyhow! – Gesias Jun 12 '14 at 08:18
  • This is probably not the root problem that most people who land here have, but in my edge case it was, and ntpdate saved my butt. **Thank you!** – palewire Nov 13 '14 at 17:58
  • My exact issue running in a VM. – djangoat May 04 '16 at 17:25
  • That was my problem too! Thanks a lot! – Vladir Parrado Cruz May 19 '17 at 20:04
  • This was indeed the problem with our application. The server it ran on was 1 hour in the future. – Adriaan Tijsseling Sep 07 '17 at 10:41
  • Yeah, I lost some hours before I realized that the server time was wrong. A reboot helped; simply correcting the date will fix the S3 problem. – soField Jul 13 '18 at 08:48

This will also happen if your machine's time settings are incorrect.

altschuler
  • -1 because someone already gave this answer, with a simple and effective solution to the problem, 8 months before you posted this less helpful one. – Doug McLean Apr 18 '18 at 14:39

In case this helps anyone, I had to add the following configuration entry for collectstatic to work and not return 403:

AWS_DEFAULT_ACL = ''
Danra
  • Did work for me. But it is very weird that such a setting could be related to the 403 error. It might be that the combination of access key and access secret did not have permission to set the ACL for uploaded files. I saw that by default boto has `default_acl = setting('AWS_DEFAULT_ACL', 'public-read')` – Azamat Tokhtaev Nov 07 '16 at 07:45

It is also possible that the wrong credentials are being used. To verify:

import boto
s3 = boto.connect_s3('<your access key>', '<your secret key>')
bucket = s3.get_bucket('<your bucket>') # does this work?
s3 = boto.connect_s3()
s3.aws_access_key_id  # is the same key being used by default?

If not, take a look at ~/.boto, ~/.aws/config and ~/.aws/credentials.
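For reference, a minimal boto credentials file looks like the fragment below (placeholder values; the shared `~/.aws/credentials` file uses a `[default]` section with the same two keys):

```ini
# ~/.boto
[Credentials]
aws_access_key_id = <your access key>
aws_secret_access_key = <your secret key>
```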

kat

Here is a refinement with minimal permissions. In all cases, as discussed elsewhere, s3:ListAllMyBuckets is necessary on all buckets.

In its default configuration, django-storages will upload files to S3 with public-read permissions – see the django-storages Amazon S3 backend documentation.

Trial and error revealed that in this default configuration the only two permissions required are s3:PutObject to upload a file in the first place and s3:PutObjectAcl to set the permissions for that object to public.

No additional actions are required because, from that point forward, reads on the object are public anyway.

IAM User Policy - public-read (default):

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:PutObjectAcl"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}

It is not always desirable to have objects publicly readable. You can make uploads private instead by setting the relevant property in the settings file.

Django settings.py:

...
AWS_DEFAULT_ACL = "private"
...

Then s3:PutObjectAcl is no longer required, and the minimal permissions are as follows:

IAM User Policy - private:

{
   "Version": "2012-10-17",
   "Statement": [
       {
           "Effect": "Allow",
           "Action": "s3:ListAllMyBuckets",
           "Resource": "arn:aws:s3:::*"
       },
       {
           "Effect": "Allow",
           "Action": [
               "s3:PutObject",
               "s3:GetObject"
           ],
           "Resource": "arn:aws:s3:::bucketname/*"
       }
   ]
}
freshnewpage

Another solution that avoids custom policies is to use one of AWS's predefined policies:

  • Add S3 full access permissions to your S3 user:

    • Go to IAM / Users / Permissions and click Attach Policy
    • Add the policy "AmazonS3FullAccess"
marcanuy

Maybe you actually don't have access to the bucket you're trying to look up, get, or create.

Remember: bucket names have to be unique across the entire S3 ecosystem, so if you try to access (lookup/get/create) a bucket named 'test', you will have no access to it.
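If you control the bucket name, one common way around the global-uniqueness collision (a sketch, not from this answer) is to append a random suffix so plain names like 'test' are never requested:

```python
import uuid

def unique_bucket_name(prefix):
    """Suffix a short random token, since bucket names are global
    across all of S3 and short generic names are long since taken."""
    # S3 bucket names must be lowercase; uuid hex digits already are.
    return "%s-%s" % (prefix.lower(), uuid.uuid4().hex[:8])
```

For example, `unique_bucket_name('donebox-static')` yields something like `donebox-static-3f9a1c2e`.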

eiTan LaVi