Commit f6827a9a by Kevin Falcone

Change to validate=False, which emulates django-storages

On many buckets, we use s3://bucket/path prefixes to separate
environments, because AWS used to have a very low limit on the number
of buckets an account could have and we wanted to share buckets across
environments.

If you combine this with an IAM policy that only allows you access
to the s3://bucket/path that you "own", then get_bucket fails.
http://boto.cloudhackers.com/en/stable/ref/s3.html#boto.s3.connection.S3Connection.get_bucket
The HEAD request it issues appears to be equivalent to get_all_keys,
which requires ListObjects on the entire bucket with no path conditions.
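
For illustration, a minimal boto sketch of the failure mode and the
workaround (the bucket name, key path, and credentials here are made
up):

    import boto

    conn = boto.connect_s3('ACCESS_KEY', 'SECRET_KEY')

    # validate=True (the default) sends a HEAD request for the bucket,
    # which needs ListObjects on the whole bucket; with a path-scoped
    # IAM policy boto raises S3ResponseError: 403 Forbidden here.
    # bucket = conn.get_bucket('shared-bucket')

    # validate=False builds the Bucket object without any request; AWS
    # is not contacted until we touch a key under the path we own.
    bucket = conn.get_bucket('shared-bucket', validate=False)
    key = bucket.new_key('our-env/example.txt')
    key.set_contents_from_string('hello')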

django-storages (which is what we write new S3 access in) actually
always passes validate=False unless you're allowing it to create buckets
for you (which we never do).
https://github.com/jschneier/django-storages/blob/1.4.1/storages/backends/s3boto.py#L320-L334
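
The pattern it uses boils down to something like this sketch (the
get_or_create_bucket helper and auto_create flag are our illustrative
names, not django-storages' actual API; see the link above for their
real implementation):

    from boto.exception import S3ResponseError

    def get_or_create_bucket(conn, name, auto_create=False):
        # Only validate (i.e. send the HEAD request) when we might have
        # to create the bucket ourselves; otherwise trust it exists.
        try:
            return conn.get_bucket(name, validate=auto_create)
        except S3ResponseError:
            if auto_create:
                return conn.create_bucket(name)
            raise
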
parent 1d8d44ee
@@ -431,7 +431,11 @@ def storage_service_bucket():
         settings.AWS_ACCESS_KEY_ID,
         settings.AWS_SECRET_ACCESS_KEY
     )
-    return conn.get_bucket(settings.VIDEO_UPLOAD_PIPELINE["BUCKET"])
+    # We don't need to validate our bucket, it requires a very permissive IAM permission
+    # set since behind the scenes it fires a HEAD request that is equivalent to get_all_keys()
+    # meaning it would need ListObjects on the whole bucket, not just the path used in each
+    # environment (since we share a single bucket for multiple deployments in some configurations)
+    return conn.get_bucket(settings.VIDEO_UPLOAD_PIPELINE["BUCKET"], validate=False)
 
 
 def storage_service_key(bucket, file_name):