Services like Amazon’s S3 have made it easier and cheaper than ever to store large quantities of data in the cloud.
Used properly, S3 buckets are a useful tool; however, many companies fail to implement basic security, resulting in catastrophic data breaches.
Amazon Simple Storage Service (S3) provides the ability to store and serve static content from Amazon’s cloud.
S3 can be used to store server backups, company documents, web logs, and publicly visible content such as website images and PDF documents.
Files within S3 are organized into “buckets”: logical containers accessible at a predictable URL, with ACLs that can be applied both to the bucket itself and to individual files and directories.
A bucket is typically considered “public” if any user can list its contents, and “private” if its contents can only be listed or written by certain S3 users: a public bucket will list all of its files and directories to any user that asks.
Checking whether a bucket is public or private is easy.
All buckets have a predictable and publicly accessible URL of the form `https://s3.amazonaws.com/<bucket-name>/`:
To test the openness of the bucket a user can just enter the URL in their web browser:
- a private bucket will respond with “Access Denied”.
- a public bucket will list the first 1,000 objects that have been stored.
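This check can be sketched in a few lines of Python. The bucket name in the usage comment is a placeholder, and the interpretation of the status codes follows S3’s documented behavior (200 with an XML key listing for public buckets, 403 AccessDenied for private ones, 404 for names that don’t exist):

```python
import urllib.request
import urllib.error

def bucket_url(name: str) -> str:
    # Path-style URL; the virtual-hosted style https://<name>.s3.amazonaws.com/ also works
    return f"https://s3.amazonaws.com/{name}/"

def check_bucket(name: str) -> str:
    """Classify a bucket name as 'public', 'private', or 'missing'."""
    try:
        with urllib.request.urlopen(bucket_url(name), timeout=10):
            # HTTP 200: S3 returns an XML listing of up to 1,000 keys
            return "public"
    except urllib.error.HTTPError as err:
        if err.code == 403:
            return "private"   # AccessDenied: the bucket exists but won't list
        if err.code == 404:
            return "missing"   # NoSuchBucket
        raise

# Example (hypothetical bucket name):
# print(check_bucket("example-bucket"))
```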
The security risk from a public bucket is simple: if a bucket has been marked as “public”, it exposes a list of its files to anyone, and if no access controls have been placed on those files, anyone can download them.
How do you find insecure S3 buckets, and how do you check the security of your own?
There are many automated tools; here is my own shortlist.
Trawl Amazon S3 buckets for interesting files:
Each group of files on Amazon S3 has to be contained in a bucket, and each bucket has to have a unique name across the entire system. This means that it is possible to bruteforce names; this script does that and more.
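The idea behind name bruteforcing can be sketched as follows. The prefix and suffix wordlists here are illustrative only (real tools ship far larger lists):

```python
import itertools

# Illustrative wordlists; real tools use much larger ones
PREFIXES = ["", "dev-", "staging-"]
SUFFIXES = ["", "-backup", "-logs", "-assets"]

def candidate_names(keyword: str) -> list:
    """Build bucket-name guesses around a company keyword."""
    return [f"{pre}{keyword}{suf}"
            for pre, suf in itertools.product(PREFIXES, SUFFIXES)]

# Each candidate would then be requested at https://s3.amazonaws.com/<name>/
# to see whether it exists and whether it lists its contents.
names = candidate_names("acme")
```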
A script to find unsecured S3 buckets and dump their contents, developed by Dan Salmon.
The tool has two parts:
- s3finder.py, a script that takes a list of domain names and checks if they’re hosted on Amazon S3
- s3dumper.sh, a script that takes the list of domains with regions produced by s3finder.py and, for each domain, checks whether there are publicly readable buckets and dumps them if so.
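A minimal version of the first step, deciding whether a domain is served from S3, can be sketched with a HEAD request. The `Server: AmazonS3` response header is one common signal (my choice of heuristic here, not necessarily the script’s); real tools also inspect DNS CNAMEs:

```python
import urllib.request
import urllib.error

def looks_like_s3(headers: dict) -> bool:
    # S3 responses identify themselves with a "Server: AmazonS3" header
    return headers.get("Server", "") == "AmazonS3"

def domain_on_s3(domain: str) -> bool:
    """HEAD the domain and check whether the response came from S3."""
    req = urllib.request.Request(f"http://{domain}/", method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return looks_like_s3(dict(resp.headers))
    except urllib.error.HTTPError as err:
        # Even error responses (403/404) carry the Server header
        return looks_like_s3(dict(err.headers))
```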
Tool to check bucket permissions, compatible with Linux, macOS and Windows, Python 2.7 and 3. Developed and maintained by Kromtech.
What it does
- Checks all your buckets for public access
- For every bucket, gives you a report with:
  - an indicator of whether your bucket is public or not
  - the permissions for your bucket if it is public
  - a list of URLs to access your bucket (non-public buckets will return Access Denied) if it is public
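For your own account, this kind of ACL check can be sketched with boto3 (assumes AWS credentials are configured locally; the helper names are mine, not the tool’s). A bucket ACL is public when it grants permissions to the AllUsers group:

```python
# URI that S3 ACLs use for the "everyone" group
ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

def public_permissions(grants: list) -> list:
    """Extract the permissions an ACL grants to everyone (the AllUsers group)."""
    return [g["Permission"] for g in grants
            if g.get("Grantee", {}).get("URI") == ALL_USERS]

def audit_buckets() -> None:
    import boto3  # imported here; requires boto3 and configured AWS credentials
    s3 = boto3.client("s3")
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        perms = public_permissions(acl["Grants"])
        status = f"PUBLIC ({', '.join(perms)})" if perms else "private"
        print(f"{bucket['Name']}: {status}")
```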
A tool similar to a subdomain bruteforcer but is made specifically for S3 buckets, developed by Jordan Potti.
It has some extra features that allow you to grep for delicious files, as well as download interesting files if you’re not afraid of quickly filling up your hard drive.