Beginner's guide to securing AWS S3

Nag Medida
8 min read · Sep 6, 2017

There are multiple ways to protect data stored in S3, which also means there are multiple ways your data can get leaked, knowingly or unknowingly. Several S3 leaks have already made the news.

Here are a few tips for beginners on securing S3.

Tip #1: Get the basics right first!

So why are the basics so important? Because the basics never change.

AWS continues to innovate at an astonishing rate, adding roughly a thousand new features in the past year, up from 722 the year before, according to a chart CEO Andy Jassy presented during his re:Invent keynote.

No matter how many new features arrive, the basics remain the same. Get your foundation strong; here is a quick run book with 200+ S3 examples.

If you work in a centralized team, developers like it when you give them pseudo code. Many a time, solving a problem takes a fraction of the time it takes to identify it. Here is my take on the boto3/boto2/AWS CLI documentation:

  • Take responsibility … it is your data.
  • Master your subject.
  • Be passionate about it; you don’t have to sleep and dream about it, though.
  • Responsibility for your data is the key.

One man’s trash is another man’s treasure …

Tip #2: Know thy assets.

To secure your assets, you first have to know what and where they are. If you don’t know what you are trying to protect, you will never know how to protect it.

Security Monkey monitors AWS S3 buckets across multiple AWS accounts and does the following:

  • Acts as source control for your S3 bucket policies, ACLs, and lifecycle rules.
  • Generates an audit report of all current issues (e.g., S3 buckets which are accessible to everyone, shared with unknown AWS accounts, or carrying conditional statements).
  • Creates an e-mail alert when an S3 bucket is added or deleted.
  • AWS S3 resource policies grant fine-grained access control over S3 buckets and objects. All the ACLs and policies are stored in Security Monkey, which triggers alerts when changes are made. This comes in handy when you have sensitive S3 buckets and want to monitor them for changes.
  • Tracks S3 buckets for bucket-level encryption.
  • Tracks the versioning of buckets.
  • Tracks the lifecycle configuration of an S3 bucket. Lifecycle rules let you automatically archive/delete S3 objects based on predefined rule sets.
  • Monitors changes to S3 ACLs and bucket policies since the last check and alerts when buckets become publicly accessible.

https://medium.com/@nagwww/secops-with-security-monkey-2ad26cccd5ec
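Even if you are not running Security Monkey yet, a minimal boto3 sketch like the one below can inventory your buckets and flag the ones whose ACL grants access to everyone. The public-ACL check is illustrative, not exhaustive:

    import boto3

    s3 = boto3.client("s3")

    # The canonical "everyone" group that appears in public ACL grants.
    ALL_USERS = "http://acs.amazonaws.com/groups/global/AllUsers"

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        grants = s3.get_bucket_acl(Bucket=name)["Grants"]
        public = [g["Permission"] for g in grants
                  if g["Grantee"].get("URI") == ALL_USERS]
        if public:
            print("PUBLIC bucket %s grants: %s" % (name, ", ".join(public)))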

Tip #3: Go by the principle of least privilege

Do not grant anyone the ability to make your data public in the first place; then you don’t have to worry about monitoring for data that becomes public.

Look for these across all your AWS accounts and restrict them:

- s3:*
- s3:PutBucketAcl
- s3:PutBucketPolicy
- s3:PutObjectAcl

Limit the above calls to your security team. The last one, s3:PutObjectAcl, is a big NO; I wish AWS would remove this call.
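As a hedged sketch, a deny policy like the one below, attached to a group every non-security user belongs to, keeps these calls out of everyday hands. The group and policy names are placeholders, and it assumes your security team gets its access through a separate path:

    import json
    import boto3

    iam = boto3.client("iam")

    # Inline policy denying the ACL/policy-mutating S3 calls.
    deny_acl_changes = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Deny",
            "Action": [
                "s3:PutBucketAcl",
                "s3:PutBucketPolicy",
                "s3:PutObjectAcl",
            ],
            "Resource": "*",
        }],
    }

    iam.put_group_policy(
        GroupName="developers",            # placeholder group
        PolicyName="deny-s3-acl-changes",  # placeholder name
        PolicyDocument=json.dumps(deny_acl_changes),
    )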

Tip #4: Infrastructure as Code (IaC)

Infrastructure as Code has emerged as a best practice for automating the provisioning of infrastructure services. Treat your configuration as code.

S3 policies: AWS S3 resource policies grant fine-grained access control over S3 buckets and objects.

ACLs: enable you to manage access to buckets and objects. Each bucket and object has an ACL attached to it as a subresource.


There are two types of guards that can be used to protect your data:

  1. IAM Policies
  2. S3 Resource policies

Treat your IAM policies and S3 resource policies as code, and ensure they are peer reviewed before being applied in your production environment.
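One way to put this into practice, as a sketch: keep the policy JSON in source control, peer review it like any other change, and apply it with a small deploy script. The file path and bucket name below are placeholders:

    import json
    import boto3

    s3 = boto3.client("s3")

    # The policy document lives in your repo and has been peer reviewed.
    with open("policies/my-sensitive-bucket.json") as f:
        policy = json.load(f)

    s3.put_bucket_policy(
        Bucket="my-sensitive-bucket",
        Policy=json.dumps(policy),
    )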

Tip #5: Restrict access to your buckets by VPC

A VPC endpoint enables you to create a private connection between your VPC and another AWS service without requiring access over the Internet. To create a VPC endpoint for S3:

  1. Navigate to VPC → Endpoints → Create Endpoints
  2. Select your VPC → select the service as S3. That’s it, you are done.

Create an S3 resource policy on your bucket to accept requests only from your VPC.
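A sketch of such a policy, applied with boto3; the bucket name and endpoint ID are placeholders, and aws:sourceVpce is the condition key that matches a VPC endpoint:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Deny every S3 call that does not arrive through our VPC endpoint.
    vpc_only_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAccessExceptFromVpcEndpoint",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-sensitive-bucket",
                "arn:aws:s3:::my-sensitive-bucket/*",
            ],
            "Condition": {
                "StringNotEquals": {"aws:sourceVpce": "vpce-1a2b3c4d"}
            },
        }],
    }

    s3.put_bucket_policy(
        Bucket="my-sensitive-bucket",
        Policy=json.dumps(vpc_only_policy),
    )

Be careful with a blanket deny like this: it also locks out the console. See the lockout note under Tip #6.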

More info on VPC endpoints

Tip #6: Restrict access to S3 buckets by IAM role

An IAM role created with “s3:*” has the power to create/delete all the S3 buckets and objects in its AWS account. This is because a new S3 bucket has no resource policy of its own, so any user/role in the account whose IAM policy allows it can read/write/delete.

To restrict a bucket to a particular IAM role, here are a few things you can do:

  1. Deny access to all IAM roles, with a NOT condition whitelisting the one IAM role.
  2. Add a condition to whitelist the IAM role you want to grant access to.

Here is an example policy granting access to an S3 bucket to just one particular IAM role and no one else.
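A sketch of the deny-with-condition pattern; the bucket name, role ID, and account ID are placeholders:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Deny every caller whose unique ID does not start with the role's ID.
    # AROAEXAMPLEROLEID is a placeholder; find the real value in the RoleId
    # field of `aws iam get-role --role-name my-role`.
    role_only_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllButOneRole",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-sensitive-bucket",
                "arn:aws:s3:::my-sensitive-bucket/*",
            ],
            "Condition": {
                "StringNotLike": {
                    "aws:userId": [
                        "AROAEXAMPLEROLEID:*",  # sessions assumed from the role
                        "123456789012",         # the account itself, for recovery
                    ]
                }
            },
        }],
    }

    s3.put_bucket_policy(
        Bucket="my-sensitive-bucket",
        Policy=json.dumps(role_only_policy),
    )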

Note: aws:userId is the role ID. When IAM creates a user, group, role, policy, instance profile, or server certificate, it assigns each entity a unique ID that looks like the following example: AIDAJQABLZS4A3QDU576Q

What to do if you are locked out of an S3 bucket …

If you lock yourself out when you put the “DENY” rule in place, you will have to use the AWS root access keys to get back into the bucket, because the bucket owner’s root user always retains the ability to remove a bucket policy, even one with an explicit deny. Typically you don’t want access keys for the root account, but you can create them just long enough to execute the necessary command and then delete them:

    aws s3api delete-bucket-policy --bucket your_bucket_name

Tip #7: Restrict access to your S3 bucket by IP

If there is a scenario where you really want to restrict access to your S3 buckets/objects to an IP or an EIP, here is an S3 resource policy you can set to do it.
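A sketch; the bucket name and addresses are placeholders:

    import json
    import boto3

    s3 = boto3.client("s3")

    # Deny any request that does not come from the whitelisted addresses.
    ip_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyAllButOurIPs",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                "arn:aws:s3:::my-sensitive-bucket",
                "arn:aws:s3:::my-sensitive-bucket/*",
            ],
            "Condition": {
                "NotIpAddress": {
                    "aws:SourceIp": ["203.0.113.0/24", "198.51.100.7/32"]
                }
            },
        }],
    }

    s3.put_bucket_policy(Bucket="my-sensitive-bucket", Policy=json.dumps(ip_policy))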

Tip #8: Encrypt your data in S3.

Easier said than done. I would not expect you to encrypt your log files; however, if you are storing sensitive information, please do encrypt your data. If you are using S3 to store your database snapshots, ensure they are encrypted. Here is a quick example of how you can encrypt your data using KMS.
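A minimal boto3 sketch using server-side encryption with KMS (SSE-KMS); the bucket, object key, and KMS alias are placeholders, and the link below covers the client-side encryption alternative:

    import boto3

    s3 = boto3.client("s3")

    # Upload a database snapshot encrypted at rest with a KMS key.
    with open("db-snapshot.dump", "rb") as f:
        s3.put_object(
            Bucket="my-backup-bucket",          # placeholder
            Key="snapshots/db-snapshot.dump",   # placeholder
            Body=f,
            ServerSideEncryption="aws:kms",
            SSEKMSKeyId="alias/my-backup-key",  # placeholder alias
        )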

http://docs.aws.amazon.com/AmazonS3/latest/dev/UsingClientSideEncryptionUpload.html

Tip #9: Have a standard for creating/updating/deleting S3 buckets and policies.

There are DBAs to administer your databases. Treat S3 the same way: have a team/person responsible for creating/deleting S3 buckets and policies. Limit the scope of exposure of who can add/update/delete S3 policies.

Keep buckets which are public, or which host static websites, in a separate AWS account.

Tip #10: Create actionable alerts

“Alerting is of no use if it is not actionable.” Make sure you alert only on conditions you are willing to act on, or automate the remediation. If you get an alert stating that a bucket was made public, ensure the policy or the ACL is removed by a small script; a sketch follows the list below.

Here are a few alerts which are easy to create in Security Monkey:

  • Monitor for buckets which are made public.
  • Exposure of data in S3 to authenticated users.
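A sketch of such a remediation script. The bucket name is a placeholder, and the approach (resetting the ACL to private and dropping the bucket policy) is one possible response, not the only one:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    def remediate_public_bucket(bucket):
        # Hypothetical auto-remediation for a "bucket went public" alert:
        # reset the ACL to private and drop the bucket policy.
        s3.put_bucket_acl(Bucket=bucket, ACL="private")
        try:
            s3.delete_bucket_policy(Bucket=bucket)
        except ClientError as err:
            print("could not delete bucket policy: %s" % err)

    remediate_public_bucket("my-accidentally-public-bucket")  # placeholder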

Tip #11: Logs, Logs, Logs

Love your logs. Logs are like ice cream: if you leave them out, they melt and are of no use. Consume them at your earliest and enjoy. Yes, use them before they melt.

S3 access logs:

  • Give you a trail of who is accessing your objects.
  • Give you a trail of which actions are being performed (GET/POST/DELETE).

Based on the above, fine-tune your AWS S3 resource policies.
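Access logging is off by default. A minimal boto3 sketch to turn it on; both bucket names are placeholders, and the target bucket must allow the S3 log delivery group to write to it:

    import boto3

    s3 = boto3.client("s3")

    # Send access logs for a data bucket to a dedicated logging bucket.
    s3.put_bucket_logging(
        Bucket="my-sensitive-bucket",
        BucketLoggingStatus={
            "LoggingEnabled": {
                "TargetBucket": "my-logging-bucket",
                "TargetPrefix": "s3-access-logs/my-sensitive-bucket/",
            }
        },
    )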

CloudTrail logs:

  • By default, AWS does not log S3 object-level GET/PUT calls into CloudTrail; you have to enable data events yourself (see the sketch below).
  • CloudTrail makes it easier to ensure compliance with internal policies and regulatory standards.

Based on your CloudTrail logs, fine-tune your AWS IAM policies.
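A sketch of turning on S3 data events for an existing trail with boto3; the trail and bucket names are placeholders:

    import boto3

    cloudtrail = boto3.client("cloudtrail")

    # Enable object-level (data event) logging for one bucket on an
    # existing trail.
    cloudtrail.put_event_selectors(
        TrailName="my-trail",
        EventSelectors=[{
            "ReadWriteType": "All",
            "IncludeManagementEvents": True,
            "DataResources": [{
                "Type": "AWS::S3::Object",
                "Values": ["arn:aws:s3:::my-sensitive-bucket/"],
            }],
        }],
    )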

Tip #12: Think of scenarios where your AWS keys get lost

With security, it is always a question of “if” …

Estimate the damage that could be done to your sensitive S3 buckets in such scenarios; you probably want to have some mitigation strategies:

  • AWS account segregation to limit the exposure,
  • Lock down your S3 buckets by VPC,
  • Lock down your S3 buckets by IP,
  • Encrypt your data, etc.
  • Lastly, have a DR strategy: use another cloud provider (like GCS or Azure) to back up your S3 buckets. Here is an example of how to do it for GCS:
  1. Create an IAM user in AWS (follow Tip #7 to restrict by IP).
  2. Log in to GCS and provide the details as shown below to back up your S3 buckets.

More info : https://cloud.google.com/storage/transfer/

Conclusion:

  • S3 is fun…
  • Security doesn’t have to be a daunting task; it can be fun too…
  • There are a lot of free/open-source tools out there which will help you accomplish this. Here are a few which I know of; feel free to share if you know of others:
  1. Security Monkey takes about 10 minutes to get you up and running with a Docker image.
  2. Amazon Macie

If you need more tips, feel free to reach out or send me a note. Feedback is always welcome …

Thanks to https://www.pexels.com for all the cute cat pics, and to Shashi (https://github.com/smadappa) for the review.
