Love your logs like Ice Cream!
In a cloud environment, the only legacy one leaves is the logs.
Remember, logs are like ice cream: if you don’t use them, they melt. Just enabling logging will not get you anywhere; you have to put the logs to good use. And most importantly, do enjoy the ice cream! :)
Once upon a time, a long, long time ago, developing or building a piece of software or hosting a website was a time-consuming and costly affair. We had to spend quite a large amount of money on resources, and it would take a minimum of a month or two before we could celebrate the win.
Then came the cloud, around 2010: create an AMI with the software/application and launch the instance in the development environment, then use CI/CD (with tools like Jenkins or Spinnaker) to deploy the AMI to the test and production environments. There are no system administrators anymore to administer your servers, your friends in networking have already laid the road for you, and there is no need to coordinate with admins or wait for an approval. Deploying your application from start to finish is now a matter of hours, if not days, with far less spent on resources.
How do you keep track of what is going on in your cloud when one developer performs the tasks of an admin, a network engineer, a DBA, a developer, and a support engineer, all in the span of an hour? The answer is simple: the logs, as they contain the history of every action performed.
Unfortunately, your logs are not owned by you; they are owned by AWS, which is one of the biggest pitfalls. So if there is one takeaway, it is this: centralize your logging into the one AWS account where your data warehouse lives. Here is a quick workflow which illustrates the issue:
Here is a detailed write-up on multi-account access issues with S3: https://medium.com/@nagwww/multi-account-access-with-s3-f5693015aeef
A small request to AWS: security is a shared responsibility, and giving ownership of the logs to the customers solves two problems:
- It becomes easy to process S3 objects across multiple AWS accounts (and it indirectly reduces the need to expose S3 buckets to the world).
- Customers get full ownership of their S3 objects.
In short, all the S3 files/objects delivered by AWS during logging (CloudTrail, ELB logs, VPC Flow Logs, etc.) are not owned by customers; they are owned by AWS.
Examples of log objects owned by AWS accounts rather than by the customer
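You can see this for yourself with a few lines of boto3. A minimal sketch (the bucket name is hypothetical) that compares the bucket owner’s canonical ID with the owner of each object delivered into it:

```python
# A minimal sketch: spot log objects in a bucket that your account does
# not own. Assumes boto3 credentials are configured.
import boto3

s3 = boto3.client("s3")
bucket = "my-log-bucket"  # hypothetical bucket receiving CloudTrail/ELB logs

# The canonical user ID that owns the bucket itself.
bucket_owner = s3.get_bucket_acl(Bucket=bucket)["Owner"]["ID"]

for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket):
    for obj in page.get("Contents", []):
        object_owner = s3.get_object_acl(Bucket=bucket, Key=obj["Key"])["Owner"]["ID"]
        if object_owner != bucket_owner:
            # Delivered (and owned) by an AWS-owned account, not by you.
            print(f"{obj['Key']} is owned by someone else: {object_owner}")
```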
Here is a quick overview of the auditing/logs that might be of interest to you (for now just AWS; hoping to cover Azure in part 2):
- CloudTrail
- S3 access logs
- VPC flow logs
- ELB Logs
- IAM Access Analyzer
- AWS Config
- Application logs & system logs (instances are ephemeral; once they are gone, the only legacy they leave is the logs). Examples: syslog, SSH logs, audit logs, etc.
- RDS audit logs to CloudWatch
- Last but not least, democratize the logs. Ensure the logs are at the disposal of all the teams so they can make intelligent decisions; do not limit them to security teams or to when an incident happens.
1. Enable CloudTrail
- Having AWS CloudTrail logs and actively using them to monitor security-related activity within an AWS environment are two distinctly different things.
- Before even going down the route of analyzing CloudTrail logs, ensure that logging is enabled in the first place, that it stays enabled, and that you alert when a trail is disabled or deleted, intentionally or unintentionally.
- I also recommend enabling CloudTrail at the organization level and avoiding enabling it account by account.
How do you monitor the monitor?
In the past I used Security Monkey. AWS Config takes it to the next level by detecting and remediating issues, in this case re-enabling CloudTrail when anyone disables or deletes it.
- AWS Config tracks, stores, audits, and alerts on the state of CloudTrail. More info on the out-of-the-box managed rule from AWS Config: https://docs.aws.amazon.com/config/latest/developerguide/cloudtrail-enabled.html
- The next time your PCI auditor requests info on PCI DSS 10.2.3, you can point them to AWS Config. For a quick script-based check in the meantime, see the sketch below.
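A minimal boto3 sketch of such a check (my illustration, not AWS Config itself) that flags trails that exist but are not actually logging:

```python
# A minimal sketch: alert when CloudTrail is missing or not logging.
# Assumes credentials with cloudtrail:DescribeTrails / GetTrailStatus.
import boto3

cloudtrail = boto3.client("cloudtrail")

trails = cloudtrail.describe_trails()["trailList"]
if not trails:
    print("ALERT: no CloudTrail trail is configured at all")

for trail in trails:
    status = cloudtrail.get_trail_status(Name=trail["TrailARN"])
    if not status["IsLogging"]:
        # Wire this into your alerting of choice (SNS, Slack, PagerDuty).
        print(f"ALERT: trail {trail['Name']} is not logging")
```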
A few things you can do with CloudTrail (CloudTrail is an ocean; we are only touching the tip of the iceberg here):
- Use CloudTrail logs for anomaly detection.
- Debugging simple 403 or 400 errors (see the sketch after this list).
- Auto-Tagging based on CloudTrail logs.
- Don’t reinvent the wheel; look out for vendors who can do this if budget is not a constraint. Please share any good experiences with such products in the comments.
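To illustrate the debugging bullet above, here is a minimal boto3 sketch that pulls the last hour of events for one user (the username is hypothetical) and prints the ones that failed with an authorization error:

```python
# A minimal sketch: surface recent AccessDenied events for one IAM user.
# Assumes cloudtrail:LookupEvents; the username "alice" is hypothetical.
import json
from datetime import datetime, timedelta

import boto3

cloudtrail = boto3.client("cloudtrail")

events = cloudtrail.lookup_events(
    LookupAttributes=[{"AttributeKey": "Username", "AttributeValue": "alice"}],
    StartTime=datetime.utcnow() - timedelta(hours=1),
)["Events"]

for event in events:
    detail = json.loads(event["CloudTrailEvent"])  # full event record as JSON
    if detail.get("errorCode") in ("AccessDenied", "Client.UnauthorizedOperation"):
        print(event["EventName"], detail.get("errorMessage"))
```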
2. S3 access logging
“It is difficult to appreciate the importance of backup and recovery till the DATA is lost.
Similarly, it is difficult to appreciate the importance of compliance till a breach has happened.”
In 2017, company after company made headlines for data breaches related to S3, either by opening a bucket to the world or by exposing AWS keys that had access to the buckets. Some brief stats for 2017:
For the full list and all the details: S3-leaks.
Is it really tough to secure S3?
There are multiple ways to protect data stored in S3, which also means there are multiple ways your data can get leaked/exposed.
How do I know who is using my S3 buckets?
S3 access logs to the rescue.
To enable S3 access logging:
1. Sign in to the AWS Management Console and open the Amazon S3 console at https://console.aws.amazon.com/s3/.
2. Select the bucket and navigate to Properties.
3. Provide the bucket you want the logs shipped to and click “Save”.
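If you would rather script it, here is a minimal boto3 sketch (bucket names are hypothetical, and the target bucket must already permit S3 log delivery to write to it):

```python
# A minimal sketch: enable S3 server access logging on one bucket.
# Assumption: "my-logs-bucket" already grants the S3 log delivery
# service permission to write (via bucket policy or ACL).
import boto3

s3 = boto3.client("s3")

s3.put_bucket_logging(
    Bucket="my-data-bucket",  # hypothetical source bucket
    BucketLoggingStatus={
        "LoggingEnabled": {
            "TargetBucket": "my-logs-bucket",  # hypothetical log bucket
            "TargetPrefix": "s3-access-logs/my-data-bucket/",
        }
    },
)
```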
Recommendation engine for S3 buckets
Now you can look at your S3 access logs over a period of 90 days and lock the bucket down to just the IAM roles that actually use it.
Format of S3 access logs:
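Each record is one space-delimited line; the documented field order starts like this (a condensed view based on the S3 documentation; newer fields such as the host ID and TLS details are appended at the end):

```
bucket_owner bucket [time] remote_ip requester request_id operation key
"request_uri" http_status error_code bytes_sent object_size total_time
turn_around_time "referrer" "user_agent" version_id ...
```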
Processing S3 logs
- Dump to Hive
- Athena: http://aws.mannem.me/?p=1462
- ELK: https://medium.com/@LifeAndDev/using-the-elk-stack-to-analyze-your-s3-logs-981cb5cc5378
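For something quick and local, a few lines of Python can tokenize the records too; a sketch that keeps the bracketed timestamp and quoted strings intact (it does not handle every edge case, and access.log is a hypothetical local file):

```python
# A minimal sketch: tokenize S3 server access log lines.
import re

# Match [timestamps], "quoted strings", or plain space-delimited tokens.
FIELD = re.compile(r'\[[^\]]*\]|"[^"]*"|\S+')

def parse_line(line):
    return FIELD.findall(line)

with open("access.log") as fh:  # hypothetical local copy of the logs
    for line in fh:
        fields = parse_line(line)
        # fields[7] is the object key, fields[9] the HTTP status code
        print(fields[7], fields[9])
```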
Your takeaways
- Lock down your S3 buckets using S3 resource policies (if you are not doing this, you have only yourself to blame).
- Remove the following permissions from all IAM roles and IAM users (one way to enforce this is sketched after the list):
s3:PutBucketAcl
s3:PutBucketPolicy
s3:PutObjectAcl
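The advice above is to strip these permissions from the roles and users themselves; another way to enforce it organization-wide (my suggestion, not from the list above) is a service control policy that denies the actions for everyone but a trusted admin role. A hedged sketch with hypothetical names:

```python
# A minimal sketch: an SCP denying S3 ACL/policy changes, with an
# exception for a hypothetical trusted admin role.
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Deny",
        "Action": ["s3:PutBucketAcl", "s3:PutBucketPolicy", "s3:PutObjectAcl"],
        "Resource": "*",
        "Condition": {
            "StringNotLike": {
                # Hypothetical trusted role allowed to keep these rights.
                "aws:PrincipalArn": "arn:aws:iam::*:role/S3AdminRole"
            }
        },
    }],
}

orgs = boto3.client("organizations")  # run from the management account
orgs.create_policy(
    Name="deny-s3-acl-and-policy-changes",
    Description="Only S3AdminRole may change S3 ACLs and bucket policies",
    Type="SERVICE_CONTROL_POLICY",
    Content=json.dumps(policy),
)
# Attach the returned policy to the target OUs/accounts afterwards.
```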
Everyone out there has S3 buckets open to the world for a wide variety of reasons (from hosting websites to sharing files). Please ensure you enable S3 access logging on the public buckets at a minimum.
3. VPC Flow logs
A flow log record represents a network flow in your VPC. By default, each record captures a network internet protocol (IP) traffic flow (characterized by a 5-tuple on a per network interface basis) that occurs within an aggregation interval, also referred to as a capture window.
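The default record format, with an example adapted from the AWS documentation (an accepted SSH flow from 172.31.16.139 to 172.31.16.21):

```
version account-id interface-id srcaddr dstaddr srcport dstport protocol
packets bytes start end action log-status

2 123456789010 eni-1235b8ca123456789 172.31.16.139 172.31.16.21 20641 22 6
20 4249 1418530010 1418530070 ACCEPT OK
```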
These are expensive, and at the same time they involve a lot of computation to aggregate. I hope AWS solves this problem; I have seen enough companies try to solve it in vain (I would be all ears if anyone out there has solved it). Even if you are not processing the logs, ensure they are captured in case you need them for compliance or for IR incidents.
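Capturing them to S3 takes a single call; a minimal boto3 sketch (the VPC ID and bucket ARN are hypothetical placeholders):

```python
# A minimal sketch: ship VPC Flow Logs for one VPC to an S3 bucket.
# Assumes ec2:CreateFlowLogs; IDs and ARNs are hypothetical.
import boto3

ec2 = boto3.client("ec2")

ec2.create_flow_logs(
    ResourceIds=["vpc-0123456789abcdef0"],  # hypothetical VPC
    ResourceType="VPC",
    TrafficType="ALL",                      # ACCEPT, REJECT, or ALL
    LogDestinationType="s3",
    LogDestination="arn:aws:s3:::my-flow-logs-bucket",  # hypothetical
)
```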
- Just because the term “Security” is present in “security group” does not mean it takes care of the security of your AWS environment; remember, it is just a firewall dictating who can and cannot communicate with an ENI.
- Monitor your security groups for wide-open access and go with the principle of least privilege.
- Security groups can be attached to anything that has an ENI in the AWS world, not just EC2 instances. To name a few:
- Security groups can be attached to ELB
- Security groups can be attached to RDS instances
- Security groups can be attached to CloudHSMs, etc.
Locking down access based on usage, or removing access based on a model, can be accomplished with VPC Flow Logs. The only problem with aggregating them and producing recommendations from the data is its sheer volume. Hoping one day AWS provides a service that does for VPC Flow Logs what Access Advisor does for IAM.
4. ELB Logs
ELB access logs are disabled by default, so if you want to collect them they have to be enabled per ELB. I hope one day there will be a feature to enable them at the account level, if not the organization level. Once enabled, ELB access logs are captured and stored in the specified S3 bucket in compressed format.
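For an application or network load balancer this is a single attribute change; a minimal boto3 sketch (the ARN and bucket name are hypothetical, and the bucket policy must already allow ELB log delivery):

```python
# A minimal sketch: enable access logs on an ALB/NLB via attributes.
# Assumes the S3 bucket policy already allows ELB log delivery.
import boto3

elbv2 = boto3.client("elbv2")

elbv2.modify_load_balancer_attributes(
    # Hypothetical load balancer ARN.
    LoadBalancerArn="arn:aws:elasticloadbalancing:us-east-1:111122223333:loadbalancer/app/my-alb/abc123",
    Attributes=[
        {"Key": "access_logs.s3.enabled", "Value": "true"},
        {"Key": "access_logs.s3.bucket", "Value": "my-elb-logs"},  # hypothetical
        {"Key": "access_logs.s3.prefix", "Value": "prod/my-alb"},
    ],
)
```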
- ELB access logs contain useful information like client IPs, time of the request, request path, SSL cipher info, protocol, and much more.
- They are useful for determining traffic patterns and for debugging and troubleshooting issues.
- They also give you an opportunity to model an ELB’s usage and figure out whether it should be external or internal based on the traffic patterns.
5. IAM Access Analyzer
Feel free to check out the IAM Access Analyzer beginners guide. It hardly takes 30 minutes to set up and gives a bird’s-eye view of all the gaps in your organization.
Well, knowing the gaps is not sexy; remediating and fixing them is the key.
IAM Access Analyzer is a free service launched by AWS to detect anomalies in AWS resource policies, and it can be enabled at the organization level. Yes, it is literally the click of a button; thanks to AWS for making it super easy. Access Analyzer is regional, so you will have to enable it in every region, which can be automated with CloudFormation if you are looking to enable it everywhere.
I recommend enabling Access Analyzer from a separate AWS account instead of your AWS organization management account; you can add a member account in the organization as the delegated administrator to manage Access Analyzer for your organization. Here is more info on how to set this up. Whether you are dealing with ten or a hundred-plus AWS accounts, it doesn’t matter: done at the organization level, it analyzes all the AWS resource policies within the AWS organization.
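Since it is per-region, a small loop does the trick; a minimal boto3 sketch (the analyzer name is hypothetical, and organization-level analyzers must be created from the management or delegated-administrator account):

```python
# A minimal sketch: create an organization-level analyzer in every region.
import boto3

session = boto3.session.Session()

for region in session.get_available_regions("accessanalyzer"):
    # Regions not enabled for your account will raise an error;
    # add error handling as needed.
    client = session.client("accessanalyzer", region_name=region)
    client.create_analyzer(
        analyzerName="org-analyzer",  # hypothetical name
        type="ORGANIZATION",
    )
    print(f"enabled IAM Access Analyzer in {region}")
```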
A, B, C’s of IAM access analyzer
Let’s see the power of IAM Access Analyzer before delving into it. With the click of a button, you can detect anomalies in AWS resource policies. Here are a few:
- Lists SQS queues which are open to the world.
- Lists SQS queues which are accessible to AWS accounts outside your AWS organization.
- Lists Lambda functions which are open to the world.
- Lists Lambda functions which are accessible to AWS accounts outside your AWS organization.
- Lists S3 bucket policies which are open to the world.
- Lists S3 buckets which are accessible to AWS accounts outside your AWS organization.
- Lists all the IAM roles with trust policies that can be assumed from a role or account that does not belong to you.
- Lastly, it’s free (yay!). You know the best things in the world come free.
Findings carry a status; two worth knowing:
- Resolved: if the resource is no longer shared outside of the zone of trust, the status of the finding changes to Resolved. Resolved findings are deleted 90 days after the last update to the finding status.
- INTERNAL_ERROR: this is the status when there is an explicit DENY, for example an S3 policy that denies all and allows only a specific IAM role or AWS account.
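To pull what the analyzer found, here is a minimal boto3 sketch that lists only the active findings (the analyzer ARN is a hypothetical placeholder):

```python
# A minimal sketch: list the ACTIVE findings from one analyzer.
import boto3

aa = boto3.client("accessanalyzer")
analyzer_arn = (
    "arn:aws:access-analyzer:us-east-1:111122223333:analyzer/org-analyzer"
)  # hypothetical ARN

for page in aa.get_paginator("list_findings").paginate(
    analyzerArn=analyzer_arn,
    filter={"status": {"eq": ["ACTIVE"]}},
):
    for finding in page["findings"]:
        print(finding["resourceType"], finding.get("resource"), finding["status"])
```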
Request to AWS
Please ensure the logs (CloudTrail, ELB logs, VPC Flow Logs, AWS Config, etc.) are owned by the customers and not by AWS accounts which belong to AWS.
It is not just me; a lot of companies out there have issues with this. As recommended by your own documentation (https://aws.amazon.com/premiumsupport/knowledge-center/cross-account-access-s3/), it would be great if you could assume a role in the customer account and write the objects that way (yes, I agree this is one more step and would have scale issues).
Tl;dr: The art of securing the cloud
- You’ve got to love logging and monitoring. If not, hire someone who is passionate about it.
Years from now, I hope to see job roles for logging like “Enterprise Logging Architect”, “Senior Logging Engineer”, or “Junior DevOps Logger”.
- It is about responsibility and not technology. Technology keeps evolving, responsibility keeps one thriving.
- In a war you are either attacking or defending. If you are on a security team, you are always at war: you are always defending, and if you have a good red team, you are also attacking.
- Last but not least, be a good human. After all, we come to work and spend 8 to 10 hours a day, which is more than the time we spend with our families. Your colleagues are your second family. I have made some great friends at work and will always be thankful to them for what I am today.
- Speaking of ice cream: if you are in the South Bay (Santa Clara), I recommend Nirvana, one of my favorite ice cream places.
- The pandemic changed me, as it did everyone, and I have made it a habit to read one book and recommend it, one book for every blog I write. “The Monk Who Sold His Ferrari” is a great book for all of us who work long hours, deeply immersed in our own world of technology.
The Monk Who Sold His Ferrari