Cloud Security & Best Practices in AWS

Cloud security best practices in AWS by: Ankit Giri


Page 1: Cloud security best practices in AWS by: Ankit Giri

Cloud Security &

Best Practices in AWS

Page 2: Cloud security best practices in AWS by: Ankit Giri

About Myself:

● Ankit Giri (@aankitgiri)
● Associate Security Consultant | TO THE NEW Digital
● Web and Mobile Application Security Researcher
● Bug Hunter (Hall of Fame: EFF, GM, HTC, Sony, Mobikwik, PagerDuty and some more)
● Blogger, orator and an active contributor to the OWASP and null communities
● The most viewed writer in Web Application Security, Network Security and Penetration Testing on Quora

Page 3: Cloud security best practices in AWS by: Ankit Giri

About Today's Talk:

● Cloud security: some interesting instances of breach
● Best practices to protect an AWS account from unauthorized access and usage
● What to look for, and how to find security loopholes
● Audit scripts
● What should one learn to safeguard a cloud application?

Page 4: Cloud security best practices in AWS by: Ankit Giri

Few instances of breach (Live examples)

Account compromise via leak of AWS keys on GitHub: "I woke up one morning, only to find a few emails and a missed phone call from an ISD number (it was from Amazon AWS) - something about 140 servers running on my AWS account."

Page 5: Cloud security best practices in AWS by: Ankit Giri

Few instances of breach (Live examples)

Account compromise via leak of AWS Keys on GitHub :

What was my mistake in this case?

"I only had keys in some of the scripts I had written for a client. And we had a habit of pushing the code to GitHub. And they were gone within five minutes!"

"As it turns out, through the API calls you can actually spin up EC2 instances, and my key had been spotted by a bot that continually searches GitHub for API keys."

Amazon support staff told me that such bot exploits were increasingly common, with hackers running algorithms that perpetually search GitHub for API keys.

Once it finds one, it spins up the maximum number of EC2 instances to mine Bitcoin for itself.
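Since bots scan public repositories within minutes of a push, a cheap pre-push safeguard is to grep the working copy for anything shaped like an AWS access key ID (they begin with the documented "AKIA" prefix followed by 16 uppercase alphanumerics). A minimal sketch:

```shell
# Scan the current directory tree for strings shaped like AWS access
# key IDs before pushing to a public repository.
grep -rEn 'AKIA[0-9A-Z]{16}' . --exclude-dir=.git && \
  echo "possible AWS key found: do not push" || \
  echo "no key-shaped strings found"
```

Tools such as git-secrets automate the same check as a pre-commit hook; the one-liner above is just the idea in its simplest form.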

Page 6: Cloud security best practices in AWS by: Ankit Giri

The root cause: secret keys are issued by Amazon Web Services when users open an account, and they give applications access to AWS resources.

When opening an account, users are told to “store the keys in a secure location” and are warned that the key needs to remain “confidential in order to protect your account”.

AWS reminds subscribers that "anyone who has your access key has the same level of access to your AWS resources that you do. Consequently, we go to significant lengths to protect your access keys, and in keeping with our shared-responsibility model, you should as well."

However, a search on GitHub reveals thousands of results where code containing AWS secret keys can be found in plain text, which means anyone can access those accounts.

Search it on your own (But after this presentation gets over :) )

Page 7: Cloud security best practices in AWS by: Ankit Giri

Another case where this happened! Dev Factor founder Andrew Hoffman said he had used Figaro to secure his Rails apps, yet ended up publishing his Amazon S3 keys to his GitHub account.

He noticed the blunder and pulled the keys within five minutes, but that was enough for a bot to pounce and spin up instances for Bitcoin mining.

"When I woke up the next morning, I had four emails and a missed phone call from Amazon AWS - something about 140 servers running on my AWS account," Hoffman said.

"I only had S3 keys on my GitHub and they were gone within five minutes!"

"As it turns out, through the S3 API you can actually spin up EC2 instances, and my key had been spotted by a bot that continually searches GitHub for API keys."

Amazon (he said) told him such bot exploits were increasingly common, with hackers running algorithms that perpetually search GitHub for API keys.

Page 8: Cloud security best practices in AWS by: Ankit Giri

Few instances of breach (Live examples)

❖ Several bloggers have admitted getting a shock after receiving large bills for bandwidth usage they didn't initiate. For example, Luke Chadwick was hit with a US$3493 (A$3842) bill in December because of unauthorised activity. To his relief, this was later refunded by AWS.

❖ Earlier this year, AWS contacted Rich Mogull, analyst and CEO of Securosis, after three days of unusual activity on his account had run up US$500 in charges. In a blog post from January, Mogull said he had mistakenly published his AWS secret key on GitHub.

"I did not completely scrub my code before posting to GitHub. I did not have billing alerts enabled ... This was a real mistake ... I paid the price for complacency," he admitted in his blog.

Page 9: Cloud security best practices in AWS by: Ankit Giri

What does AWS think of these breaches?

In a statement, AWS told iTnews it "takes security very seriously and provides many resources, guidelines and mechanisms to help customers configure AWS services and develop applications using security best practices."

"However, developers are responsible for following our guidance and utilising those mechanisms. When we become aware of potentially exposed credentials, we proactively notify the affected customers and provide guidance on how to secure their access keys," the statement said.

Page 10: Cloud security best practices in AWS by: Ankit Giri

What is an SSRF attack?

Server Side Request Forgery (SSRF) is a vulnerability that appears when an attacker has the ability to create requests from the vulnerable server.

Usually, Server Side Request Forgery (SSRF) attacks target internal systems behind the firewall that are normally inaccessible from the outside world (but using SSRF it’s possible to access these systems). With SSRF it’s also possible to access services from the same server that is listening on the loopback interface.

Using Server Side Request Forgery attacks it’s possible to:

➢ Scan and attack systems from the internal network that are not normally accessible

➢ Enumerate and attack services that are running on these hosts

➢ Exploit host-based authentication services
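On AWS specifically, the classic SSRF target is the EC2 instance-metadata service at the link-local address 169.254.169.254: if an attacker can make the server fetch an arbitrary URL, requesting this endpoint can leak the temporary IAM credentials of the instance's role. A sketch (the helper name is ours; it only returns data when run from inside an EC2 instance):

```shell
# SSRF payoff on AWS: the instance-metadata service hands out the
# temporary credentials of the instance's IAM role to any local caller.
# An SSRF-vulnerable app fetching this URL leaks them to the attacker.
fetch_instance_role() {
  curl -s --max-time 2 \
    "http://169.254.169.254/latest/meta-data/iam/security-credentials/"
}
# fetch_instance_role   # lists role names when run on an EC2 instance
```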

Page 11: Cloud security best practices in AWS by: Ankit Giri

SSRF Attack (Live Example)

Recently a client got a mail from an organization complaining about random requests hitting their servers from one of the client's IPs. On investigating the issue, we found that the complaint was correct: there had been an increased amount of Network Out data from our instances, resulting in high billing on the account.

The approach followed was:
● Search for the unrecognised process on the server; in this case a process named "chaos" was found
● Using lsof (http://freecode.com/projects/lsof/), run: lsof -p <pid of that process>
● Check whether the process is hitting some IP on the network
● Look up that IP; in this case it was found to belong to the Hong Kong Trade Center
● Don't just kill the process: it will restart, as copies of it may be present anywhere on the server
● Instead, terminate the instance and launch a new server
● Review the security group
● Close all ports that are open to everyone, except 80
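The process-inspection steps above can be sketched with standard Linux tooling; where lsof is unavailable, /proc exposes the same information. Here we inspect our own shell as a harmless stand-in for the suspicious PID:

```shell
# Triage sketch: list what a process has open and where it connects.
PID=$$                       # in a real incident: PID=$(pgrep chaos)
ls -l "/proc/$PID/fd" | head -n 5     # open file descriptors
head -n 5 "/proc/$PID/net/tcp"        # TCP sockets (addresses in hex)
# With lsof installed, the equivalent single command is:
#   lsof -p "$PID"
```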

Page 12: Cloud security best practices in AWS by: Ankit Giri

A similar issue was reported in “Pocket”

Here is the link:

https://news.ycombinator.com/item?id=10078967

Page 13: Cloud security best practices in AWS by: Ankit Giri

There's a Hole in 1,951 Amazon S3 Buckets

We discovered 12,328 unique buckets, with the following breakdown:

Public: 1,951
Private: 10,377

Approximately 1 in 6 of the 12,328 buckets identified were left open for the perusal of anyone who's interested. These 12,328 buckets were skewed towards those we could identify based on domain name, word list, or use within web sites. From the 1,951 public buckets we gathered a list of over 126 billion files. The sheer number of files made it unrealistic to test the permissions of every single object, so a random sampling was taken instead. All told, we reviewed over 40,000 publicly visible files, many of which contained sensitive data.

Page 14: Cloud security best practices in AWS by: Ankit Giri

❖ Some specific examples of the data found are listed below:
➢ Personal photos from a medium-sized social media service
➢ Sales records and account information for a large car dealership
➢ Affiliate tracking data, click-through rates, and account information for an ad company's clients
➢ Employee personal information and member lists across various spreadsheets
➢ Unprotected database backups containing site data and encrypted passwords
➢ Video game source code and development tools for a mobile gaming firm
➢ PHP source code, including configuration files that contain usernames and passwords
➢ Sales "battlecards" for a large software vendor

Page 15: Cloud security best practices in AWS by: Ankit Giri

Root cause: The root of the problem isn't a security hole in Amazon's storage cloud, according to Vandevanter. Rather, he attributed it to Amazon S3 account holders who have failed to set their buckets to private -- or, to put it more bluntly, organizations that have embraced the cloud without fully understanding it. The fact that all S3 buckets have predictable, publicly accessible URLs doesn't help, though.

What makes it a common vulnerability? Just because a file is listed in a bucket doesn't mean it can be downloaded; buckets and objects have their own access control lists (ACLs). However, even if a user locks down the files within a public bucket, a data thief could still glean potentially sensitive information from the file names alone, such as customer names or the frequency with which applications are backed up.

Contributing to the problem may be the fact that all S3 buckets have a unique, predictable, and publicly accessible URL, which makes it easy to track down buckets and determine which are private and which are public. By default this URL will be either http://s3.amazonaws.com/[bucket_name]/ or http://[bucket_name].s3.amazonaws.com/. If you enter the URL and receive an Access Denied response, the bucket is private. If it's public, you'll be presented with the first 1,000 objects stored therein.
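The probe described above can be wrapped in a small helper (the function and bucket names are ours, for illustration):

```shell
# Classify a bucket from the HTTP status of its predictable URL:
# 200 => public listing, 403 => private (Access Denied), 404 => absent.
check_bucket() {
  code=$(curl -s -o /dev/null -w '%{http_code}' "https://$1.s3.amazonaws.com/")
  case "$code" in
    200) echo "public (listing enabled)" ;;
    403) echo "private (Access Denied)" ;;
    404) echo "no such bucket" ;;
    *)   echo "status $code" ;;
  esac
}
# check_bucket my-company-backups
```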

Page 16: Cloud security best practices in AWS by: Ankit Giri

Publicly accessible S3 Bucket, folders and files :

Page 17: Cloud security best practices in AWS by: Ankit Giri

Publicly accessible S3 Bucket, folders and files :

Hi All,

I know that hackerone-attachments is used for file uploads on reports and so I did a quick scan for similar buckets and found ████████████████. While I can't confirm if you own it or not, it appears that it is publicly writable using the aws cli.

When I tried to write to hackerone-attachments, I get:

"move failed: ./test.txt to s3://hackerone-attachements/test.txt A client error (AccessDenied) occurred when calling the PutObject operation: Access Denied."

However, when I write to ████████████████, I get:

move: ./test.txt to s3://████████████████/test.txt

Hopefully the bucket is yours and this isn't a waste of time. If you do own it, a good thing is that the bucket is not publicly readable and the file appears private by default after being written. However, assuming you own it, the security issue would be someone writing something malicious and someone on your team unknowingly opening it.

Pete
Security Researcher
https://hackerone.com/reports/128088
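The write test from the report above can be reproduced with the AWS CLI (the helper and bucket names are ours; run it only against buckets you own or have permission to test):

```shell
# Try to copy a harmless marker file into a bucket; "Access Denied"
# means writes are blocked, success means the bucket is world-writable.
probe_bucket_write() {
  echo "write-probe $(date -u)" > /tmp/probe.txt
  aws s3 cp /tmp/probe.txt "s3://$1/probe.txt" 2>&1
}
# probe_bucket_write my-own-test-bucket
```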

Page 18: Cloud security best practices in AWS by: Ankit Giri

Publicly accessible S3 Bucket, folders and files :

HackerOne rewarded yaworsk with a $2,500 bounty (Apr 4th):

Thanks again for the great report, @yaworsk.

It's always great to see researchers thinking outside the normal web app confines and considering other parts of technical infrastructure that could affect a company. While you weren't able to read any contents from that bucket (and even if you would, everything is encrypted using AES-256), an attacker could theoretically post a file into that bucket that may at some point be accessed by a HackerOne staff member, thinking it's been uploaded by another staff member or some automated system.

This issue also led us to audit some of our additional S3 buckets (we have quite a few), and we found some similar issues among several of them. As such, we are increasing the bounty to take those additional mitigations into account, even though they weren't discovered by you.

Look forward to seeing what you find next! Happy hacking. :-)

Page 19: Cloud security best practices in AWS by: Ankit Giri

Another Case where this happened!

Recently an Amazon S3 bucket (http://shopify.com.s3.amazonaws.com/) was unintentionally left with directory listing enabled. Even though the files in the bucket were all publicly accessible, the directory listing itself was not intended to be visible.

https://hackerone.com/reports/57505

Page 20: Cloud security best practices in AWS by: Ankit Giri

Bucket Finder script :

How to use and exploit scenario:

http://hackoftheday.securitytube.net/2013/04/finding-publicly-readable-files-in-your.html

https://digi.ninja/projects/bucket_finder.php

Page 21: Cloud security best practices in AWS by: Ankit Giri

I was recently searching for something on Google and came across what might be a logical vulnerability prevailing across multiple web applications. I was searching for publicly accessible Jenkins consoles through Google dorking. My search query listed some websites that had Jenkins as part of their domain name. Although this itself is not a security issue, it reveals that Jenkins is being used as a CI (Continuous Integration) tool and that its console is publicly accessible. This information may be used to plan a chain of attacks.

Can Jenkins lead to a disaster?

Page 22: Cloud security best practices in AWS by: Ankit Giri

What took me by surprise is that I found many Jenkins consoles with no authentication mechanism enabled on them. I could easily open the console and read the usernames, the build history, build logs and much more. This made me think about the necessity of having proper security implemented on a Jenkins console. So, this blog post will cover the devastating effect of a compromised Jenkins server and how to prevent this from happening.

Can Jenkins lead to a disaster?

Page 23: Cloud security best practices in AWS by: Ankit Giri

Treasure of credentials, such as AWS access keys and secret keys.
Impact: Anyone can use the AWS account linked to the access and secret keys to launch resources, leaving the owner billed for them. A person possessing these credentials gets full access to the AWS account.

The server's .pem files, IP addresses, usernames, email addresses, etc.
Impact: Disclosure of the above information allows logging into the server (remember, there may be a hundred servers accessible from this console), running arbitrary commands on it, and getting access to users and their respective passwords (and what not!).

The revealed usernames and email addresses will also enable the attacker to plan much more sophisticated attacks on the organization, as he is now aware of the developer and other accounts that are present. This can also result in brute forcing other applications, such as the Admin or CMS portal of the application.

Why can a compromised Jenkins lead to a disaster?

Page 24: Cloud security best practices in AWS by: Ankit Giri

GitHub SSH key

SSH keys are used to identify trusted computers without involving passwords. Jenkins has a Git plugin available, which enables running Git commands from the Jenkins console itself. So, a publicly accessible Jenkins console will enable an attacker to view, modify and update the code of the production application. In the worst scenario possible, the attacker can delete the "production" branch containing the application's code (e.g. with git push origin --delete production) and thus bring the application down.

S3 BucketThe Jenkins console might have access to private S3 buckets containing content such as images, log files, code backups etc. This access to essential data can be abused to delete important files and folders.

Page 25: Cloud security best practices in AWS by: Ankit Giri

A public cloud is one based on the standard cloud computing model, in which a service provider makes resources, such as applications and storage, available to the general public over the Internet. Public cloud services may be free or offered on a pay-per-usage model.

The main benefits of using a public cloud service are:

# Easy & inexpensive set-up, because hardware, application & bandwidth costs are covered by the provider.

# Scalability to meet needs.

# No wasted resources because you pay for what you use.

AWS - Public Cloud

Page 26: Cloud security best practices in AWS by: Ankit Giri

The term "public cloud" arose to differentiate between the standard model and the private cloud, which is a proprietary network or data center that uses cloud computing technologies, such as virtualization. A private cloud is managed by the organization it serves. A third model, the hybrid cloud, is maintained by both internal and external providers.

Examples of public clouds include Amazon Elastic Compute Cloud (EC2), IBM's Blue Cloud, Sun Cloud, Google App Engine and Windows Azure Services Platform.

AWS - Public Cloud

Page 27: Cloud security best practices in AWS by: Ankit Giri

Amazon VPC

• Amazon Virtual Private Cloud (Amazon VPC) lets you provision a logically isolated section of the Amazon Web Services (AWS) Cloud where you can launch AWS resources in a virtual network that you define. You have complete control over your virtual networking environment, including selection of your own IP address range, creation of subnets, and configuration of route tables and network gateways.

• You can easily customize the network configuration for your Amazon Virtual Private Cloud. For example, you can create a public-facing subnet for your web servers that has access to the Internet, and place your backend systems such as databases or application servers in a private-facing subnet with no Internet access. You can leverage multiple layers of security, including security groups and network access control lists, to help control access to Amazon EC2 instances in each subnet.

• Additionally, you can create a Hardware Virtual Private Network (VPN) connection between your corporate datacenter and your VPC and leverage the AWS cloud as an extension of your corporate datacenter.
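The public/private-subnet layout described above can be sketched with the AWS CLI (the CIDR ranges and helper name are illustrative; the calls need configured credentials, so they are wrapped in a function here):

```shell
# Create a VPC with a public and a private subnet (CIDRs illustrative).
create_two_tier_vpc() {
  vpc_id=$(aws ec2 create-vpc --cidr-block 10.0.0.0/16 \
             --query 'Vpc.VpcId' --output text)
  # Public subnet for web servers:
  aws ec2 create-subnet --vpc-id "$vpc_id" --cidr-block 10.0.1.0/24
  # Private subnet for databases/app servers; with no route to an
  # internet gateway, it stays unreachable from outside:
  aws ec2 create-subnet --vpc-id "$vpc_id" --cidr-block 10.0.2.0/24
  echo "$vpc_id"
}
```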

Page 28: Cloud security best practices in AWS by: Ankit Giri

● Disable root access key and secret key.
● Enable MFA tokens for each IAM user.
● Have a minimum number of IAM users with admin rights.
● Use roles for EC2 instances for access permissions wherever possible.
● Provide least privileges and limit IAM entities' actions with strong/explicit policies.
● Rotate all keys on a regular basis (say, every 60 days).
● Use IAM roles with STS AssumeRole wherever possible.
● Do not allow 0.0.0.0/0 for any EC2/ELB security group.
● Apply proper S3 bucket policies, granting public access only to individual files instead of sharing a bucket/folder.
● Create all resources within a VPC.
● Make sure only required ports are open.

Best practices to protect AWS account from unauthorized access and usage:
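One item above, regular key rotation, can be scripted with the AWS CLI (the user name and key ID are placeholders; the calls need valid credentials, so they are shown inside a helper):

```shell
# Rotate an IAM user's access key: create the new key, switch the
# application over, deactivate the old key, then delete it.
rotate_access_key() {
  user="$1"; old_key="$2"
  aws iam create-access-key --user-name "$user"        # issue new key
  # ...deploy the new key to the application, then:
  aws iam update-access-key --user-name "$user" \
      --access-key-id "$old_key" --status Inactive     # disable old key
  aws iam delete-access-key --user-name "$user" \
      --access-key-id "$old_key"                       # remove it
}
# rotate_access_key deploy-bot AKIAOLDKEYIDEXAMPLE
```

Deactivating before deleting gives a rollback window in case something still depends on the old key.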

Page 29: Cloud security best practices in AWS by: Ankit Giri

● Ensure all back-end servers, i.e. database servers, are accessible via the web servers only.
● Have a separate IAM user for each team member.
● Attach all policies to groups, and include users in a group as per access requirements.
● Avoid making the server_Status page publicly accessible.
● Ensure all SSL certificates use SHA-2 signatures.
● For details about migrating to SHA-2:

Set alarms on billing using Amazon CloudWatch.

This practice can be very useful for detecting DDoS attacks and occurrences of high data transfer.

For steps to set alerts on billing:

http://docs.aws.amazon.com/awsaccountbilling/latest/aboutv2/free-tier-alarms.html

Some more Best practices to protect AWS account
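The billing alarm described above can also be created from the CLI (the threshold, topic ARN and account ID below are placeholders; note that billing metrics are only published in us-east-1):

```shell
# Alarm when estimated monthly charges exceed $50, notifying an SNS
# topic. Wrapped in a function since it needs configured credentials.
create_billing_alarm() {
  aws cloudwatch put-metric-alarm \
    --region us-east-1 \
    --alarm-name billing-over-50-usd \
    --namespace AWS/Billing \
    --metric-name EstimatedCharges \
    --dimensions Name=Currency,Value=USD \
    --statistic Maximum \
    --period 21600 \
    --evaluation-periods 1 \
    --threshold 50 \
    --comparison-operator GreaterThanThreshold \
    --alarm-actions arn:aws:sns:us-east-1:123456789012:billing-alerts
}
```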

Page 30: Cloud security best practices in AWS by: Ankit Giri

● Make sure the bash shell on all servers is ShellShock-proof.

● The following command tests whether Bash is vulnerable to ShellShock:

○ env var='() { ignore this;}; echo vulnerable' bash -c /bin/true

○ A vulnerable version of bash will give the output "vulnerable".

● If the patch has been applied, the same test will return:

○ bash: warning: var: ignoring function definition attempt
  bash: error importing function definition for 'var'

Page 31: Cloud security best practices in AWS by: Ankit Giri

• Using AWS WAF to block malicious requests

Best practices to protect AWS account from unauthorized access and usage:

Page 32: Cloud security best practices in AWS by: Ankit Giri

● There is an audit script that scans our AWS infrastructure for security loopholes
● Description and its need
● Center for Internet Security (CIS) compliance and best practices
● Hardened AMIs provided by CIS:

https://aws.amazon.com/marketplace/search/results/ref=dtl_navgno_search_box?page=1&searchTerms=center+for+internet+security

● AWS Whitepaper

Some in-house solutions:

Page 33: Cloud security best practices in AWS by: Ankit Giri

Best practices to protect AWS account from unauthorized access and usage:

Page 34: Cloud security best practices in AWS by: Ankit Giri

Thank you!