Securing Enterprise Data in AWS

Posted by: Brent Martin in Cloud Computing

Sure, Amazon Web Services is great for geeks and small companies with nothing to lose, but what if your company is in the Fortune 1000? Can you really trust Amazon (or any cloud provider, for that matter) with your employee, vendor, and customer records and all of the associated confidential data like credit card numbers and tax IDs?

That’s something I’ve been struggling with lately. We have successfully used AWS to spin up demo environments of PeopleSoft Enterprise, Hyperion, and GoldenGate. We used demo data rather than any customer-specific data, which has been fine so far. And I must say that AWS is an incredible tool for getting the software up and running and building a sandbox for evaluation purposes. But the next logical step would be to use AWS to host conference room pilot and prototyping environments to support our initial requirements gathering efforts. Beyond that we’ll want to build Dev and Test environments. And all of these environments will need real data from existing systems.

We can certainly procure the hardware and host these environments in house, but I’m thinking about AWS because we could postpone our hardware purchases until closer to the end of the requirements gathering phase when we can be more precise about our needs. So I did some research and here are some of the things I’ve learned.

 

Amazon Web Services wants your Enterprise Business

 

For large enterprise customers, Amazon Web Services can assign you an enterprise account manager (a real person, not a piece of software) who can be the liaison between your company and Amazon’s sales team, solution architects, business architects, and so on. The account managers can organize calls with the appropriate people to answer your questions and discuss topics at length with you, although some discussions require signing a non-disclosure agreement (NDA) first.

But better yet, enterprise customers can negotiate their own special contracts with Amazon. So if your CIO is looking for certain pricing targets, SLAs, and so on, Amazon will be glad to negotiate with you and put it in writing.

 

Other companies are doing it

Amazon Web Services does have reference customers, but you’ll have to talk to your account manager and get an NDA in place before you can talk to them.

It is possible to meet some compliance requirements on the AWS platform

There is a whitepaper on achieving HIPAA compliance on the AWS platform. And you can achieve PCI DSS compliance at levels 2-4 for applications running on AWS. PCI DSS Level 1 is not yet possible because it requires the customer’s auditor to physically audit the data center. Keep in mind that compliance is a shared responsibility between your enterprise and Amazon, and Amazon will do what it takes to meet it on their side. That is, they’ll do everything short of giving every company’s auditor access to their data centers.

Amazon Web Services Physical Security is independently audited

If you’ve read the AWS security white paper, you know that AWS claims their physical security is formidable. Amazon backs up those claims with a SAS70 Type II audit report. Amazon has publicly stated that it is an unqualified report, meaning the auditors noted no deficiencies. If you’d like to review the full report, you can request it by talking to your enterprise account manager and signing Amazon’s NDA.

 

S3 Storage may be too reliable

Amazon provides S3 storage for general off-site storage and backup purposes. Access security on S3 is pretty good: it lets you set ACLs on each file, so you can control access at a very granular level. But physical security on S3 is even better. Files backed up to S3 are pretty much guaranteed to be there, even if Amazon loses two data centers; S3 carries a durability guarantee of eleven nines (99.999999999%). That may be more than you need for your typical daily database export. That’s why Amazon recently released its Reduced Redundancy Storage (RRS) at a more modest price point, with a durability guarantee of “only” four nines (99.99%). Even with RRS, Amazon could lose one data center and your backups would still be available.
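To make that concrete, here’s a rough sketch of pushing a nightly database export to S3 with reduced redundancy, using the boto3 Python SDK (a newer SDK than what existed when this was written; the bucket and file names are hypothetical):

    import boto3

    s3 = boto3.client("s3")

    # Upload a nightly database export. REDUCED_REDUNDANCY trades the
    # eleven-nines durability of standard S3 for a lower price point.
    with open("nightly_export.dmp", "rb") as f:
        s3.put_object(
            Bucket="example-corp-backups",          # hypothetical bucket
            Key="peoplesoft/nightly_export.dmp",
            Body=f,
            StorageClass="REDUCED_REDUNDANCY",
            ACL="private",                          # owner-only access
        )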

 

The cloud has unique security requirements

There are a lot of different considerations when you’re thinking about securing your corporate data in the cloud. Case in point: the AWS Console. The AWS Console is what you use to provision new AWS instances and disk volumes, back them up, build firewall rules, remove backups, and so on. At the time of writing, the console can be accessed from anywhere, and once you have access you have full access, because there is no concept of role-based security. You can certainly use multi-factor authentication through Gemalto to help lock down the console itself, but the underlying APIs that do all of the work (and can also be invoked from anywhere) use different security credentials, which bypass the multi-factor authentication. This is my biggest security concern. Rumor has it that Amazon has a project in the works to address at least some of these issues. In the meantime, I’m going to look at Gemalto for multi-factor authentication and come up with a process to regenerate the security credentials the APIs use on a regular basis.
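For what it’s worth, AWS’s later IAM service made that kind of credential rotation scriptable. A rough sketch with boto3, assuming a dedicated IAM user per application (the user name here is hypothetical):

    import boto3

    iam = boto3.client("iam")
    USER = "peoplesoft-batch"  # hypothetical IAM user holding the API credentials

    # Create the fresh access key first, so the application can be
    # switched over before the old key disappears.
    new_key = iam.create_access_key(UserName=USER)["AccessKey"]
    print("new key id:", new_key["AccessKeyId"])

    # ... deploy new_key["AccessKeyId"] / new_key["SecretAccessKey"]
    # to the application's configuration, then retire the old keys ...

    for key in iam.list_access_keys(UserName=USER)["AccessKeyMetadata"]:
        if key["AccessKeyId"] != new_key["AccessKeyId"]:
            iam.delete_access_key(UserName=USER,
                                  AccessKeyId=key["AccessKeyId"])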

Another security feature that’s unique to AWS instances is how Amazon implements network security. Packets are delivered only to their destination address; no other device ever sees them. While that eliminates many traditional threat vectors, it also makes network-based intrusion detection and prevention (NIDS/NIPS) basically useless in AWS, since there is no promiscuous traffic to sniff. So for intrusion detection in AWS you must look to host-based approaches.
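In practice that means deploying a host-based tool like OSSEC or Tripwire on each instance. The toy sketch below just illustrates the core idea behind host-based file-integrity monitoring: hash the files you care about and alert when they change. Paths and the baseline location are illustrative, not any particular tool’s layout:

    import hashlib, json, os

    WATCHED = ["/etc/passwd", "/etc/ssh/sshd_config"]  # illustrative paths
    BASELINE = "/var/lib/fim/baseline.json"            # illustrative location

    def digest(path):
        # SHA-256 of a file's contents, read in chunks.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(8192), b""):
                h.update(chunk)
        return h.hexdigest()

    current = {p: digest(p) for p in WATCHED if os.path.exists(p)}

    if os.path.exists(BASELINE):
        with open(BASELINE) as f:
            baseline = json.load(f)
        for path, h in current.items():
            if baseline.get(path) != h:
                print("ALERT: %s changed since last baseline" % path)
    else:
        with open(BASELINE, "w") as f:
            json.dump(current, f)  # first run: record the baseline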

When you’re working on internal environments in your corporate data center, you don’t have to worry so much about data in flight between clients and servers. But with AWS everything goes across the public internet, so you need to encrypt everything. SSL for browser traffic, SSH, SFTP, and certificate-based RDP have to be the new normal.
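As one example of that “new normal,” here’s a sketch of pushing a data extract to an AWS-hosted instance over SFTP using the Python paramiko library, so nothing crosses the internet in the clear (host name, user, and key path are hypothetical):

    import paramiko

    # Load the private key and connect, refusing unknown host keys so a
    # man-in-the-middle can't impersonate the server.
    key = paramiko.RSAKey.from_private_key_file("/home/etl/.ssh/id_rsa")
    ssh = paramiko.SSHClient()
    ssh.load_system_host_keys()
    ssh.set_missing_host_key_policy(paramiko.RejectPolicy())
    ssh.connect("ec2-203-0-113-10.compute-1.amazonaws.com",
                username="etl", pkey=key)

    # Upload the extract over the encrypted channel.
    sftp = ssh.open_sftp()
    sftp.put("customer_extract.csv", "/data/inbound/customer_extract.csv")
    sftp.close()
    ssh.close()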

One of AWS’s new offerings is the Virtual Private Cloud (VPC), which lets you set your servers up in their own subnet on AWS and connect that subnet back to your corporate network over an encrypted IPsec VPN tunnel. This makes the servers appear as though they are part of your corporate network, encrypts all traffic between AWS and your data center, and keeps internet traffic away from your servers. So it works great for enterprise apps (like ERP, EDW, and BI tools) that are typically accessed from inside the firewall. The only downside is that ALL traffic to and from the servers must flow from AWS back to your data center and then on to its final destination (even if that final destination is another server on AWS). So if you want to back up a database to S3, it probably won’t be too speedy, and you might push a lot of traffic through your internet gateway.
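VPC is provisioned through API calls. With today’s boto3 SDK the basic plumbing looks roughly like the sketch below; the CIDR ranges and the corporate firewall’s public IP are hypothetical, and the VPN tunnel configuration on the corporate side is omitted:

    import boto3

    ec2 = boto3.client("ec2")

    # Carve out a private address block that doesn't collide with the
    # corporate network, and a subnet for the ERP servers.
    vpc = ec2.create_vpc(CidrBlock="10.20.0.0/16")["Vpc"]
    subnet = ec2.create_subnet(VpcId=vpc["VpcId"],
                               CidrBlock="10.20.1.0/24")["Subnet"]

    # The VPN side: a customer gateway pointing at the corporate
    # firewall's public IP, and a virtual private gateway attached
    # to the VPC to terminate the IPsec tunnel.
    cgw = ec2.create_customer_gateway(Type="ipsec.1",
                                      PublicIp="198.51.100.1",
                                      BgpAsn=65000)["CustomerGateway"]
    vgw = ec2.create_vpn_gateway(Type="ipsec.1")["VpnGateway"]
    ec2.attach_vpn_gateway(VpnGatewayId=vgw["VpnGatewayId"],
                           VpcId=vpc["VpcId"])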

While we’re on the topic of network security differences, I should mention that Amazon implements “security groups”, which are like firewalls without the bells and whistles. They only support ingress filtering, not egress. You can work around this limitation by using VPC, or by implementing appliance instances to serve as a traditional firewall.
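Here’s a short boto3 sketch of the ingress side: create a security group and open only the ports the app tier needs. The group name and the office CIDR range are hypothetical:

    import boto3

    ec2 = boto3.client("ec2")

    # Create a security group for the app servers, then allow only SSH
    # and HTTPS, and only from the office network.
    sg = ec2.create_security_group(GroupName="erp-app-tier",
                                   Description="PeopleSoft app servers")
    ec2.authorize_security_group_ingress(
        GroupId=sg["GroupId"],
        IpPermissions=[
            {"IpProtocol": "tcp", "FromPort": 22, "ToPort": 22,
             "IpRanges": [{"CidrIp": "203.0.113.0/24"}]},   # SSH
            {"IpProtocol": "tcp", "FromPort": 443, "ToPort": 443,
             "IpRanges": [{"CidrIp": "203.0.113.0/24"}]},   # HTTPS
        ],
    )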

And when you’re running in the cloud, you need to be mindful of your data at rest. Access to S3 backups can certainly be controlled using ACLs, but it’s a good idea to encrypt files before you write them to S3 in the first place, so that even if somebody gets an ACL wrong the data will be unreadable by unauthorized users. And you should consider using encrypted file systems on your AWS hosts and/or database encryption technology to add yet another layer of security to keep your sensitive data locked away.
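One way to do that client-side encryption, sketched with the Python cryptography library and boto3 (key handling is deliberately simplified, and the bucket name is hypothetical; in real life the key would live in a secrets store, not next to the data):

    import boto3
    from cryptography.fernet import Fernet

    # Generate a symmetric key inline just to keep the sketch short.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Encrypt the export before it ever leaves the host.
    with open("nightly_export.dmp", "rb") as f:
        ciphertext = cipher.encrypt(f.read())

    # Even if the object's ACL is misconfigured, the contents are
    # opaque without the key.
    boto3.client("s3").put_object(
        Bucket="example-corp-backups",              # hypothetical bucket
        Key="peoplesoft/nightly_export.dmp.enc",
        Body=ciphertext,
    )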

 
