

Sharing responsibility for the cloud

Despite the rapid growth of cloud services, security is still the number one concern for business owners migrating their workloads to the cloud, writes James Brown, Senior Director of Global Solutions Architecture at Alert Logic.

Much has been documented about the financial, technical and efficiency advantages of moving business-critical applications onto the cloud, and the impact this has on how products and services are developed, purchased and consumed. Yet there is still a lot of debate about whether cloud infrastructure is more or less secure than managing applications in on-premise data centres. The simple fact is that most cloud platforms are more secure than on-premise environments. Most cloud providers have security-in-depth strategies for their global infrastructure, covering the processes, people and technology that protect the physical and foundational layers of their offering. They deploy market-leading security technologies to continuously monitor and protect their hardware, software and networking environments, ensuring that vulnerabilities are identified and remediated to strict SLAs as quickly as possible – often weeks or even months before the average on-premise deployment.

But, given the volume and sophistication of cyber threats, there is still a lot of confusion about who is responsible for the security and compliance of applications and workloads hosted in the cloud. Cloud service providers have made big strides in declaring the areas of security they are responsible for, but each cloud provider is different, and the subtleties can cause confusion. The simple way to look at it is that your cloud provider is responsible FOR the cloud, and you are responsible for what you put IN the cloud – this is what we call sharing the responsibility for security in the cloud.

In a nutshell, your cloud provider is responsible for securing the foundational services, such as compute power, storage, database and networking services, but you are responsible for the configuration of those services. At the network layer, your service provider is responsible for network segmentation, perimeter services, and external DDoS, spoofing and scanning prevention. But you are responsible for network threat detection, security monitoring and any incident reporting. This means that your cloud provider will secure against attacks on the switches and network within the cloud infrastructure, but they will not look for, or stop, network-based attacks against your instances and applications. At the host layer, you are responsible for access management, patch management, configuration hardening, security monitoring and log analysis. The application security components are 100% your responsibility.
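One customer-side duty described above is the configuration of the services the provider secures. A minimal sketch of what that looks like in practice is a check over firewall-style inbound rules; the rule format and port list below are hypothetical, not any particular provider's API:

```python
# The provider secures the network fabric; the customer owns the rules.
# This sketch flags inbound rules that expose sensitive ports to the internet.
OPEN_TO_WORLD = "0.0.0.0/0"
RISKY_PORTS = {22: "SSH", 3389: "RDP", 3306: "MySQL"}

def audit_security_group(rules):
    """Flag inbound rules that open sensitive ports to the whole internet."""
    findings = []
    for rule in rules:
        if rule["cidr"] == OPEN_TO_WORLD and rule["port"] in RISKY_PORTS:
            findings.append(
                f"{RISKY_PORTS[rule['port']]} (port {rule['port']}) open to the world"
            )
    return findings

rules = [
    {"port": 443, "cidr": "0.0.0.0/0"},    # HTTPS to the world: usually fine
    {"port": 22, "cidr": "0.0.0.0/0"},     # SSH to the world: flagged
    {"port": 3306, "cidr": "10.0.0.0/8"},  # database limited to an internal range: fine
]
print(audit_security_group(rules))  # ['SSH (port 22) open to the world']
```

The provider will happily apply whichever rules you configure; catching the risky ones is squarely on your side of the line.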

Once an organisation better understands its role and the role of its cloud provider, it will be able to make better-informed decisions about its cloud infrastructure. To best protect against the next vulnerability or wide-scale attack, there are seven key best practices for cloud security that all organisations should implement.

Every company should get to know its provider, understand where the lines are drawn, and plan accordingly. Cyber-attacks are going to happen; vulnerabilities and exploits are going to be identified. A solid security-in-depth strategy, coupled with the right tools and people who understand how to respond, will put your company in a better position to minimise both exposure and risk.

Understanding the shared security responsibility is an important aspect of cloud security and one that all companies need to be aware of. With cloud service providers responsible for only part of security within the cloud, organisations must be aware of where their own responsibilities lie; by following this best practice guide, they can ensure their company is as secure in the cloud as it is on premise.

Securing code is 100 per cent your responsibility, and hackers are continually looking for ways to compromise your applications. If your company is developing code, even just for basic websites, questions should be asked about which secure development lifecycle you are using; putting one in place gives your organisation a methodology to follow internally. Code that has not been thoroughly tested and secured makes it all the easier for potential threats to cause harm. Using encryption wherever possible will also help to secure your application, as will testing libraries, scanning plugins for security bugs that may not have been picked up previously, and limiting privileges to only those who need access.
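A minimal, self-contained sketch of one secure-coding practice this implies: never build SQL queries by splicing user input into the query text. Parameterised queries make the database driver treat input as data, not executable SQL:

```python
# Demonstrates SQL injection and its fix using Python's built-in sqlite3.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

user_input = "alice' OR '1'='1"  # a classic injection attempt

# Unsafe: the input is concatenated straight into the query text,
# so the attacker's OR clause becomes part of the SQL.
unsafe = conn.execute(
    "SELECT * FROM users WHERE name = '" + user_input + "'"
).fetchall()

# Safe: the driver binds the value, so the injection is just a literal string.
safe = conn.execute(
    "SELECT * FROM users WHERE name = ?", (user_input,)
).fetchall()

print(unsafe)  # [('alice', 'admin')] – the injection matched every row
print(safe)    # [] – no user is literally named "alice' OR '1'='1"
```

The same binding principle applies in any language and any database driver, and it is exactly the kind of discipline a secure development lifecycle enforces.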

Logins are the keys to your kingdom and should be treated as such. Make sure you have a solid access management policy in place, especially for those granted access on a temporary basis. Defining roles and responsibilities is important: it ensures developers cannot delete data or shut down production instances they should not have access to, and it lowers the impact of a threat should someone's login credentials become compromised. Integrating all applications and cloud environments into your corporate AD or LDAP centralised authentication model will help with this process, as will two-factor authentication.
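The role-definition idea above can be sketched as a simple default-deny permission check; the roles and action names here are illustrative, not any particular provider's IAM model:

```python
# Hypothetical role-to-permission mapping: developers cannot delete data
# or touch production, mirroring the separation described in the article.
ROLE_PERMISSIONS = {
    "developer": {"read_logs", "deploy_staging"},
    "operator":  {"read_logs", "deploy_staging", "restart_production"},
    "admin":     {"read_logs", "deploy_staging", "restart_production", "delete_data"},
}

def is_allowed(role, action):
    """Default-deny: unknown roles and unlisted actions are refused."""
    return action in ROLE_PERMISSIONS.get(role, set())

print(is_allowed("developer", "delete_data"))  # False
print(is_allowed("admin", "delete_data"))      # True
```

Because the check defaults to deny, a compromised developer credential can only do what a developer could do, which is precisely why defining roles lowers the impact of stolen logins.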

With vendors frequently rolling out patches, it is important to keep on top of an organisation's inventory of production systems, as unpatched software and systems can lead to major issues. Keep your environment secure by outlining a process for updating your systems on a regular basis. A combination of automatic patching and vulnerability scanning can lower the impact of any threats. Consider developing a checklist of important procedures, and make sure to test all updates to confirm that they do not damage anything or create vulnerabilities before implementing them into your live environment.
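An inventory check like the one described can be sketched in a few lines: compare installed versions against a feed of minimum patched versions and flag anything behind. The package names and version feed here are hypothetical:

```python
# Compares version strings numerically (so "1.10" > "1.9") and returns
# the packages in the inventory that are older than the patched release.
def version_tuple(v):
    return tuple(int(part) for part in v.split("."))

def find_unpatched(inventory, minimum_patched):
    """Return packages whose installed version is below the minimum patched one."""
    return [
        name for name, installed in inventory.items()
        if name in minimum_patched
        and version_tuple(installed) < version_tuple(minimum_patched[name])
    ]

inventory = {"openssl": "1.0.1", "nginx": "1.25.3"}
minimum_patched = {"openssl": "1.0.2", "nginx": "1.25.0"}
print(find_unpatched(inventory, minimum_patched))  # ['openssl']
```

In practice a vulnerability scanner does this at scale, but the principle is the same: you cannot patch what you have not inventoried.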

Logs within the cloud are highly important, and log management and log reviews should therefore be an essential component of your organisation's security protocols. Logs are now useful for far more than compliance, which makes them a powerful security tool. Log sources let a company see who has had access to what, and by continuously monitoring logs an organisation can pick up on patterns of behaviour, which can then be audited to ensure they are not out of the ordinary. Logs can also be used to identify malicious activity and assist in the subsequent forensic investigation.
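One pattern continuous log monitoring can surface is repeated failed logins from a single source, a common brute-force signal. A minimal sketch, assuming a hypothetical log line format with the source IP as the last field:

```python
# Counts failed login attempts per source IP and flags sources
# that cross a threshold - a simple brute-force heuristic.
from collections import Counter

def failed_login_sources(log_lines, threshold=3):
    """Return source IPs with at least `threshold` failed login attempts."""
    failures = Counter(
        line.split()[-1]              # last field: source IP
        for line in log_lines
        if "LOGIN FAILED" in line
    )
    return {ip for ip, count in failures.items() if count >= threshold}

logs = [
    "2024-01-01T10:00:01 LOGIN FAILED user=root from 203.0.113.9",
    "2024-01-01T10:00:02 LOGIN FAILED user=admin from 203.0.113.9",
    "2024-01-01T10:00:03 LOGIN OK user=alice from 198.51.100.4",
    "2024-01-01T10:00:04 LOGIN FAILED user=root from 203.0.113.9",
]
print(failed_login_sources(logs))  # {'203.0.113.9'}
```

A real deployment would use a log management platform rather than a script, but the underlying question – what pattern of behaviour is out of the ordinary? – is the same.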

No single piece of software is going to handle all of your security needs, so it is important to implement a defence-in-depth strategy that covers all of your responsibilities in the stack. Understanding which aspects of security within the cloud are your responsibility forms a key part of this; once you have established that, you can build your tool-kit out from there. Just as you would on premise, it is imperative to implement IP tables, web application firewalls, antivirus, intrusion detection, encryption and log management.

It is always important to stay informed about the latest vulnerabilities that may affect your organisation, and the internet is a wealth of information here. Use it to your advantage by searching for the trends, breaches and exploits happening in your industry, and use what you find to educate your staff.
