
Cloud Data Protection

Blog Article Published: 07/21/2022

Written by Luigi Belvivere, Elena Minghelli, and Sara Frati of NTT DATA.

Introduction

In the digital era and the transition it has driven, businesses and institutions have clearly understood that robust cloud security is essential.

It is well known that security threats evolve in parallel with technology and are becoming increasingly sophisticated.

In this period of geopolitical uncertainty, cloud computing is no less at risk than on-premises environments.

Cloud computing is composed of distributed, highly interdependent resources, which is why the traditional approach to security is no longer feasible.

Cloud computing, by definition, is characterized by the dynamic use of shared resources: storage and processing resources are provisioned automatically based on demand. In other words, cloud computing can satisfy peak demand using auto-scaling processes that bring in new processing resources (virtual machines or containers) and deploy multiple instances of an application as needed.
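
As a minimal illustration of this behavior, the sketch below shows the kind of decision an auto-scaler makes; the thresholds, limits, and function are illustrative assumptions and are not tied to any specific CSP's API.

```python
# Minimal auto-scaling sketch: thresholds and limits are illustrative assumptions,
# not the API of any specific cloud provider.

def desired_instances(current: int, cpu_utilization: float,
                      target: float = 0.60, min_n: int = 2, max_n: int = 20) -> int:
    """Scale the instance count so that average CPU utilization approaches the target."""
    if cpu_utilization <= 0:
        return min_n
    proposed = round(current * (cpu_utilization / target))
    return max(min_n, min(max_n, proposed))

# Example: 4 instances running at 90% average CPU -> scale out to 6 instances.
print(desired_instances(current=4, cpu_utilization=0.90))
```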

Scenario

With process automation in mind, there is a combination of approaches that embraces the concepts of Security by Design, the Secure Software Development Life Cycle, and DevOps.

Indeed, the deployment of services offered by major CSPs around tools such as Kubernetes, Docker, and GitLab, which power the entire CI/CD (continuous integration/continuous delivery) pipeline, has enabled developers to increase agility and scalability by leveraging cloud infrastructures to release new software versions quickly.

In this increasingly interconnected economy of scale, the attack surface expands to cover the entire software supply chain, and attackers target the weakest link in the pipeline, since developers are often more focused on speed than on security.

It is precisely in such a scenario that malicious activity finds fertile ground.

Risks

It is now possible to distinguish two types of attacks:

1. Endogenous attacks
  • Cloud Provider Malicious Insider
  • Insecure data deletion and compromise of encryption keys
  • Conflicts between the organization's hardening procedures and those of the Cloud Provider
2. Exogenous attacks
  • Compromise of software (build components) that introduces malware into the entire supply chain
  • Use of unauthorized tools (shadow IT) not under the IT department's direct control

These attack types combine with already well-known attack vectors such as malware infection, brute-force attacks, and software vulnerability exploits.

Organizations therefore need to focus even more not only on the internal ecosystem but also on the external perimeter, where the greatest interdependencies and interconnections arise between cloud services (SaaS and PaaS in particular) and third-party managed services, which together represent an extended attack perimeter.

Many security standards have long been focused on third parties, which is why there has been a shift from a (passive) monitoring/scoring/reporting model to an (active) integrated security and incident response model.

The risks associated with managing cloud services through third parties are of key importance for companies and corporations when considering whether to migrate data to the cloud.

Indeed, when choosing to outsource the management of one's data to an external provider, a cost-benefit analysis of the burden of personal data management on the Cloud Service Customer (CSC) must be conducted in order to verify its compliance with the GDPR.

Particular attention should be paid to the following macro-steps:

  • Role qualification: aimed at identifying the correct distribution of responsibilities in relation to the processing carried out in the cloud. Specifically, where the CSP is qualified as an autonomous Data Controller, it will have full responsibility and decision-making capacity over all data received from the CSC. It follows that, while the CSC will be exempt from liability in the event of a breach, it will no longer have control over the data transferred to its provider within the cloud. In contrast, should the CSP be appointed as a Data Processor, the CSC will retain its status as Data Controller and will bear overall responsibility for all data managed by the CSP.
  • Contractual negotiations: the contractual negotiation phase is aimed at preparing specific contractual clauses defining the tools to be used in case of transfers of personal data to non-European territories, the procedures for communicating personal data to any third parties, the methods for informing data subjects about the transfer of their data, as well as the technical and organizational measures to be implemented to ensure data security.
  • Legal and regulatory compliance: in parallel, it has become important to monitor pronouncements and measures issued by competent domestic and international authorities concerning the protection of personal data processed by third parties, with a view to making timely adjustments to the processing methods in use.

The contract between CSP and CSC plays a predominant role in establishing the areas of responsibility of each party by defining shared responsibility metrics.

Cloud computing radically changes the distribution of responsibilities by introducing the concept of shared responsibility and mechanisms for implementing and managing governance. Thus, the negotiation phase is a critical moment, and it is essential to pay particular attention to Service Level Agreements (SLAs) that specifically guarantee the minimum level of security. These include responsibilities, acceptable performance metrics, a description of the services covered by the agreement, and procedures for monitoring. Metrics and responsibilities between the parties involved in cloud configurations are clearly demarcated, ensuring that cloud service providers meet certain business-level requirements and deliver a clearly defined set of end results to customers. That is why people working on the CSC's behalf must be aware of those SLAs: doing so prevents a worker from overlooking activities within their purview that could generate potential vulnerabilities.
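
To make the monitoring side of an SLA concrete, the sketch below compares measured service metrics against agreed thresholds; the metric names and target values are illustrative assumptions, not taken from any real contract or provider API.

```python
# Illustrative SLA check: metric names and target values are assumptions made for this sketch.

SLA_TARGETS = {
    "monthly_availability_pct": 99.9,  # minimum acceptable availability
    "p95_latency_ms": 300,             # maximum acceptable 95th-percentile latency
    "incident_response_hours": 4,      # maximum time to first response
}

def sla_breaches(measured: dict) -> list[str]:
    """Return the SLA metrics that the measured values violate."""
    breaches = []
    if measured["monthly_availability_pct"] < SLA_TARGETS["monthly_availability_pct"]:
        breaches.append("monthly_availability_pct")
    if measured["p95_latency_ms"] > SLA_TARGETS["p95_latency_ms"]:
        breaches.append("p95_latency_ms")
    if measured["incident_response_hours"] > SLA_TARGETS["incident_response_hours"]:
        breaches.append("incident_response_hours")
    return breaches

print(sla_breaches({"monthly_availability_pct": 99.95,
                    "p95_latency_ms": 420,
                    "incident_response_hours": 2}))  # -> ['p95_latency_ms']
```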

The approach

It is now inevitable to adopt new security paradigms that give substance to the concept of DevSecOps.

Starting with an assessment such as a Business Impact Analysis, the logical and physical/geographical settings will be chosen. In other words, the type of cloud (public, private, hybrid, multi-cloud) and, most importantly, the reference regions will be defined.

We need to keep in mind that not all cloud services may be compliant with current regulations.

There are many options that can be considered:

  • Zero-trust architectures, for example, allow the use of microsegmentation, making it possible to automatically apply strict, granular policies to virtual machines and containers based on the current workload (see the deny-by-default sketch after this list).
  • For cloud-native ecosystems, it is possible to implement virtual private cloud (VPC) management policies, as each project starts with a default VPC network. Specifically, define dedicated policies for Infrastructure as Code across systems. The latter is a key component of DevOps, as deployment automation cannot work if development and operations teams deploy applications or configure environments in different ways. That is why it is necessary to keep containers and their data as isolated as possible, since any breach should never propagate into critical areas of the network, where the most sensitive data persists.
  • The holistic CI/CD pipeline security approach should include CI/CD tools, service dependencies, users, process scripts, code, and any released documentation. This approach aims at eliminating misconfigurations, vulnerabilities, and risks in the CI/CD pipeline process and tools, and applies security practices and rules to prevent attacks on the software supply chain (a minimal release-gate sketch follows this list). It is possible that in the near future AIDevSecOps, the concept of adding artificial intelligence and machine learning to DevSecOps processes, will be used to detect false positives and minimize, if not eliminate, manual intervention.
  • Security models such as the OWASP Proactive Controls can instead be used to establish security practices during the development phase. Such models are aimed at protecting the code and its dependencies in every environment (development, test, and production). Adopting a Secure Software Development Life Cycle (SSDLC) and a "Security by Design" approach allows for the implementation of appropriate security activities throughout all phases of the software life cycle and is necessary to respond effectively to security issues. To this end, testing methodologies have been defined such as Software Composition Analysis (SCA), Static Application Security Testing (SAST), Dynamic Application Security Testing (DAST), and Penetration Testing.
  • The least privilege principle should be applied to authorization management for user and technical accounts (especially API accounts); the deny-by-default sketch after this list also illustrates this idea. In addition to these measures, according to Gartner the Cloud Access Security Broker (CASB) consists of a set of products and services that are useful in identifying a company's security gaps in the use of cloud services.
  • A CASB therefore stands between the users and the cloud to monitor and verify that the services employed are compliant with corporate security policies, and to intervene if critical issues are detected.
  • A Cloud Access Security Broker (CASB) is thus natively designed for the cloud, as are the applications and resources that it monitors.
  • Its goal is not to replace the traditional IT security infrastructure, but rather to complement it with new functions for managing policies related to all cloud activities and to enhance IT governance by supporting all control actions related to data, user, and application security.
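
Both the zero-trust microsegmentation item and the least-privilege item above come down to the same deny-by-default idea: nothing communicates and nothing is authorized unless it is explicitly allowed. The sketch below illustrates this; all workload, account, and resource names are purely hypothetical.

```python
# Deny-by-default sketch covering two ideas from the list above:
# (1) microsegmentation: which workload may talk to which, and
# (2) least privilege: which identity may perform which action on which resource.
# All workload, account, and resource names are illustrative.

NETWORK_ALLOW = {
    ("web-frontend", "orders-api"),
    ("orders-api", "orders-db"),
}

IAM_GRANTS = {
    "ci-pipeline-sa": {("read", "artifact-repo"), ("write", "staging-bucket")},
    "reporting-api":  {("read", "analytics-db")},
}

def connection_allowed(src: str, dst: str) -> bool:
    """Microsegmentation: only explicitly allowed workload pairs may communicate."""
    return (src, dst) in NETWORK_ALLOW

def action_allowed(identity: str, action: str, resource: str) -> bool:
    """Least privilege: only explicitly granted (action, resource) pairs are permitted."""
    return (action, resource) in IAM_GRANTS.get(identity, set())

print(connection_allowed("web-frontend", "orders-db"))          # False: must go via the API
print(action_allowed("ci-pipeline-sa", "delete", "orders-db"))  # False: never granted
```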
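
The holistic CI/CD pipeline approach can likewise be made tangible as a release gate that aggregates the findings of each security stage and blocks deployment when thresholds are exceeded. In a real pipeline the results would come from actual SCA/SAST/DAST tools; here they are plain values assumed for illustration.

```python
# Hypothetical CI/CD security gate: the scan results would normally come from
# real SCA/SAST/DAST tools; here they are hard-coded values for illustration.
from dataclasses import dataclass

@dataclass
class ScanResult:
    stage: str             # e.g. "SCA", "SAST", "DAST"
    critical_findings: int
    high_findings: int

def release_allowed(results: list[ScanResult],
                    max_critical: int = 0, max_high: int = 3) -> bool:
    """Block the release if any stage exceeds the agreed finding thresholds."""
    for r in results:
        if r.critical_findings > max_critical or r.high_findings > max_high:
            print(f"Blocking release: {r.stage} reported "
                  f"{r.critical_findings} critical / {r.high_findings} high findings")
            return False
    return True

results = [ScanResult("SCA", 0, 1), ScanResult("SAST", 1, 0), ScanResult("DAST", 0, 2)]
print(release_allowed(results))  # -> False (SAST reported a critical finding)
```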

To avoid potential tampering by the CSP, it is necessary to enable, through features such as Access Approval, explicit approval every time a CSP administrator needs to access the contents of the organization's tenants/services.

Application data protection requires that encryption keys and secrets be managed by the organization that owns the data stored on the services offered by the CSP, so that the privacy of the end consumer is maintained.
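
A minimal sketch of this principle using the widely available Python `cryptography` package: the key is generated and retained by the data owner (in practice it would live in an HSM or a customer-managed KMS), and only ciphertext ever reaches the provider's storage service.

```python
# Sketch of customer-managed encryption: the CSC generates and keeps the key,
# so the CSP's storage service only ever sees ciphertext.
# Requires: pip install cryptography
from cryptography.fernet import Fernet

key = Fernet.generate_key()             # retained by the data owner (e.g. in an HSM/KMS)
cipher = Fernet(key)

plaintext = b"end-consumer personal data"
ciphertext = cipher.encrypt(plaintext)  # this is what gets stored with the CSP

# Only the organization holding the key can recover the data.
assert cipher.decrypt(ciphertext) == plaintext
```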

Conclusion

Given the principles outlined above, in order to avoid vulnerabilities and an expanded attack perimeter, security measures must be implemented during the design phase and must not be treated as a mere cost added at the end of the project. In addition, migrating services to the cloud requires weighing the risks as carefully as the opportunities for cost savings and operational optimization.


About the Authors

Luigi Belvivere

Luigi Belvivere holds a degree in economics and development and a master's in economics and intelligence, and works for NTT DATA as a security advisor. He performs security assessments in cloud environments, with the goal of securing the ecosystems that migrate to and are built on the main cloud service providers.

Elena Minghelli

Elena is a criminologist who graduated from the Alma Mater Studiorum University of Bologna. She is passionate about IT and cybersecurity. At NTT DATA since 2019, she has been involved in cybersecurity compliance, with a focus on technical risk assessment activities in on-premises and cloud environments.

Sara Frati

Sara is a lawyer who graduated from Luiss Guido Carli University in Rome. She has been working at NTT DATA since 2019, providing consultancy in the field of cybersecurity with particular reference to data protection, compliance, and assurance.
