Confidential Computing Capabilities and Services Will Be a Competitive Differentiator for Cloud Providers
By now, most organizations should have mastered the first two pillars of data protection – securing data-at-rest and data-in-transit – across their hybrid enterprise environments. The third data protection pillar – securing data-in-use (i.e., protecting and encrypting data while it is held in memory and during computation) – has been elusive, but is now being addressed through the transformational movement commonly referred to as confidential computing.
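The gap between the three pillars can be shown in a minimal, purely illustrative sketch. The toy XOR cipher below stands in for real encryption (AES, TLS) and is not secure; the point is that a conventional host must decrypt data before computing on it, leaving plaintext exposed in process memory – the exposure confidential computing removes.

```python
import os

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # Toy stand-in for a real cipher such as AES; illustration only, not secure.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

key = os.urandom(16)
record = b"account=42;balance=1000"

at_rest = xor_cipher(record, key)   # pillar 1: encrypted on disk
in_transit = at_rest                # pillar 2: carried over an encrypted channel

# Pillar 3 gap: to actually compute on the data, a conventional host must
# first decrypt it, so the plaintext sits readable in process memory.
in_use = xor_cipher(in_transit, key)
assert in_use == record             # plaintext exposed during computation
```

A hardware TEE closes this gap by performing the decryption and computation inside memory that the OS, hypervisor, and cloud operator cannot read.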
Technology business leaders are ideally pursuing transformation plans that assume ubiquitous confidential computing availability and data-in-use security will be a cloud-native default within five years.
For many organizations, completing their digital transformation journey has been conditional on a categorical guarantee that no one – not a trusted system administrator, the OS developer, the cloud provider, law enforcement, a malicious insider, or an attacker armed with powerful zero-day exploits – can secretly access or manipulate the data and intellectual property they entrust to the cloud. Consequently, as the third pillar of data security, confidential computing will increasingly be a prerequisite for any cloud-deployed business application.
The technologies, platforms, and architectures that enable confidential computing have evolved at an astounding pace – especially when compared with the decades it has taken for data-at-rest encryption to evolve from password-protected ZIP files in the early 1990s to today’s enabled-by-default hardware-based encryption locked to the physical compute system, or the continued effort to transition data-in-transit defaults from HTTP to secure HTTPS (preferably using TLS v1.3).
The global pandemic has not held back public cloud advancements and new service offerings in confidential computing. Virtualization infrastructure for confidential computing, built atop hardware-based trusted execution environments (TEEs) on servers that implement Intel Software Guard Extensions (Intel SGX), is generally available, along with previews of confidential VMs using hardware-based TEEs on servers supporting AMD's Secure Encrypted Virtualization (AMD SEV) extension. In parallel, confidential computing options have begun extending across cloud services to embrace Kubernetes confidential nodes, always-encrypted SQL databases, confidential machine-learning interfaces, HSM-backed key management, and IoT edge compute.
It can be difficult for security leaders to keep pace with underlying hardware advances and their applicability. For example, the memory-integrity protections of Intel SGX are well suited to small but highly security-sensitive workloads, while AMD SEV is useful for "lift and shift" of existing complex or legacy applications and services without necessarily refactoring code. Meanwhile, Intel's Trust Domain Extensions (Intel TDX) will enable hardware-isolated virtual machines (called trust domains), AMD's Secure Encrypted Virtualization-Encrypted State (SEV-ES) encrypts a guest VM's CPU register state whenever the VM stops running so the hypervisor cannot read or tamper with it, and further hardware advances from Intel, AMD, Arm, NVIDIA, and others that mitigate new and potentially intrusive memory, compute, and attestation threats will become available through the major cloud providers over the coming year or two.
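Common to all of these technologies is remote attestation: before releasing secrets to an enclave or confidential VM, a verifier challenges it to prove, via a hardware-rooted signature over its code identity, that the expected code is running in a genuine TEE. The sketch below is a hypothetical, heavily simplified model of that flow – the hardware root key and quote format are simulated with an HMAC, whereas real TEEs (SGX, SEV-SNP, TDX) use asymmetric keys and vendor-specific report structures.

```python
import hashlib
import hmac
import os

# Simulated hardware root of trust; in real silicon this key never leaves the chip.
HARDWARE_ROOT_KEY = os.urandom(32)

def generate_quote(enclave_measurement: bytes, nonce: bytes) -> bytes:
    # The TEE "signs" a hash of its code identity plus the verifier's fresh nonce.
    return hmac.new(HARDWARE_ROOT_KEY, enclave_measurement + nonce,
                    hashlib.sha256).digest()

def verify_quote(quote: bytes, expected_measurement: bytes, nonce: bytes) -> bool:
    # The verifier recomputes the expected quote and compares in constant time.
    expected = hmac.new(HARDWARE_ROOT_KEY, expected_measurement + nonce,
                        hashlib.sha256).digest()
    return hmac.compare_digest(quote, expected)

# Verifier challenges the enclave with a fresh nonce to prevent replay...
nonce = os.urandom(16)
measurement = hashlib.sha256(b"enclave-code-v1").digest()
quote = generate_quote(measurement, nonce)

# ...and only provisions secrets if the quote matches the expected code identity.
assert verify_quote(quote, measurement, nonce)
assert not verify_quote(quote, hashlib.sha256(b"tampered-code").digest(), nonce)
```

The design point this illustrates is why attestation, not just encryption, anchors confidential computing: encryption protects the memory, but attestation is what lets a customer decide whether to trust the code inside it before handing over keys or data.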
Clearly, confidential computing is in a transitional period as these new hardware-enabled solutions get adopted and deployed by major cloud providers and their most advanced customers.
While it is easy to get lost in the hardware-based security feature advances the silicon providers are delivering, the long-term assumption security leaders should plan for is that the physical infrastructure will verifiably guarantee that enclaved processes and memory, and the data they hold or manipulate, are secure from any and all prying eyes – in particular, the cloud and software-stack providers – and that all services from the leading public cloud providers will operate in a secure data-in-use mode by default. It is reasonable to assume that within five years the term "confidential computing" will become superfluous, an assumed native component of all cloud services.
In the meantime, confidential computing capabilities and services will be a competitive differentiator for the large cloud providers.
As the underlying hardware advances and public cloud providers extend data-in-use security, integrity, and attestation capabilities across their customer-available services, business technology leaders will need to assess each cloud service individually and assume some short period of cloud-specific lock-in for the custom applications their own business will engineer.
Opportunities ranging from anti-money-laundering and customer analytics in financial services, to privacy-preserving collaborative disease diagnostics and drug development in healthcare, to joint intelligence analysis and anti-corruption work across government agencies are just a sample of the newly achievable privacy- and confidentiality-preserving solutions available to organizations adopting cutting-edge confidential computing-enabled cloud services.
Some enterprises may be content to wait until data-in-use security becomes ubiquitous before completing their digital transformation. But while confidential computing capabilities are expanding and new secure cloud services remain "fluid" in their evolution, there is a clear window for technology-agile businesses to innovate and partner with their cloud provider to bring new categories of secure products to market ahead of both competitors and regulators.
-- Gunter Ollmann
First Published: SecurityWeek - November 3, 2020