Insights from the 2023 Open Confidential Computing Conference

I had the chance to take part in this year's Open Confidential Computing Conference (OC3), hosted by our software partner, Edgeless Systems. This year's event was particularly noteworthy because of a panel discussion on the impact and future of confidential computing. The panel featured some of the industry's most respected technology leaders, including Greg Lavender, Chief Technology Officer at Intel; Ian Buck, Vice President of Hyperscale and HPC at NVIDIA; and Mark Papermaster, Chief Technology Officer at AMD. Felix Schuster, Chief Executive Officer at Edgeless Systems, moderated the panel discussion, which explored topics such as the definition of confidential computing, customer adoption patterns, current challenges, and future developments. The insightful discussion left a lasting impression on me and my colleagues.
When it comes to understanding what exactly confidential computing entails, it all starts with a trusted execution environment (TEE) that is rooted in hardware. This TEE protects any code and data placed inside it, while in use in memory, from threats outside the enclave. These threats include everything from vulnerabilities in the hypervisor and host operating system to other cloud tenants and even cloud operators. In addition to providing protection for the code and data in memory, the TEE possesses two essential properties. The first is the ability to measure the code contained within the enclave. The second is attestation, which allows the enclave to provide a verified signature that confirms the trustworthiness of what is held within it. This allows software outside of the enclave to establish trust with the code inside, enabling the safe exchange of data and keys while protecting the data from the hosting environment. That includes hosting operating systems, hypervisors, management software and services, and even the operators of the environment.
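One way to picture the measurement-and-attestation handshake described above is the sketch below. It is purely illustrative and not a real TEE API: the function and type names are invented, and the hardware-rooted signature is simulated with an HMAC for brevity, whereas real reports (Intel SGX/TDX, AMD SEV-SNP) are verified against vendor certificate chains.

```python
# Illustrative sketch only: how a relying party uses measurement and attestation
# before releasing a secret to an enclave. All names are hypothetical; the HMAC
# stands in for a hardware-rooted asymmetric signature.
import hashlib
import hmac
import os
from dataclasses import dataclass

@dataclass
class AttestationReport:
    measurement: bytes   # hash of the code loaded into the enclave
    report_data: bytes   # e.g., the enclave's public key, bound into the report
    signature: bytes     # in real hardware: signed by a key rooted in the CPU

# Stand-in for the hardware root of trust (illustration only).
HARDWARE_KEY = os.urandom(32)

def enclave_generate_report(code: bytes, report_data: bytes) -> AttestationReport:
    """What the TEE does: measure the loaded code and sign the result."""
    measurement = hashlib.sha256(code).digest()
    payload = measurement + report_data
    signature = hmac.new(HARDWARE_KEY, payload, hashlib.sha256).digest()
    return AttestationReport(measurement, report_data, signature)

def relying_party_verify(report: AttestationReport, expected_measurement: bytes) -> bool:
    """What software outside the enclave does before sending keys or data."""
    payload = report.measurement + report.report_data
    expected_sig = hmac.new(HARDWARE_KEY, payload, hashlib.sha256).digest()
    signature_ok = hmac.compare_digest(report.signature, expected_sig)
    code_ok = hmac.compare_digest(report.measurement, expected_measurement)
    return signature_ok and code_ok

# Usage: release a data-encryption key only if the enclave runs the expected code.
enclave_code = b"...confidential workload binary..."
report = enclave_generate_report(enclave_code, report_data=b"enclave-public-key")
if relying_party_verify(report, hashlib.sha256(enclave_code).digest()):
    print("Attestation verified: safe to exchange keys with the enclave.")
```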
As for what is not confidential computing: it is not other privacy-enhancing technologies (PETs) like homomorphic encryption or secure multiparty computation. It is hardware-rooted, trusted execution environments with attestation.
In Azure, confidential computing is integrated into our overall defense-in-depth strategy, which includes trusted launch, customer-managed keys, Managed HSM, Microsoft Azure Attestation, and confidential virtual machine guest attestation integration with Microsoft Defender for Cloud.
Customer adoption patterns
With regard to customer adoption scenarios for confidential computing, we see customers across regulated industries such as the public sector, healthcare, and financial services, ranging from private-to-public cloud migrations to cloud-native workloads. One scenario that I am really excited about is multi-party computation and analytics, where multiple parties bring their data together, in what are now being called data clean rooms, to perform computation on that data and get back insights that are much richer than what they could have gotten from their own data sets alone. Confidential computing addresses the regulatory and privacy concerns around sharing this sensitive data with third parties. One of my favorite examples of this is in the advertising industry, where the Royal Bank of Canada (RBC) has set up a clean-room solution that takes merchant purchasing data and combines it with RBC's information about consumers' credit card transactions to get a full picture of what the consumer is doing. Using these insights, RBC's credit card merchants can then offer their consumers very precise offers tailored to them, all without RBC seeing or revealing any confidential information from the customers or the merchants. I believe that this architecture is the future of advertising.
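To make the clean-room pattern concrete, here is a toy sketch of the kind of joint computation that would run inside the attested enclave. The data, names, and join logic are invented for illustration and are not RBC's actual solution; the point is that both parties' raw records stay inside the TEE and only aggregate insights come out.

```python
# Toy clean-room computation (hypothetical data): two parties contribute sensitive
# records, the join runs inside an attested enclave, and only aggregates leave.
from collections import defaultdict

# Party A: merchant data (merchant id -> category)
merchant_data = {"m1": "groceries", "m2": "travel"}

# Party B: cardholder transactions (cardholder, merchant id, amount)
card_transactions = [
    ("alice", "m1", 120.0),
    ("alice", "m2", 800.0),
    ("bob", "m1", 60.0),
]

def clean_room_computation(merchants, transactions):
    """Runs inside the TEE: joins both data sets, returns only aggregates."""
    spend_by_category = defaultdict(float)
    for _cardholder, merchant, amount in transactions:
        category = merchants.get(merchant, "unknown")
        spend_by_category[category] += amount
    # Only the joint, aggregated insight is released; raw rows never leave.
    return dict(spend_by_category)

print(clean_room_computation(merchant_data, card_transactions))
# {'groceries': 180.0, 'travel': 800.0}
```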
Another exciting multi-party use case is BeeKeeperAI's application of confidential computing and machine learning to accelerate the development of effective drug therapies. Until recently, drug researchers have been hampered by the inaccessibility of patient data due to the strict regulations applied to the sharing of personal health information (PHI). Confidential computing removes this bottleneck by ensuring that PHI is protected not just at rest and in transit, but also while in use, eliminating the need for data providers to anonymize this data before sharing it with researchers. And it is not just the data that confidential computing protects, but also the AI models themselves. These models can be expensive to train and are therefore valuable pieces of intellectual property that need to be protected.
To allow these valuable AI models to remain confidential yet scale, Azure is collaborating with NVIDIA to deploy confidential graphics processing units (GPUs) on Azure based on the NVIDIA H100 Tensor Core GPU.
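A simplified way to see how attestation protects the model IP itself is a key-release check: the model stays encrypted at rest, and the decryption key is handed over only to an environment whose attestation has been verified. The sketch below is hypothetical and is not Azure's or NVIDIA's actual key-release service; it assumes the `cryptography` package is installed.

```python
# Hypothetical key-release sketch: the model owner keeps the model encrypted and
# releases the key only after the serving environment (e.g., a confidential VM
# with an attached confidential GPU) presents a verified attestation.
from cryptography.fernet import Fernet

MODEL_KEY = Fernet.generate_key()          # held by the model owner's key service
encrypted_model = Fernet(MODEL_KEY).encrypt(b"...proprietary model weights...")

def key_release_service(attestation_verified: bool) -> bytes | None:
    """Key is released only to an environment whose attestation checked out."""
    return MODEL_KEY if attestation_verified else None

# Inside the attested environment: decrypt the model in protected memory only.
key = key_release_service(attestation_verified=True)  # result of a real verification step
if key is not None:
    model_weights = Fernet(key).decrypt(encrypted_model)
    # ...load weights and run inference; plaintext never leaves the TEE...
```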
Current challenges
As for the challenges facing confidential computing, they tended to fall into four broad categories:
Availability, regionally and across services. Newer technologies are in limited supply or still in development, yet Azure has remained a leader in bringing to market services based on Intel® Software Guard Extensions (Intel® SGX) and AMD Secure Encrypted Virtualization-Secure Nested Paging (SEV-SNP). We are the first major cloud provider to offer confidential virtual machines based on Intel® Trust Domain Extensions (Intel® TDX), and we look forward to being one of the first cloud providers to offer confidential NVIDIA H100 Tensor Core GPUs. We see availability rapidly improving over the next 12 to 24 months.
Ease of adoption for developers and end users. The first generation of confidential computing services, based on Intel SGX technology, required rewriting code and working with various open source tools to make applications confidential computing enabled. Microsoft and our partners have collaborated on these open source tools, and we now have an active community of partners running their Intel SGX solutions on Azure. The newer generation of confidential virtual machines on Azure, using AMD SEV-SNP, a hardware security feature enabled by AMD Infinity Guard, and Intel TDX, lets users run off-the-shelf operating systems, lift and shift their sensitive workloads, and run them confidentially. We are also using this technology to offer confidential containers in Azure, which allows users to run their existing container images confidentially.
Performance and interoperability. We need to ensure that confidential computing does not mean slower computing. The issue becomes more important with accelerators like GPUs, where the data must be protected as it moves between the central processing unit (CPU) and the accelerator. Advances in this area will come from continued collaboration with standards committees such as the PCI-SIG, which has issued the TEE Device Interface Security Protocol (TDISP) for secure PCIe bus communication, and the CXL Consortium, which has issued the Compute Express Link™ (CXL™) specification for the secure sharing of memory among processors. Open source projects like Caliptra, which has created the specification, silicon logic, read-only memory (ROM), and firmware for implementing a Root of Trust for Measurement (RTM) block inside a system on chip (SoC), will also contribute.
Industry awareness. While confidential computing adoption is growing, awareness among IT and security professionals is still low. There is a tremendous opportunity for all confidential computing vendors to collaborate and participate in events aimed at raising awareness of this technology with key decision-makers such as CISOs, CIOs, and policymakers. This is especially relevant in government and other regulated industries where the handling of highly sensitive data is critical. By promoting the benefits of confidential computing and increasing adoption rates, we can establish it as a necessary requirement for handling sensitive data. Through these efforts, we can work together to foster greater trust in the cloud and build a more secure and reliable digital ecosystem for all.
The future of confidential computing
When the discussion turned to the future of confidential computing, I had the opportunity to reinforce Azure's vision for the confidential cloud, where all services will run in trusted execution environments. As this vision becomes a reality, confidential computing will no longer be a specialty feature but rather the standard for all computing tasks. In this way, the concept of confidential computing will simply become synonymous with computing itself.
Finally, all panelists agreed that the biggest advances in confidential computing will be the result of industry collaboration.
Microsoft at OC3
In addition to the panel discussion, Microsoft participated in several other presentations at OC3 that you may find of interest:
Finally, I would like to encourage our readers to learn about Greg Lavender's thoughts on OC3 2023.
All product names, logos, and brands mentioned above are the property of their respective owners.