Among the arms races taking place in the public cloud is one focused on providing the most trusted environment for hosting applications and data.
It is an area Google’s Nelly Porter is very much focused on. Porter is a director of product management at Google, with responsibilities covering confidential computing and encryption for the Google Cloud Platform (GCP). Confidential computing is one of the techniques GCP uses to secure data.
“Confidential computing is a very interesting term and it’s come from the concept of computing,” Porter explains. “When you’re performing operations on data using an application, confidential computing points to the fact that there are a bunch of technologies built to protect customers’ and users’ privacy.
“It’s privacy-preserving technology that is helping us to keep data and workloads protected when in use, such as when an application performs any operations on that data. This means it has to process it. It has to put it in memory and it has to run computational operations on that data by utilising hardware like CPUs [central processing units], GPUs [graphics processing units] or TPUs [tensor processing units] or any other device.”
It is based on hardware controls built into Google’s infrastructure security. “We’re using the hardware capabilities of our partners like AMD, Intel, or Nvidia to establish very strong cryptographic isolation and protection for our customers’ workloads,” she adds.
The goal is to ensure customers are running their applications in confidential hardware-based environments.
To provide this security, she says, Google needs to make sure AMD, Intel, Nvidia and other hardware providers are doing their part to keep their products secure. Equally, Google Cloud has to play its part in securing its cloud infrastructure. “All of these companies have come together to offer incredibly usable, scalable and performant confidential computing for our customers,” she says.
You can never be too secure
A valid question that IT leaders and security chiefs will inevitably ask is how confidential computing fits alongside other initiatives, such as zero trust, secure-by-design and secure-by-default principles. Porter says all such initiatives are built to give organisations stronger assurances and guarantees when they move workloads to the cloud and store sensitive data there for processing.
She describes zero trust as “an incredibly interesting and powerful technology” that ensures IT security teams can validate endpoint devices. Given that an endpoint can be a user’s device or a back-end server, Porter sees zero trust, at least in a public cloud environment, as delivering security outcomes similar to the trust that comes from confidential computing.
“It’s a similar capability, but a completely different implementation, but actually fits into the set of technologies that is used to establish verification of IT environments before you do anything,” she says.
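In confidential computing, that “verification of IT environments before you do anything” is typically delivered through remote attestation: the workload presents hardware-backed, signed evidence about the environment it is running in, and a relying party checks that evidence before releasing any secrets to it. The sketch below shows the relying-party side of that idea under stated assumptions: the evidence arrives as a signed JWT, the issuer publishes its signing keys behind an OIDC-style metadata endpoint, and the endpoint URL, audience and claim handling are illustrative placeholders rather than a documented Google interface.

```python
import requests
import jwt  # PyJWT

# Illustrative assumptions: an OIDC-style metadata document pointing at the
# attestation issuer's signing keys, and an audience agreed with the workload.
METADATA_URL = "https://attestation.example.com/.well-known/openid-configuration"
EXPECTED_AUDIENCE = "https://relying-party.example.com"


def verify_attestation_token(token: str) -> dict:
    """Validate an attestation JWT and return its claims; raise if it fails."""
    # Discover the issuer's public signing keys (JWKS) from its metadata.
    metadata = requests.get(METADATA_URL, timeout=10).json()
    jwks_client = jwt.PyJWKClient(metadata["jwks_uri"])
    signing_key = jwks_client.get_signing_key_from_jwt(token)

    # Signature, expiry and audience checks all happen inside jwt.decode().
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=EXPECTED_AUDIENCE,
    )

    # A real relying party would also inspect the hardware- and image-specific
    # claims in the token before trusting the environment with any secrets.
    return claims
```

Only once a check like this succeeds would keys or sensitive data be handed to the workload, which mirrors the “verify before you do anything” idea Porter describes.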
Porter also feels that secure by design and secure by default are closely related to confidential computing, where security technology is embedded directly into IT infrastructure and can be managed through a control plane.
“We’re trying to enable confidential computing globally across every single Google datacentre,” she says. “You check a box and you run confidential computing. It’s what secure by design and secure by default means to us.”
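On Google Cloud, that “check a box” maps to a single flag on the virtual machine resource. The following is a minimal sketch, assuming the google-cloud-compute Python client; the project, zone, machine type and boot image are placeholders, and which machine types and images support Confidential VMs should be confirmed against Google’s documentation rather than taken from this example.

```python
from google.cloud import compute_v1

# Placeholder values: substitute a real project and zone.
PROJECT_ID = "my-project"
ZONE = "us-central1-a"

instance = compute_v1.Instance(
    name="confidential-demo",
    # Assumption: an AMD-based N2D machine type that supports Confidential VMs.
    machine_type=f"zones/{ZONE}/machineTypes/n2d-standard-2",
    # The "box" Porter refers to: request a hardware-isolated Confidential VM.
    confidential_instance_config=compute_v1.ConfidentialInstanceConfig(
        enable_confidential_compute=True
    ),
    # Confidential VMs cannot live-migrate, so host maintenance must terminate them.
    scheduling=compute_v1.Scheduling(on_host_maintenance="TERMINATE"),
    disks=[
        compute_v1.AttachedDisk(
            boot=True,
            auto_delete=True,
            initialize_params=compute_v1.AttachedDiskInitializeParams(
                # Assumption: an image family that supports Confidential VMs.
                source_image="projects/ubuntu-os-cloud/global/images/family/ubuntu-2204-lts",
            ),
        )
    ],
    network_interfaces=[compute_v1.NetworkInterface(network="global/networks/default")],
)

operation = compute_v1.InstancesClient().insert(
    project=PROJECT_ID, zone=ZONE, instance_resource=instance
)
operation.result()  # block until the VM is created
```

The same option is exposed as a checkbox in the Cloud console and as a field in infrastructure-as-code tooling, which is what makes the experience feel like ticking a box rather than re-architecting the workload.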
Given the various IT security techniques that can be deployed, there will always be a question of how much is needed to secure the business. Porter says: “I do believe, honestly, that you can never have enough security, and the concept that I always talk about is defence in depth. You can put those technologies together to provide deeper protection for your very important assets.”
But she also believes IT leaders need to consider carefully what they deploy and how, and avoid opening up access and connectivity unless it is genuinely needed.
AI can help
Porter believes artificial intelligence (AI) has a significant role to play in confidential computing. “AI is very much on the minds of Google and Google’s security teams. It is also on the minds of our customers, CISOs and security practitioners,” she says.
For Porter and the IT security community, AI is an important tool for helping organisations draw insights from the vast amounts of data that need to be analysed to pinpoint threats. Given that the volume of data requiring attention from IT security professionals is growing exponentially, she says: “I strongly believe that AI and AI agents will help us.”
Porter also sees a role for generative AI (GenAI) in helping IT administrators understand the various configurations they need to make when deploying workloads on GCP. “There are multiple things they need to do, and they have to read multiple documents to figure out how to deploy their applications properly and what compliance regulations are needed. A GenAI agent would be able to help,” she says.
Using GenAI this way could, according to Porter, cut deployment times from weeks to minutes and remove the unnecessary legwork IT administrators currently have to do to work out which path to take when deploying workloads onto GCP.
Securing GenAI
GenAI has many uses beyond IT security and helping IT administrators deploy workloads on GCP. But any use of GenAI has security implications.
“Previously, we protected data separately from workloads, which was separate from configuration files,” says Porter. “With GenAI, everything is dependent. You have applications that are dependent on tonnes of data that you use to train the model. You have the AI model.”
In GenAI, the configuration data is the set of weights used to tune the model. “With every single piece of data you use – when you do inference or fine-tuning or training – you have to ensure that this data is fully isolated and has privacy-preserving capabilities,” she says.
For Porter and Google, confidential computing is an approach that enables this.