Giorgio Natili, former VP of Engineering at OPAQUE Systems, explains how confidential computing will become the standard for protecting data and why leaders must act now.

Enterprises are discovering a hard truth about AI deployment: most rulebooks read more like guidelines than architectural roadmaps. The lack of clear standards is creating ambiguity and friction among governance, security, and engineering teams. In response, one emerging best practice is to embed principles like accountability and digital dignity directly into AI systems from the start.
The approach is championed by leaders like Giorgio Natili, a software engineering executive with over two decades of experience leading teams at technology giants like Mozilla, Capital One, and Amazon. In his former role as Vice President and Head of Engineering at OPAQUE Systems, Natili was on the front line of developing platforms for confidential AI. From his perspective, building trustworthy infrastructure will require leaders to move beyond treating compliance frameworks like GDPR and NIST 800-53 as procedural checklists, and to embed a new standard of accountability instead.
Sold for parts: The challenge centers on 'digital dignity,' the right to own and control your data, Natili said. "The problem is that most people don't even realize they've lost this right. Who owns your data? You really don't know." Most of the time, these details get buried in privacy policies, he explained. As an example, he pointed to the Inflection AI deal, where the company’s assets and its users' data were split and sold. "If your data is treated as an asset in an acquisition, the deal stops being about software and becomes about manufacturing. It’s like buying a factory, but you only get half the machines. You have no idea where the other half, your data, ends up."
The solution is confidential computing, Natili said. "Confidential computing protects data at its most vulnerable point: the moment of use. This removes the cloud service provider from the equation by ensuring that not even they can see your data. The first step is simple. Dedicate three or four weeks to a pilot program."
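In concrete terms, the pattern Natili describes hinges on hardware attestation: the data owner releases a decryption key only after verifying what code is running inside a secure enclave, so the cloud operator never handles plaintext. The sketch below illustrates that handshake in rough outline; the names (AttestationReport, release_data_key, the pinned measurement value) are hypothetical stand-ins, not any vendor's actual SDK.

```python
# Illustrative sketch of the "protect data in use" handshake: release the key
# only to code whose identity has been verified by attestation. All names here
# are hypothetical, not a real confidential-computing API.

from dataclasses import dataclass

@dataclass
class AttestationReport:
    enclave_measurement: str   # hash of the code image running inside the enclave
    enclave_public_key: bytes  # generated inside the enclave; the private half never leaves it

# Pinned by the data owner at build time, after auditing the enclave code.
EXPECTED_MEASUREMENT = "sha256:3f7a..."

def release_data_key(report: AttestationReport, data_key: bytes) -> bytes:
    """Return the data key wrapped for the enclave, or refuse entirely."""
    if report.enclave_measurement != EXPECTED_MEASUREMENT:
        raise PermissionError("enclave is not running the audited code; key withheld")
    # Stand-in for real key wrapping: a production pilot would encrypt data_key
    # with report.enclave_public_key so only the attested enclave can unwrap it.
    return b"wrapped:" + data_key

if __name__ == "__main__":
    report = AttestationReport(enclave_measurement="sha256:3f7a...", enclave_public_key=b"...")
    print(release_data_key(report, data_key=b"secret-key-material"))
```

A real pilot would replace the stub wrapping step with encryption against a key that only the attested enclave can unwrap, using the attestation service of whichever platform the team is evaluating.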
The 95 mph rule: Natili’s advice was a balance of speed and deliberation. "Misjudge, and you either move too slow for the business or too fast for safety. The sweet spot is going 95 mph instead of 100. That 5-mph buffer is where you can have the critical conversations and do the right thing."
The same principle applies to autonomous AI agents, he explained. Because these systems introduce unpredictability, they also demand new standards. To define a minimum livable level of digital dignity for their organization, Natili recommended open-source frameworks like the Model Context Protocol (MCP) and AGENTCY.
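What such a guardrail might look like in practice is sketched below. The example is illustrative rather than drawn from the MCP specification or any named framework: it gates every tool call an agent proposes against a declared policy of restricted data categories and human-approval requirements, with all field and function names invented for the sketch.

```python
# Hypothetical "digital dignity" gate for agent tool calls: deny restricted data,
# require a human for high-impact actions, allow the rest.

from dataclasses import dataclass, field

@dataclass
class DignityPolicy:
    # Data categories the organization has ruled off-limits for any external tool call.
    restricted_categories: set[str] = field(
        default_factory=lambda: {"health", "biometrics", "precise_location"}
    )
    # Tools that always require human sign-off, whatever data they touch.
    human_approval_tools: set[str] = field(
        default_factory=lambda: {"send_email", "publish_post"}
    )

def gate_tool_call(policy: DignityPolicy, tool: str, data_categories: set[str]) -> str:
    """Decide whether an agent's proposed tool call may proceed."""
    if data_categories & policy.restricted_categories:
        return "deny"            # would expose data the organization has ruled off-limits
    if tool in policy.human_approval_tools:
        return "needs_approval"  # unpredictable action, so a person signs off first
    return "allow"

if __name__ == "__main__":
    policy = DignityPolicy()
    print(gate_tool_call(policy, "web_search", {"public"}))            # allow
    print(gate_tool_call(policy, "send_email", {"public"}))            # needs_approval
    print(gate_tool_call(policy, "web_search", {"precise_location"}))  # deny
```

The point of the pattern is that the minimum level of digital dignity becomes an explicit, reviewable artifact rather than an implicit assumption buried in prompts.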
An education in exposure: Complicating the technical challenge, however, is a sheer lack of awareness. To illustrate, Natili told a story about students who insisted AI knew nothing about them, until he had them open their chat histories. "My students were shocked by how much could be inferred from their chat histories, including things they never even asked about." Ignoring this awareness gap opens the door to 'shadow AI' and sensitive data leaks, he continued. "As a leader, making this a priority isn't optional."
Instead of waiting for perfect standards, establish a process of continuous improvement from the start, Natili said. "The common startup approach, 'build first, fix compliance later,' is wrong. The right way is continuous experimentation. That means formally reassessing your data sovereignty rules every six to nine months."
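That cadence is easy to state and easy to let slip, so it helps to make it mechanical. The sketch below is a hypothetical example rather than a feature of any governance product: each data sovereignty rule records when it was last reviewed, and anything older than the nine-month outer bound gets flagged.

```python
# Hypothetical review tracker for the six-to-nine-month reassessment cadence.

from dataclasses import dataclass
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=9 * 30)  # outer bound of the six-to-nine-month window

@dataclass
class SovereigntyRule:
    name: str
    allowed_regions: list[str]
    last_reviewed: date

def overdue_rules(rules: list[SovereigntyRule], today: date | None = None) -> list[str]:
    """Return the names of rules whose last review is older than the interval."""
    today = today or date.today()
    return [r.name for r in rules if today - r.last_reviewed > REVIEW_INTERVAL]

if __name__ == "__main__":
    rules = [
        SovereigntyRule("customer_pii_storage", ["eu-west-1"], date(2024, 1, 15)),
        SovereigntyRule("model_training_logs", ["us-east-1"], date(2025, 6, 1)),
    ]
    print(overdue_rules(rules, today=date(2025, 9, 1)))  # -> ['customer_pii_storage']
```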
The new normal: This future isn't a distant dream, Natili said. Early pioneers faced steep hurdles, but today the decision to adopt confidential computing no longer requires a massive investment. The technology has matured, he explained. "Confidential computing will be the standard of computing in the next 10 years, and the learning curve isn't as steep as people think. An investment today is simply anticipating a future requirement for any major business. This level of security should be considered normal, not extraordinary."
But a technical foundation is just the start for Natili. Digital dignity by design is only effective when it's woven into the company culture, he concluded. "'By design' is more than code. It's culture. It means dedicating time to training, creating resources, and running hackathons so your teams can experiment. That is how you ingrain the importance of digital dignity into the mind of every person in the organization."