
As we kick off a new year, many have asked for my predictions around privacy, especially as organizations’ use of AI and cloud-based data processing has skyrocketed of late. My answer centers on what I believe will be a huge change in how privacy and confidential data are perceived.
Public blockchains have long promised transparency and trust, but their openness has also made them impractical for real-world applications like payroll, identity, and institutional finance – a dilemma I like to call the “transparency paradox.”
Over the coming year, however, this trade-off – which I believe to be the single biggest barrier to mainstream adoption – will begin to be solved, moving confidentiality from a specialist topic to a board-level requirement.
What’s special about 2026?
When it comes to why I believe this shift will happen this year, and not just ‘sometime in the future’, there are many factors at play. But the first is the tech: recent and significant advances in Fully Homomorphic Encryption (FHE) finally make confidential blockchains practical at scale.
FHE allows computation on encrypted data without ever exposing the underlying information. In effect, the blockchain equivalent of HTTPS, or “HTTPZ”, is now materializing.
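To make that concrete, here is a minimal, hypothetical sketch in Python. It uses the Paillier cryptosystem as a stand-in, a far simpler, additively homomorphic cousin of FHE (it supports only addition on ciphertexts, and the toy key sizes below are wildly insecure), but it shows the core trick that FHE generalizes: a party holding only encrypted values can compute on them without ever seeing the plaintext.

```python
# A minimal, hypothetical sketch of computing on encrypted data.
# Paillier (additively homomorphic) stands in for full FHE here;
# the tiny key sizes are wildly insecure and for illustration only.
import random
from math import gcd

def keygen(bits=64):
    # Toy prime generation via a Fermat test (demo quality only).
    def prime(b):
        while True:
            cand = random.getrandbits(b) | (1 << (b - 1)) | 1
            if all(pow(a, cand - 1, cand) == 1 for a in (2, 3, 5, 7, 11)):
                return cand
    p, q = prime(bits // 2), prime(bits // 2)
    n = p * q
    phi = (p - 1) * (q - 1)
    mu = pow(phi, -1, n)  # modular inverse (Python 3.8+); valid since g = n + 1
    return n, (phi, mu, n)  # public key, private key

def encrypt(n, m):
    # Ciphertext: (n + 1)^m * r^n mod n^2, with a random blinding factor r.
    r = random.randrange(2, n)
    while gcd(r, n) != 1:
        r = random.randrange(2, n)
    return pow(n + 1, m, n * n) * pow(r, n, n * n) % (n * n)

def decrypt(key, c):
    phi, mu, n = key
    # L(c^phi mod n^2) * mu mod n recovers the plaintext.
    return (pow(c, phi, n * n) - 1) // n * mu % n

pub, priv = keygen()
a, b = encrypt(pub, 20), encrypt(pub, 22)
# Anyone holding only the public key can add the plaintexts by
# multiplying ciphertexts, without ever seeing 20, 22, or the sum.
encrypted_sum = a * b % (pub * pub)
print(decrypt(priv, encrypted_sum))  # -> 42
```

Full FHE schemes extend this to arbitrary computation at far greater cost, but the trust model is the same: the machine doing the work never sees the data.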
However, while the tech being ready is certainly necessary, it’s not sufficient on its own. What flips privacy from nice-to-have to non-negotiable in 2026 is the convergence of several forces at once:
AI tools are moving from experimentation to production: In 2024–2025, many AI deployments were still pilots or limited-scope tools. In 2026, AI is already embedded into core workflows, including pricing, decisioning, R&D, legal, healthcare, and finance. Once AI touches core IP and regulated data, privacy stops being optional.
Boards are becoming directly accountable for AI risk: AI governance is moving from abstract policy to fiduciary responsibility. High-profile failures – data leaks, model inversion attacks, regulatory enforcement, or AI misuse scandals – will act as forcing events. Boards won’t ask “is privacy nice to have?” but “can we prove data never leaked?”
Large buyers are resetting procurement standards: As a few major enterprises and public-sector actors set privacy-preserving architectures as default requirements, the market will tip. This is how security certifications and cloud standards became mandatory: once a handful of large players demand it, the rest of the ecosystem follows quickly.
Competitive pressure, not regulation alone: This shift won’t be driven only by regulation. Companies will see competitors moving faster, accessing better data, and closing deals they can’t. Privacy becomes a revenue enabler, not just a risk control.
That’s why 2026 matters: it’s the point where expectations catch up with capability, and where the cost of not having privacy-by-design becomes visible, measurable, and strategic. But not all will benefit.
Who’s likely to get left out
The parts of the ecosystem most likely to misread this shift are not the obvious “privacy-hostile” actors, but those who believe they are already doing enough.
This includes platform-led companies that conflate openness with trust. Some players – especially in AI and data platforms – will continue to assume that transparency, open weights, or open APIs are sufficient to generate trust.
They’ll underestimate how quickly customers are learning to distinguish inspectability from confidentiality. Openness can help auditability, but it does nothing to protect sensitive inputs once data is shared. In some cases, it actively destroys privacy.
These companies will be surprised when enterprise buyers push back, not on model quality, but on data exposure risk.
Equally, companies that assume privacy can be retrofitted – treating it as an architectural “layer” they can add later through policy, access controls, or contractual guarantees – will likely find themselves locked out of higher-value use cases (regulated data, cross-entity collaboration, sensitive IP) simply because their foundations aren’t credible.
This in fact mirrors how security was once treated (and still is in many industries). The mistake is assuming privacy is a bolt-on: confidential computing, encrypted ML, and privacy-preserving inference fundamentally shape system design, so retrofitting privacy later is expensive, brittle, and often incomplete.
Any vendors who misread today’s relative quiet as a lack of demand are also at risk of being left behind. In reality, demand for privacy is often suppressed: buyers don’t ask for what they assume is impossible. As soon as viable solutions exist, expectations reset extremely quickly.
We’ve seen this pattern with cloud security, zero trust, and now AI governance. So, by the time demand is explicit, late movers will already be disqualified.
When privacy becomes table stakes
For those that do take privacy seriously early on and build it into their products from day one, I see a different story unfolding.
They’ll have access to richer, more sensitive, higher-signal data because customers trust them with it, while others will be stuck training and operating on thinner, sanitized, or synthetic datasets.
They’ll also benefit from speed of deployment, especially in sensitive environments, thanks to fewer legal reviews, bespoke controls, and internal vetoes. When this happens, time-to-value becomes a real differentiator.
Finally, there’ll be a shift in the depth of integration and collaboration, because privacy-preserving systems unlock collaboration across organizational boundaries (partners, suppliers, jurisdictions) that was previously impossible. This won’t just improve margins; it will drastically expand addressable markets.
