For more than a decade, multi-cloud has been sold as an escape hatch.
Freedom from lock-in.
Resilience through diversity.
Leverage over hyperscalers.
On paper, it made sense. In practice, many organisations quietly discovered that spreading workloads across multiple clouds didn’t reduce risk; it relocated it.
In a recent Smarter Strategic Thinking conversation that included perspectives from Vawlt, the discussion focused less on cloud choice and more on an uncomfortable reality: data strategy never caught up with cloud ambition.
Cloud platforms excel at abstraction. Storage, compute, and services are neatly packaged, easy to deploy, and deceptively simple to scale.
What they don’t solve is coordination.
As organisations adopt multiple providers, data quickly becomes fragmented across environments.
Each environment works — until data needs to move.
At that moment, multi-cloud stops being strategic and starts becoming expensive.
Cloud storage costs are often framed around price per terabyte, but that framing misses the real issue. The costs that destabilise budgets aren’t the at-rest charges; they’re movement costs: egress fees, cross-cloud replication, and retrieval.
These costs aren’t always visible upfront. They emerge gradually, often triggered by backup, analytics, compliance, or recovery workflows that span providers.
By the time organisations notice, they’re already locked into behaviours that are difficult to unwind.
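The imbalance between at-rest and movement costs can be made concrete with a back-of-the-envelope calculation. The sketch below is illustrative only: the per-terabyte prices are hypothetical placeholders, not any provider’s list prices.

```python
# Illustrative cost split: prices below are assumed placeholders, not real list prices.
STORAGE_PER_TB_MONTH = 20.0   # assumed $/TB per month for data at rest
EGRESS_PER_TB = 90.0          # assumed $/TB for cross-cloud data movement

def monthly_cost(stored_tb: float, moved_tb: float) -> dict:
    """Split one month's bill into an at-rest component and a movement component."""
    at_rest = stored_tb * STORAGE_PER_TB_MONTH
    movement = moved_tb * EGRESS_PER_TB
    return {"at_rest": at_rest, "movement": movement, "total": at_rest + movement}

# A 100 TB dataset backed up weekly across clouds (4 full copies moved per month):
bill = monthly_cost(stored_tb=100, moved_tb=400)
# Under these assumed prices, movement dwarfs storage.
```

Even with made-up numbers, the shape of the result holds: a workflow that repeatedly moves data between providers can cost many times more than simply storing it, which is why these bills surface through backup, analytics, and recovery patterns rather than on the storage line item.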
One of the more telling observations in the discussion was that cloud hasn’t eliminated data gravity. It’s intensified it.
As datasets grow, they become slower and more expensive to move, and workloads increasingly have to come to the data rather than the other way around.
The irony of multi-cloud is that while compute is increasingly portable, data remains stubbornly anchored.
This creates a paradox: organisations adopt multi-cloud for flexibility, but end up constrained by the very data they were trying to free.
Security models often assume clear boundaries: trusted environments, controlled access paths, defined ownership.
Multi-cloud erodes those assumptions.
Data replicated across providers introduces overlapping access paths, duplicated copies, and blurred ownership.
When breaches occur, understanding what data was exposed and where becomes a forensic exercise rather than a known state.
The conversation highlighted that security failures in multi-cloud environments are rarely caused by a single weakness. They emerge from loss of visibility across fragmented systems.
Despite years of decentralisation, there’s a noticeable shift underway.
Not toward single-cloud but toward centralised data control layered above cloud infrastructure.
This doesn’t mean abandoning cloud providers. It means decoupling data governance, resilience, and cost control from the underlying platforms.
In this model, governance, resilience, and cost control live in a layer above the providers, while the platforms beneath remain interchangeable.
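One way to picture a control layer above the clouds is a provider-agnostic storage interface with thin adapters per provider. The sketch below is hypothetical: the names (`BlobStore`, `ControlLayer`, `put`/`get`) are illustrative, not any vendor’s API, and a real adapter would wrap a provider SDK rather than a dictionary.

```python
# Hypothetical sketch of a data control layer sitting above interchangeable providers.
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Provider-agnostic storage interface; each cloud gets a thin adapter."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(BlobStore):
    """Stand-in adapter for the sketch; a real one would wrap a provider SDK."""
    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}
    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data
    def get(self, key: str) -> bytes:
        return self._data[key]  # raises KeyError if absent

class ControlLayer:
    """Resilience policy lives here, above any single provider:
    writes replicate to every backend; reads fall through to the first that has the key."""
    def __init__(self, stores: list[BlobStore]) -> None:
        self.stores = stores
    def put(self, key: str, data: bytes) -> None:
        for store in self.stores:   # replicate across providers
            store.put(key, data)
    def get(self, key: str) -> bytes:
        for store in self.stores:   # tolerate a missing or failed backend
            try:
                return store.get(key)
            except KeyError:
                continue
        raise KeyError(key)
```

The point of the sketch is the decoupling: replication and fallback policy are decided once, in the layer, so swapping one backend for another changes an adapter, not the governance model.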
It’s a subtle shift but a consequential one.
The most important insight from the discussion wasn’t about technology. It was about framing.
The question is no longer:
“Which clouds should we use?”
It’s now:
“How do we control data across clouds without inheriting their complexity?”
That shift reframes multi-cloud from a procurement decision into an architectural one, and it exposes why many early implementations are being quietly reworked.
Multi-cloud isn’t going away. But its role is changing.
Organisations that succeed won’t be those with the most providers, the lowest list prices, or the widest service portfolios.
They’ll be the ones that keep governance, cost visibility, and resilience under their own control, independent of any single provider.
The future of multi-cloud won’t be defined by cloud platforms themselves but by how effectively organisations regain control over the data that flows between them.
This article draws from a discussion on the Smarter Strategic Thinking podcast, where Fortuna Data examines how data architecture, risk, and cost are evolving beneath modern cloud strategies.
This article is based on the full discussion with Panzura on the Smarter Strategic Thinking podcast.