Strategies for a Soft Landing After the AI Bubble Bursts


Jan van Boesschoten

Love is in the air
Everywhere I look around
Love is in the air
Every sight and every sound
And I don’t know if I’m being foolish
Don’t know if I’m being wise

Just as John Paul Young could sense that love was in the air in 1977, so too can we sense that the bursting of the AI bubble is brewing, right now, right here. And when we talk about sensing and feeling, our brains have already reached a conclusion based on hunches, signals broadcast in the media, and half-baked rational thought. So let’s first ground these signals and separate them from the noise before drawing conclusions.

The bubble

For a bubble, you need money. Let’s take a look at the deals OpenAI, the flagship of the LLM race, closed with major tech companies over the course of 2025.

  • Microsoft (October 2025)
    Investment: $135 billion
    In return: 27% equity stake and a $250 billion Azure cloud spending commitment
  • Amazon (November 2025)
    Investment: $38 billion
    In return: $38 billion AWS cloud infrastructure contract
  • Nvidia (September 2025)
    Investment: up to $100 billion
    In return: up to $100 billion in multi-year GPU procurement
  • Oracle (September 2025)
    Investment: up to $300 billion
    In return: up to $300 billion Stargate hyperscale compute contract
  • Broadcom (September 2025)
    Investment: $10 billion
    In return: $10 billion custom AI chip manufacturing partnership
  • Intel (September 2025)
    Investment: $25 billion
    In return: $25 billion CPU/systems partnership supporting OpenAI’s datacenter buildout
  • CoreWeave (Q1 2025)
    Investment: $11.9 billion
    In return: $11.9 billion in cloud GPU contracts

We see total investment commitments of roughly $620 billion (comparable to Germany’s federal budget for 2026), against roughly $735 billion in spending committed in return. OpenAI’s revenue grew from $4 billion in 2024 to $13 billion by the end of 2025. You could call that exponential growth; however, costs went from $7 billion in 2024 to an estimated $20 billion in 2025. And OpenAI plans to invest $1.4 trillion in long-term infrastructure deals for compute, energy, and data centres over the next eight years.
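For transparency, here is a minimal back-of-the-envelope tally in Python of the deal figures listed above (all amounts in billions of US dollars, taken straight from the bullet list; several are “up to” amounts, so the totals are rough upper bounds):

    # Headline figures from the bullet list above, in billions of US dollars.
    deals = {
        "Microsoft": {"invested": 135,  "returned": 250},  # Azure spending commitment
        "Amazon":    {"invested": 38,   "returned": 38},
        "Nvidia":    {"invested": 100,  "returned": 100},
        "Oracle":    {"invested": 300,  "returned": 300},
        "Broadcom":  {"invested": 10,   "returned": 10},
        "Intel":     {"invested": 25,   "returned": 25},
        "CoreWeave": {"invested": 11.9, "returned": 11.9},
    }

    total_in = sum(d["invested"] for d in deals.values())
    total_out = sum(d["returned"] for d in deals.values())
    print(f"Investment commitments: ~${total_in:.0f}B")   # ~620
    print(f"Spending flowing back:  ~${total_out:.0f}B")  # ~735

    # Revenue versus cost trajectory mentioned above (2025 costs are an estimate).
    revenue = {2024: 4, 2025: 13}
    costs = {2024: 7, 2025: 20}
    for year in (2024, 2025):
        print(f"{year}: revenue ${revenue[year]}B, costs ${costs[year]}B, "
              f"gap ${costs[year] - revenue[year]}B")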

As I am not a financial expert, and these deals involve more than simple cash-for-equity, I should probably shut up now. Still, it looks to me as if John is being robbed to pay Paul here. If you invest $100 billion in a company that will in turn buy $100 billion of your goods, you give up $100 billion in alternative, potentially more diversified sales. If the company can absorb that, fine. However, the investment might also be used to leverage other financial obligations instead of buying the agreed assets. That concern is amplified by reporting in the Financial Times that OpenAI is, to put it mildly, not up to date with its bookkeeping. On top of that, there may not be enough capacity to build the data centres needed to feed OpenAI’s appetite for computing power and energy. If it works out, it works out. But there is a plausible scenario in which the US tech sector is hit hard, very hard. Most major tech companies are locked in a hectic tango with OpenAI, and when the tech sector takes a hit, the US economy could face severe repercussions, given the sector’s outsized share of market value.

The Magnificent 7 (Apple, Microsoft, Amazon, Nvidia, Alphabet, Meta, and Tesla) are the top seven companies in the S&P 500, with a combined market capitalisation of $20.8 trillion, roughly a third of the S&P 500’s total value. Together, they invested an estimated $360 billion in AI in 2025 alone (Apple: $5 billion, Microsoft: $88 billion, Amazon: $100 billion, Nvidia: $16 billion, Alphabet: $75 billion, Meta: $72 billion, Tesla: $4 billion). Although that is only 1.7% of their combined market value, it is a huge single-year bet: roughly 80% of their total estimated 2025 income. Financial structures this complex have ended in bursting bubbles before, as the 2008 financial crisis showed. So it is safe to say: “Houston, we might have a problem.”
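As a quick sanity check on those percentages, a small sketch using only the figures quoted above (the implied combined income is derived from the 80% figure, not an independent estimate):

    # Magnificent 7 AI spending in 2025, in billions of US dollars (estimates above).
    ai_spend = {
        "Apple": 5, "Microsoft": 88, "Amazon": 100, "Nvidia": 16,
        "Alphabet": 75, "Meta": 72, "Tesla": 4,
    }
    combined_market_cap = 20_800  # ~$20.8 trillion

    total_spend = sum(ai_spend.values())              # 360
    share_of_cap = total_spend / combined_market_cap  # ~0.017
    implied_income = total_spend / 0.8                # ~450, implied by the 80% figure

    print(f"Total 2025 AI spend:   ~${total_spend}B")
    print(f"Share of combined cap: {share_of_cap:.1%}")       # ~1.7%
    print(f"Implied 2025 income:   ~${implied_income:.0f}B")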

Might have, because this ever-accelerating train might still change course before it plunges into the abyss. But it could also be that the hype and expectations simply don’t meet business reality: adoption of AI turns out slower than expected, while new data centres, infrastructure, and energy become increasingly scarce. When those bottlenecks bite, a tidal wave of trouble will follow. And as with every wave, if you catch it at the right time, it will carry you safely to shore. How?

Soft landing

One thing is sure: AI is here to stay, with or without an AI bubble bursting. The dot-com crash in 2000 didn’t wipe out the internet, and an AI bubble burst will not wipe out AI. Still, bracing for impact is recommended. If we peel back the layers, AI is essentially data, compute power, algorithms, and transport. Today, AI is almost synonymous with large language models: a conversational interface to models trained on internet-scale data and running on vast amounts of compute power.

Yet, by some estimates, only about 5% of the world’s digital data is accessible or shared; most of it sits in walled gardens, waiting for the gate to open. Why that gate remains closed is not the focus of this post. For now, AI is used mainly in office settings to generate text, code, presentations, images, and short videos. Search is another major use case, especially with the release of Google’s Gemini 3 and its deep integration into search.

Assistant-style LLMs have now found a place in day-to-day life as the interface layer to AI. That, in turn, clears the way for the most troubling route to making AI commercially viable: advertising, especially now that Gemini 3 is woven directly into Google search. If that trend continues, you will want to limit the role of these ad-incentivised systems in your core workflows and data pipelines. It is crucial to remember that the answers a conversational interface gives about your data are driven by statistical pattern-matching and, in ad-funded systems, by advertising and engagement objectives, not by any neutral notion of ‘truth’.

In that context, investing in and working with open-source or open-weight LLMs, such as Mistral, is a strong alternative to relying solely on commercial models. From a European sovereignty perspective, it makes sense to build on Mistral and other European or community-driven models in any case. And when the current bubble bursts, community-supported LLMs are likely to offer more continuity than products from a distressed company facing lawsuits and angry investors because the investment conditions that sustained it are no longer met.
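To make that concrete, here is a minimal sketch of running an open-weight model locally instead of calling a commercial API. It assumes the Hugging Face transformers library (and PyTorch) is installed and that the machine can hold a 7B-parameter model; mistralai/Mistral-7B-Instruct-v0.2 is used purely as an example of an openly released Mistral model:

    # Sketch: load an open-weight Mistral model locally via the Hugging Face
    # `transformers` pipeline. The weights live on your own infrastructure, so
    # continuity does not depend on any single vendor's finances.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="mistralai/Mistral-7B-Instruct-v0.2",  # example open-weight model
        device_map="auto",  # place the model on available GPU(s) or CPU
    )

    prompt = "List three practical risks of depending on a single commercial AI vendor."
    output = generator(prompt, max_new_tokens=200, do_sample=False)
    print(output[0]["generated_text"])

The point is not this particular model but the design choice: the weights and the serving stack stay under your own control, whatever happens to any individual vendor.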

Effort that will not be wasted in the turmoil of an AI market shake-out is effort spent structuring, organising, and labelling your own data. Although LLMs are very good at extracting value from unstructured information, the real leverage comes when carefully selected data sources are deliberately linked with other datasets to uncover hidden patterns and relationships, for instance using methods such as regression, clustering, or graph-based analysis.
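As a minimal sketch of that idea, assume two small in-house tables that share a customer ID (the table and column names here are purely illustrative, not taken from any real schema): link them on the shared key, then cluster the combined features. It uses pandas and scikit-learn:

    # Sketch: deliberately link two (hypothetical) internal datasets on a shared
    # key, then cluster the combined features to surface groups that neither
    # source reveals on its own.
    import pandas as pd
    from sklearn.preprocessing import StandardScaler
    from sklearn.cluster import KMeans

    crm = pd.DataFrame({
        "customer_id": [1, 2, 3, 4, 5, 6],
        "monthly_spend": [120, 950, 60, 400, 870, 90],
    })
    support = pd.DataFrame({
        "customer_id": [1, 2, 3, 4, 5, 6],
        "support_tickets": [1, 9, 0, 4, 7, 2],
    })

    # Linking relies on well-structured, well-labelled data: a clean shared key.
    linked = crm.merge(support, on="customer_id")

    features = StandardScaler().fit_transform(
        linked[["monthly_spend", "support_tickets"]]
    )
    linked["segment"] = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(features)
    print(linked)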

And last but not least: take to heart the lessons learnt from the release of DeepSeek in January 2025. DeepSeek showed that a relatively low-cost, open-weight model could rival or surpass expensive frontier systems in reasoning and coding, and it forced many to rethink the assumption that state-of-the-art AI can only be achieved through massive investments in capital, data, computational power, and infrastructure. That means it is worth stopping, looking back, rethinking, redesigning, and only then moving forward; there is often more wisdom in past cycles and hard-won experience than in today’s hopes of global AI dominance.

