Inside the AI Economy: What 5 Industry Leaders Revealed at the Milken Global Conference:
Google Cloud’s $460B Backlog: The Massive Gap Between AI Demand and Reality.
AI infrastructure, chip shortages, energy constraints, autonomous systems, and the future of work — decoded by the architects building it all.
Introduction: The People Shaping the AI Supply Chain:
The AI revolution is no longer just a story of software. At the 2025 Milken Global Conference in Beverly Hills, five leaders representing every layer of the AI economy gathered to pull back the curtain on the real challenges — and opportunities — shaping the future of artificial intelligence. From silicon fabrication to autonomous vehicles, from orbital data centers to entirely new AI architectures, the conversation was candid, urgent, and revealing.
The panelists included: Christophe Fouquet (CEO, ASML), Francis deSouza (COO, Google Cloud), Qasar Younis (CEO, Applied Intuition), Dmitry Shevelenko (Chief Business Officer, Perplexity), and Eve Bodnia (Founder, Logical Intelligence).
Together, they painted a vivid picture of where the AI economy stands today — and where its most critical pressure points lie.
The AI Chip Shortage: Why Supply Cannot Keep Up with Demand:
The single most pressing constraint in the AI industry isn't imagination — it's manufacturing. Christophe Fouquet of ASML, the Dutch company that holds a global monopoly on the extreme ultraviolet (EUV) lithography machines that make modern chips possible, was direct: despite massive acceleration in chip manufacturing, he believes the market will remain supply-limited for "the next two, three, maybe five years." That means hyperscalers — Google, Microsoft, Amazon, Meta — will not receive all the AI chips they are paying for, no matter how large the check.
Google Cloud's numbers tell the demand story starkly. Francis deSouza highlighted that Google Cloud revenue crossed $20 billion last quarter, growing 63%, while its committed-but-undelivered backlog nearly doubled in a single quarter — from $250 billion to $460 billion. "The demand is real," deSouza noted with striking composure. The gap between what enterprises want and what the AI infrastructure can deliver has never been wider.
For physical AI companies like Applied Intuition, the bottleneck is different but equally stubborn. CEO Qasar Younis, whose company builds autonomy systems for autonomous vehicles, drones, mining equipment, and defense platforms, said his limiting factor isn't silicon — it's real-world data.
"You have to find it from the real world," he explained. No amount of synthetic simulation fully replaces the unpredictability of physical environments. "There will be a long time before you can fully train models that run on the physical world synthetically." This is the hard truth facing every robotics and autonomous systems company today.
The AI Energy Crisis: From Data Centers to Orbital Infrastructure:
Behind every chip shortage lies an energy crisis waiting to intensify. As AI compute scales exponentially, so does its demand for power — and the industry is running out of easy answers. Francis deSouza revealed that Google is actively exploring data centers in space as a genuine response to energy constraints. The logic: in orbit, you get access to more abundant, uninterrupted solar energy.
The engineering challenges are formidable, but not insurmountable. deSouza noted that space is a vacuum — which eliminates convection cooling entirely. In orbit, radiation becomes the only mechanism for shedding heat, a much slower and harder-to-engineer process than the air and liquid cooling systems today's data centers rely on. Yet Google is treating orbital infrastructure as a legitimate long-term path — a sign of just how serious the energy problem has become.
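To make the cooling constraint concrete, here is a back-of-envelope sketch using the Stefan-Boltzmann law, the physics behind radiative heat rejection. The power level, radiator temperature, and emissivity below are illustrative assumptions, not Google's figures:

```python
# Back-of-envelope: radiator area needed to reject waste heat in vacuum,
# where radiation (Stefan-Boltzmann law) is the only rejection mechanism.
SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W / (m^2 K^4)

def radiator_area_m2(heat_w: float, radiator_temp_k: float, emissivity: float = 0.9) -> float:
    """Area of an ideal flat radiator needed to shed `heat_w` watts.

    Assumes radiation to deep space (~3 K background, negligible) and a
    uniform radiator temperature; real designs need margins for sun-facing
    orientation, coolant loops, and surface degradation.
    """
    flux = emissivity * SIGMA * radiator_temp_k ** 4  # W shed per m^2
    return heat_w / flux

# An illustrative 1 MW compute module with radiators running at 330 K (~57 °C):
area = radiator_area_m2(1_000_000, 330.0)
print(f"{area:,.0f} m^2 of radiator")  # on the order of ~1,650 m^2
```

Even under these generous assumptions, a single megawatt of compute needs radiator area on the order of a quarter hectare — a rough illustration of why shedding heat by radiation alone is so much harder to engineer than terrestrial air or liquid cooling.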
On the efficiency side, Google's vertically integrated AI stack offers a competitive moat. By co-engineering its custom TPU chips alongside its AI models, Google achieves dramatically better performance-per-watt than companies relying on off-the-shelf components. As deSouza put it: "Running Gemini on TPUs is much more energy efficient than any other configuration," because chip designers know what's coming before the model ships.
ASML's Fouquet added an important economic reality check: "Nothing can be priceless." The AI industry is in a moment of extraordinary capital expenditure driven by strategic necessity. But more compute means more energy — and more energy has a cost that the market will eventually have to reckon with.
Beyond Large Language Models: Energy-Based AI and the Architecture Question:
While the mainstream AI industry debates scale, efficiency, and inference optimization within the large language model (LLM) paradigm, Eve Bodnia is building something categorically different. Her company, Logical Intelligence — whose technical research board is chaired by former Meta chief AI scientist Yann LeCun — is built on energy-based models (EBMs), a class of AI that doesn't predict the next token in a sequence, but instead attempts to understand the underlying rules of data.
Bodnia argues that language itself is simply a user interface — not the substrate of intelligence. "Language is a user interface between my brain and yours. The reasoning itself is not attached to any language." Her largest model runs at just 200 million parameters — a tiny fraction of the hundreds of billions used in leading LLMs — yet she claims it runs thousands of times faster and can update its knowledge dynamically without requiring full retraining.
The practical implications are particularly significant for physical AI applications. "When you drive a car, you're not searching for patterns in any language. You look around you, understand the rules about the world around you, and make a decision." For chip design, robotics, and autonomous systems — domains where systems must grasp physical rules rather than linguistic patterns — Bodnia argues EBMs are the more natural and efficient fit. As the AI field begins to ask whether scaling LLMs alone is sufficient, this alternative architecture is likely to attract growing attention.
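As a toy sketch of the contrast Bodnia describes — purely illustrative, and not Logical Intelligence's actual architecture — an energy-based model assigns a scalar "energy" to whole configurations and treats inference as energy minimization, rather than predicting the next token in a sequence:

```python
import numpy as np

# Toy energy-based model: score candidate outputs by how well they obey a
# learned rule, then pick the lowest-energy (most compatible) candidate,
# instead of sampling tokens autoregressively.

def energy(x: np.ndarray, y: np.ndarray, W: np.ndarray) -> float:
    """Scalar compatibility score: low energy means (x, y) obey the rule W."""
    return float(np.sum((y - W @ x) ** 2))

def predict(x: np.ndarray, candidates: list, W: np.ndarray) -> np.ndarray:
    """Inference as optimization: choose the candidate with minimal energy."""
    return min(candidates, key=lambda y: energy(x, y, W))

# Suppose the "rule of the world" the model has captured is y = 2x:
W = 2.0 * np.eye(2)
x = np.array([1.0, 3.0])
candidates = [np.array([2.0, 6.0]), np.array([6.0, 2.0]), np.array([0.0, 0.0])]
best = predict(x, candidates, W)
print(best)  # [2. 6.] — the configuration consistent with the rule
```

The point of the sketch is the inference style: nothing here depends on language or token order, which is why proponents argue this family of models maps more naturally onto physical-world tasks.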

AI Agents and Enterprise Trust: The Guardrails That Matter:
Perplexity has quietly evolved from an AI-powered search engine into what it now calls a "digital worker." Dmitry Shevelenko described the vision compellingly: "Every day you wake up and you have a hundred staff on your team. What are you going to do to make the most of it?" Perplexity Computer, the company's newest product, is designed not as a tool a knowledge worker uses, but as a workforce a knowledge worker directs.
But with autonomous AI agents acting inside corporate systems, trust and control become paramount. Shevelenko's answer is granularity: enterprise administrators can specify not just which tools and connectors an agent can access, but whether those permissions are read-only or read-write — a distinction with enormous consequences in live business environments. When Comet, Perplexity's computer-use agent, takes actions on behalf of a user, it presents a plan and requests approval first. Some users find this friction unnecessary. Shevelenko doesn't.
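The read-only versus read-write distinction can be sketched as a minimal permission model. All names here are hypothetical illustrations, not Perplexity's actual API:

```python
from enum import Enum

class Access(Enum):
    NONE = 0        # connector not granted at all
    READ_ONLY = 1   # agent may fetch data but not change anything
    READ_WRITE = 2  # agent may take actions in the live system

class AgentPolicy:
    """Hypothetical per-agent permission table: connector name -> access level."""

    def __init__(self, grants: dict):
        self.grants = grants

    def can(self, connector: str, write: bool) -> bool:
        """Deny by default; writes require an explicit READ_WRITE grant."""
        level = self.grants.get(connector, Access.NONE)
        if level is Access.NONE:
            return False
        return level is Access.READ_WRITE or not write

# An admin grants read access to the CRM but write access only to calendars:
policy = AgentPolicy({"crm": Access.READ_ONLY, "calendar": Access.READ_WRITE})
print(policy.can("crm", write=False))    # True  — agent may read records
print(policy.can("crm", write=True))     # False — but not modify them
print(policy.can("email", write=False))  # False — no grant, deny by default
```

The deny-by-default posture is the crux: an agent's blast radius is bounded by what an administrator explicitly granted, which is exactly the "granularity" framing Shevelenko advocates.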
His perspective was sharpened by joining the board of Lazard, the 180-year-old financial advisory firm. There, he found himself unexpectedly sympathetic to the conservative instincts of a Chief Information Security Officer protecting a brand built entirely on client trust. His conclusion: "Granularity is the bedrock of good security hygiene." For enterprises considering AI agent adoption, this framing — control as a feature, not a bug — may be the most important insight of the entire panel.
Physical AI and National Sovereignty: The Geopolitical Dimension:
Physical AI is introducing geopolitical tensions that digital AI never encountered. Applied Intuition's Qasar Younis offered the panel's most geopolitically charged observation: unlike the internet, which spread as American software and faced pushback only at the application layer (ride-sharing apps, delivery platforms), autonomous physical systems manifest inside a nation's borders in ways governments cannot ignore.
The stakes couldn't be clearer. Autonomous vehicles, defense drones, mining equipment, and agricultural machines raise immediate questions about safety, data sovereignty, and foreign control of critical infrastructure.
As Younis put it: "Almost consistently, every country is saying: we don't want this intelligence in a physical form in our borders, controlled by another country." Notably, fewer nations today can field an operational robotaxi than possess nuclear weapons — a striking metric for how early-stage, and how contested, physical AI deployment still is.
On the semiconductor front, ASML's Fouquet provided a measured but significant point about China's AI ambitions. While DeepSeek's release earlier this year rattled parts of the industry, China's progress remains constrained below the model layer.
Without access to EUV lithography machines, Chinese chipmakers cannot manufacture the world's most advanced semiconductors — and models built on older hardware face a compounding performance disadvantage no software innovation can fully overcome. "Today, in the United States, you have the data, you have the computing access, you have the chips, you have the talent. China does a very good job on the top of the stack, but is lacking some elements below," Fouquet said.
AI and the Future of Work: Critical Thinking, Labor, and Human Agency:
The inevitable question came near the end of the session: will AI erode the next generation's capacity for critical thinking? The answers were optimistic — as one might expect from people who have staked their careers on this technology — but they were also substantive.
Google Cloud's deSouza pointed to the scale of problems that more powerful AI tools might finally help humanity solve. Neurological diseases whose biological mechanisms remain poorly understood. Greenhouse gas removal at meaningful scale. Long-deferred upgrades to aging grid infrastructure. "This should unleash us to the next level of creativity," he said — framing AI not as a replacement for human intellect, but as an amplifier of it.
Perplexity's Shevelenko made a more pragmatic, democratizing argument. While entry-level jobs may be disappearing, the ability to launch a business, pursue an idea, or build something independently has never been more accessible. His framing was pointed: "The constraint is your own curiosity and agency." The tools exist. The question is whether individuals will have the initiative to use them.
Applied Intuition's Younis drew perhaps the sharpest distinction of the discussion. The average American farmer is now 58 years old. Labor shortages in mining, long-haul trucking, and agriculture are chronic and deepening — not because wages are insufficient, but because people simply don't want those jobs. In these domains, physical AI isn't displacing willing workers. It's filling a void that demographic trends have already created, and one that will only widen in the decades ahead.
Conclusion: The Real State of the AI Economy:
What emerges from this conversation is a picture of an industry that is simultaneously more constrained and more consequential than most coverage suggests. The AI economy is not limited by ambition or capital — it is limited by chips, by energy, by real-world data, and by the hard physics of building at planetary scale.
**The most important takeaways from Milken 2025:** AI chip supply will remain constrained for years. Energy is the next major bottleneck — possibly requiring orbital data centers. Alternative AI architectures like energy-based models may challenge the LLM paradigm. Enterprise AI agents require granular trust and control frameworks. Physical AI is geopolitically charged in ways digital AI never was. And the labor market impact will be as much about filling gaps as displacing workers.
The architects of the AI economy are building under real constraints. Understanding those constraints is the first step toward understanding where the next opportunities — and the next disruptions — will come from.




