Davos 2026
Following this year’s World Economic Forum Annual Meeting in Davos, I spent considerable time reviewing panel discussions and supplementary commentary, which gave me a perspective that would otherwise remain elusive. While Davos isn’t a platform for direct policy creation, it works as a leading indicator, illuminating where capital, capabilities, and political will are beginning to align, often before formal consensus crystallizes.
This year, that convergence was again, unsurprisingly, centered on artificial intelligence. What distinguished the discussions, however, was a shift in focus. The technologists and builders of these systems devoted far less attention to the models themselves (their benchmarks, emergent behaviors, and raw capabilities) and instead prioritized the physical infrastructure that supports those systems.
Going further, the dialogue moved from contemplating what AI can achieve to scrutinizing what is necessary to sustain it. The question was no longer only about innovation but about foundational prerequisites: whether the world is progressing swiftly and prudently enough to meet the demands of this transformative era.
Top industry leaders exemplified this shift. Satya Nadella emphasized grid capacity and data centre expansion timelines over immediate model capabilities. Jensen Huang prioritized supply chain resilience for semiconductors. Meanwhile, Demis Hassabis and Dario Amodei, despite their divergent commercial and philosophical perspectives, repeatedly underscored physical infrastructure, energy availability, and the complex, multilayered technology stack required for scalable deployment.
Davos 2026 marked the transition: artificial intelligence is no longer constrained by algorithmic progress, but by infrastructure, energy, and geopolitics.
Data Centres and the Shift to Infrastructure
When AI was primarily a research problem, the relevant actors were labs, universities, and a handful of well-capitalised startups. Now that it has become an infrastructure problem, the actors have changed. Utilities, sovereign wealth funds, semiconductor foundries, and anyone who controls land, water rights, or permitting authority are now as central to AI's trajectory as the engineers writing the code, or, more accurately now, refining the code that AI outputs.
The competitive dynamic is no longer about who has the best model but about who controls the territory those models run on.
Control of compute capacity has become analogous to control of oil reserves in the 20th century.
The Requirement of Ample Value Distribution
The second pattern running through Davos 2026 was even harder to miss. Most optimistic growth projections came with an implicit caveat: productivity gains from AI are not a bonus but a necessity. The emerging consensus was not just that AI can generate a tremendous surplus in economic value, but that it must do so in ways that are visible, measurable, and broadly distributed; otherwise, the industry risks losing the societal mandate to continue scaling. Satya Nadella articulated this most directly, but the concern was raised broadly: if the benefits of AI accrue primarily to technology firms while the costs (energy consumption, environmental strain, labor displacement) are externalized across communities and public systems, public and political tolerance for continued expansion will erode.
The exchange below, on AI diffusion, captures this concern. Larry Fink, CEO of BlackRock (not to be confused with Blackstone, which owns QTS, one of the world's largest data centre operators), asked Microsoft CEO Satya Nadella: “Can you describe how this process of diffusion across economies, across companies, across people, and countries? How does that play out?”
“The zeitgeist is a little bit about the admiration for AI in its abstract form or as technology. But I think we, as a global community, have to get to a point where we are using it to do something that changes the outcomes of people and communities and countries and industries,” Nadella said. “Otherwise, I don’t think this makes much sense, right? In fact, I would say we will quickly lose even the social permission to actually take something like energy, which is a scarce resource, and use it to generate these tokens, if these tokens are not improving health outcomes, education outcomes, public sector efficiency, private sector competitiveness across all sectors, small and large. And that, to me, is ultimately the goal.”
Nadella went on to describe that process of diffusion. The costs, however, are real, and the leaders at Davos were, to their credit, not pretending otherwise. Data centre resource consumption has moved well beyond an environmental footnote. According to the International Energy Agency (IEA), global electricity demand from data centres is on course to more than double by 2030, reaching approximately 945 terawatt-hours, slightly more than Japan's entire current annual electricity consumption. In advanced economies specifically, data centres are projected to account for more than 20% of all electricity demand growth through 2030.
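To put the IEA projection in rough perspective, a quick back-of-envelope calculation shows the growth rate it implies. The 2024 baseline of roughly 415 TWh is my assumption for illustration, not a figure from the text; only the 945 TWh endpoint is cited above.

```python
# Back-of-envelope: annual growth rate implied by the IEA projection.
# baseline_2024_twh is an ASSUMED figure for illustration; the 945 TWh
# projection for 2030 is the number cited in the text.

baseline_2024_twh = 415.0   # assumed global data centre demand, 2024
projected_2030_twh = 945.0  # IEA projection cited above
years = 2030 - 2024

# Compound annual growth rate implied by the two endpoints
cagr = (projected_2030_twh / baseline_2024_twh) ** (1 / years) - 1
multiple = projected_2030_twh / baseline_2024_twh

print(f"Implied growth: {cagr:.1%} per year")
print(f"Growth over the period: {multiple:.2f}x")
```

Under that assumed baseline, demand would need to compound at roughly 15% a year, a rate few other grid-scale loads have ever sustained.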
Beyond electricity, the water demands of data centre cooling, driven by high-performance AI processors, present a compounding pressure, particularly acute in regions already navigating freshwater scarcity. AI data centres reportedly use more water in a year than the world drinks in bottled water. Treat such estimates with scepticism, however: they are difficult to verify because many tech companies do not publicly disclose the specific water usage of their AI operations.
You cannot build megastructures of this scale and simultaneously argue that their environmental and social footprint is a secondary concern. The leaders at Davos who understood this were the ones making the more credible case: insisting that the benefits must grow large enough, and diffuse widely enough, to justify the costs.
How to minimize these costs, however, is a question I intend to keep exploring.
Energy Constraints and the Return of Nuclear Power
The energy problem at Davos 2026 also surfaced a conversation that would have seemed fringe only a few years ago: nuclear power, and the uranium supply chains that feed it. A dedicated nuclear session drew government representatives from the United States, the Czech Republic, India, and the United Kingdom. The consensus framing marked a departure from the renewables-first orthodoxy that has dominated energy policy discourse for a decade: nuclear not as a legacy technology to be managed down, but as the only dispatchable, low-emissions baseload capable of meeting AI's compounding, around-the-clock power demands.
The corporate sector had already begun acting on that conclusion well before the forum convened. Microsoft's twenty-year power purchase agreement for the restarted Three Mile Island reactor, capable of powering between fifteen and twenty hyperscale data centres at full consumption, had set the precedent, with Meta at the same time announcing the expansion of three nuclear facilities and the reopening of an Illinois reactor.
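As a rough sanity check on the "fifteen to twenty hyperscale data centres" claim, one can divide the reactor's output across that range. The ~835 MW nameplate figure for the Three Mile Island unit is my assumption, not a number from the text:

```python
# Rough sanity check: implied per-facility power draw if one reactor
# serves 15-20 hyperscale data centres. The ~835 MW nameplate output
# for the restarted Three Mile Island unit is an ASSUMED figure.

reactor_mw = 835.0                    # assumed reactor nameplate output
dc_count_low, dc_count_high = 15, 20  # range cited in the text

per_dc_high = reactor_mw / dc_count_low   # fewer facilities -> more each
per_dc_low = reactor_mw / dc_count_high

print(f"Implied draw per data centre: {per_dc_low:.0f}-{per_dc_high:.0f} MW")
```

Under that assumption, each facility draws on the order of 40-55 MW, which is broadly the scale commonly attributed to large hyperscale campuses, so the claim is at least internally plausible.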
In the margins of the forum, uranium developers were drawing a parallel to the battery-materials deals that preceded the EV boom. Their argument: the technology companies now committing hundreds of billions to data centre construction are under a structural obligation to secure the fuel supply that makes those investments viable, and the upstream uranium market has not yet priced in what that obligation will actually require.
The AI infrastructure race is quietly becoming a uranium story, and the companies that recognize that earliest will have secured a supply-chain advantage that no amount of software optimisation can replicate.
Electrical power, Elon Musk argued, is the single binding limit on AI deployment in the United States, and the trajectory is clear: very soon, the industry will be producing more chips than it can physically turn on. His pointed comparison was China, which is deploying over 100 gigawatts of solar capacity per year, building the energy foundation for AI infrastructure at a pace that US tariff policy and grid inertia currently make impossible to match.
Technological and Societal Integration
Alongside infrastructure and energy, a third constraint is becoming increasingly visible: governance.
The EU's AI Act moves from partial to broad applicability on August 2, 2026 — less than seven months out. Frontier labs are now operating inside a compression cycle: capabilities in coding, reasoning, and autonomous action are improving faster than institutions can adapt, but regulatory obligations are arriving on fixed deadlines regardless of readiness. The result is a timing problem that no one yet has a clean answer for.
Hassabis and Amodei both alluded to this dynamic, albeit in different ways: capability acceleration shortens the window for societal adjustment, but slowing down isn't an option when competitive dynamics and geopolitical pressure are both pushing in the other direction. Safety and governance are no longer abstract principles you address after the technology matures; they are live constraints you navigate while the technology is still moving.
How can infrastructure be developed amid contested spaces and tightening regulatory environments? What is the locus of value in an increasingly fragmented technological stack and an expanding political landscape? How does one govern a technology that outpaces legislative processes yet is vulnerable to public discontent?
Conclusion
Davos 2026 didn't provide definitive solutions to these pressing issues. Artificial intelligence is entering a phase where its trajectory will be determined not only by algorithmic breakthroughs, but by infrastructure capacity, energy availability, and societal acceptance. Competition is expanding, the resource requirements are intensifying, and the margin for delay is narrowing.
The urgency of these challenges demands immediate and sustained engagement, as the trajectory of increasingly powerful AI systems and societal integration hinges on our collective capacity to respond effectively.
As someone fascinated with artificial intelligence, I find it increasingly clear that we are advancing far into uncharted territory, unlike anything else in history. Alongside the shift in focus towards AI infrastructure, more powerful AI systems are on the horizon. A moment may arrive during this 5th Industrial Revolution, suddenly and without warning, as breakthroughs in AI research labs tend to, when an advance emerges that reshapes the structure of daily life itself, for better or for worse. History has shown how rapidly the world can transform, as it did at the onset of COVID-19; the trajectory of AI holds the potential for change of equal, if not greater, magnitude. The imperative, then, is preparedness.
While I do not claim to possess all the answers, I understand enough of human nature to recognize that our response to such moments will matter as much as the technology that precipitates them.
Through artificial intelligence, humanity will attain a velocity once reserved for gods, perhaps without first acquiring the discipline required to wield it.