The Future of AI’s Energy Appetite


The world typed 365 billion prompts into ChatGPT last year alone. For perspective, Google needed more than a decade to hit that figure. Such scale sparks unease: are we pouring too much energy and water into a digital convenience?

History offers perspective. Vaccines once demanded vast R&D budgets but transformed public health. Synthetic fertilisers consumed energy on a colossal scale, yet they underpinned half the world’s food supply. The same logic applies to AI: the way we design and scale it will determine whether it becomes a climate burden—or a solution.

Concerns about energy are not misplaced. In 2023, data centres, including those running AI, consumed roughly as much electricity as an entire country the size of Germany or France. Yet broken down, a single ChatGPT (GPT-4o) reply uses about 0.3 watt-hours, the energy needed to keep an LED bulb lit for five minutes. To match the emissions of a New York–London flight, you would need hundreds of thousands of prompts. Even AI's much-debated water use is modest against everyday benchmarks: tens of thousands of interactions consume roughly the water it takes to make one pair of jeans.
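The per-prompt figures above reduce to simple arithmetic. A quick sketch makes them concrete; the 3.6 W bulb wattage is an assumption chosen because it is what the article's five-minute figure implies, not a number from its sources:

```python
# Back-of-the-envelope check of the per-prompt energy claim.
PROMPT_WH = 0.3        # watt-hours per ChatGPT (GPT-4o) reply, per the article
LED_BULB_WATTS = 3.6   # a small LED bulb (assumption)

# How long the same energy keeps the bulb lit, in minutes:
bulb_minutes = PROMPT_WH / LED_BULB_WATTS * 60
print(f"One reply ~ {bulb_minutes:.0f} minutes of LED light")

# Scaling up a heavy personal habit: 10 prompts a day for a year, in kWh:
yearly_kwh = PROMPT_WH * 10 * 365 / 1000
print(f"10 prompts/day for a year ~ {yearly_kwh:.1f} kWh")
```

Even the heavy-use scenario lands near one kilowatt-hour a year, which is why the aggregate, not the individual prompt, is where the real demand question lives.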

That doesn’t diminish the bigger picture. Aggregate use adds up. Left unchecked, the sector risks embedding a new layer of demand on already strained grids. But technology also carries its own remedy. AI is proving adept at trimming inefficiencies from the systems that matter most: light industry could cut energy use by 8% by 2035 with AI’s optimisation, while transport may see a 20% drop. In the lab, AI is doubling the pace of material discovery, nudging us towards breakthroughs in batteries and clean technologies.


“We need to stop thinking of AI as a rogue actor and start recognising it as part of the broader climate and energy system,” says the Bezos Earth Fund, which is investing in systemic upgrades. That means powering data centres with renewables, publishing transparent energy data, designing chips that continue the century-long efficiency race, and treating computing loads as flexibly as the grid itself: shifting demand to off-peak hours and re-using waste heat for district heating.
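The load-shifting idea is simple to express in code. A minimal sketch follows; the 22:00–06:00 off-peak window and all function names are hypothetical, and a real scheduler would consult live grid-carbon or price signals instead:

```python
from datetime import datetime, timedelta

# Hypothetical off-peak window: 22:00 to 06:00, when grid demand is low.
OFF_PEAK_START, OFF_PEAK_END = 22, 6

def is_off_peak(t: datetime) -> bool:
    """True if t falls inside the overnight off-peak window."""
    return t.hour >= OFF_PEAK_START or t.hour < OFF_PEAK_END

def next_run_time(now: datetime) -> datetime:
    """Run a deferrable job now if off-peak, else wait for the next 22:00."""
    if is_off_peak(now):
        return now
    start = now.replace(hour=OFF_PEAK_START, minute=0, second=0, microsecond=0)
    return start if now < start else start + timedelta(days=1)

# A deferrable training or batch job submitted mid-afternoon is held
# until the evening window opens:
submitted = datetime(2025, 6, 1, 15, 30)
print(next_run_time(submitted))  # 2025-06-01 22:00:00
```

The point is not the scheduling logic itself but the framing: flexible compute is a grid asset precisely because so much AI work, unlike a user-facing chat reply, does not need to run the moment it is submitted.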

The precedent is there. The widespread adoption of LED lighting helped flatten U.S. electricity demand in the 2010s, despite population growth and the proliferation of devices. AI could follow a similar arc—if designed with restraint and foresight.

The real question, then, isn’t whether AI should exist. It’s whether we are building it for the world we want to live in.
