The Ecological Gravedigger

When the International Energy Agency (IEA) publishes a report titled "Energy and AI", it should be read with at least mild interest - and a healthy dose of scepticism. Still, it's worth sitting down and reading it carefully. After all, in our field, words like "optimization" and "efficiency" are used in every possible grammatical form. Moreover, when there's a promise that something - artificial intelligence, in this case - not only speeds up operations but might even save the planet, it starts to sound like a revolution, or at least a good bedtime story.

The report generally paints a vision in which AI becomes the silent hero of the green transformation. It will predict energy consumption, increase system efficiency, optimize energy grids, and enable better integration of renewable energy sources. Artificial intelligence is supposed to drive efficiency, reduce emissions, and save the world from ecological disaster. It all sounds fantastic. The problem is that this beautiful story of a new ecological era needs to be read through the right filter.

Let’s start with the energy costs of AI itself. The document doesn’t exactly sweep this issue under the rug, but the tone in which it’s presented is telling… Energy consumption by data centres and AI models is rising, but thanks to technological progress, it "could" be more sustainable. That "could" is key. The fact is that training today’s LLMs and GenAI models produces emissions comparable to hundreds of transatlantic flights. In practice, this means that each new, "smarter" model generates more emissions than thousands of people - the kind who dutifully rinse their yoghurt lids before sorting their recycling - can hope to offset in a lifetime of everyday effort.
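To put that comparison in perspective, here is a back-of-envelope sketch in Python. The figures are rough, order-of-magnitude estimates circulating in public discussions (roughly 500 tCO₂e for a GPT-3-scale training run, about 1 tCO₂e per passenger for a transatlantic round trip), not numbers taken from the IEA report; the point is the ratio, not the precision.

```python
# Back-of-envelope: one large training run vs. transatlantic flights.
# All figures are rough, order-of-magnitude public estimates, not IEA data.

TRAINING_EMISSIONS_T = 500.0    # tCO2e, roughly a GPT-3-scale training run
FLIGHT_PER_PASSENGER_T = 1.0    # tCO2e, one transatlantic round trip per passenger

equivalent_flights = TRAINING_EMISSIONS_T / FLIGHT_PER_PASSENGER_T
print(f"One large training run ≈ {equivalent_flights:.0f} transatlantic round trips")
# -> One large training run ≈ 500 transatlantic round trips
```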

Secondly, let’s consider the so-called efficiency trap. AI can indeed optimize energy consumption - both in individual server rooms and on the scale of entire cities. However, the history of technology teaches us that gains in efficiency tend to be swallowed by growth in demand. This is known as the Jevons paradox. Processing data becomes cheaper? Let’s process more. Servers get cheaper? Let’s build bigger data farms. Models get faster? Let’s run more queries. Why would it be any different this time? The report doesn’t answer that question - instead, it speaks a lot about "potential".
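A toy calculation makes the trap concrete. The numbers below are invented purely for illustration: even if a new model generation halves the energy per query, a tripling of query volume wipes out the gain.

```python
# Jevons paradox, toy version: per-unit efficiency improves,
# total consumption still grows. All numbers are invented for illustration.

energy_per_query_wh = {"old model": 3.0, "new model": 1.5}   # 2x more efficient
queries_per_day     = {"old model": 1e9, "new model": 3e9}   # but 3x more demand

for model, wh in energy_per_query_wh.items():
    total_mwh = wh * queries_per_day[model] / 1e6  # Wh -> MWh
    print(f"{model}: {total_mwh:,.0f} MWh/day")

# old model: 3,000 MWh/day
# new model: 4,500 MWh/day  <- "more efficient", yet 50% more energy overall
```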

Thirdly, my favourite topic: responsibility - or rather, the lack thereof. The report rightly points out that AI development must be "sustainable" and "ethical". But few consider how to reconcile that noble vision with the brutal market reality. Can we expect companies investing billions into ever-larger models to voluntarily limit their size and energy consumption?

Of course, there are green AI initiatives, “smaller is better” type projects, model distillation techniques, and attempts to create algorithms more aware of their energy costs. But these efforts are niche, often academic. The main development trajectory is still bigger models, more data, and more GPUs - preferably in the cloud, so we can emotionally distance ourselves from the problem taking place "somewhere out there" in a server room. More coal, more nuclear… more of everything!
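For readers who haven’t met model distillation - one of those niche efforts mentioned above - the idea is to train a small "student" network to imitate the softened output distribution of a large "teacher", so that much of the capability survives at a fraction of the inference cost. A minimal sketch of the core loss in PyTorch, illustrative only; real pipelines add a hard-label term, temperature scheduling and much more:

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    """KL divergence between softened teacher and student distributions."""
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    log_student = F.log_softmax(student_logits / temperature, dim=-1)
    # Scale by T^2 so gradient magnitudes stay comparable across temperatures.
    return F.kl_div(log_student, soft_teacher, reduction="batchmean") * temperature**2

# Example: a batch of 4 samples over 10 classes (random stand-in logits).
teacher_logits = torch.randn(4, 10)                         # frozen large model
student_logits = torch.randn(4, 10, requires_grad=True)     # small model being trained
loss = distillation_loss(student_logits, teacher_logits)
loss.backward()
```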

An interesting part of the report highlights the role of AI within the energy sector itself: forecasting demand, managing microgrids, and integrating renewable sources. Here, AI can genuinely do a lot of good, because managing the variable output of renewables is a mathematically complex problem, and learning algorithms are well suited to it. The issue is that the benefits in this area can easily be offset by the rapid increase in energy consumption elsewhere - for example, in the endless generation of synthetic content, like Studio Ghibli-style images or IT sales rep action figures… which no one truly needs in the end.
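To make "forecasting demand" less abstract, here is a deliberately simplified sketch: a gradient-boosted model trained on the previous 24 hours of load to predict the next hour. The data is synthetic and the setup is nothing like a production grid model; it only illustrates why learning algorithms fit the problem - demand has strong, learnable daily structure plus noise.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic hourly load: daily sinusoidal pattern + noise (stand-in for real grid data).
hours = np.arange(24 * 365)
load = 100 + 30 * np.sin(2 * np.pi * hours / 24) + rng.normal(0, 5, hours.size)

# Features: the previous 24 hourly values; target: the next hour's load.
X = np.stack([load[i : i + 24] for i in range(load.size - 24)])
y = load[24:]

split = int(0.8 * len(y))
model = GradientBoostingRegressor().fit(X[:split], y[:split])
preds = model.predict(X[split:])

print(f"MAE on held-out hours: {mean_absolute_error(y[split:], preds):.2f} MW")
```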

The IEA rightly notes that regulatory policy must keep pace with AI development. But as always, the devil is in the details. Who will set these regulations? How do we force companies to measure and report their models’ carbon footprints? How do we limit the race toward ever larger neural networks when the market rewards exactly that scale? For now, judging by industry dynamics, it’s like tilting at windmills.

In conclusion - artificial intelligence can help save the planet. But it won’t do so automatically, and certainly not if it is developed solely according to market logic - “more power, larger models, faster inference.” The hopes tied to AI are real, provided we stop treating it as a miracle cure and start seeing it as a tool that can be used well or poorly. The green digital revolution won’t happen just because reports say it will. It will come about when those designing and implementing AI solutions - including us, the IT people - begin to genuinely weigh our decisions not just in gigaflops and dollars, but also in tons of CO₂ emissions. If we don’t, artificial intelligence may well go down in history not as an ecological saviour but as an accomplice to accelerated climate suicide.