02.07.2025

Summer Blog - Who’s Afraid of AI? Bias Confirmation on New Technology’s Energy Use

During the summer, the Knowledge Centre Data & Society offers a platform for partner organisations. In this opinion piece, Professor Marlen Komorowski takes a closer look at the numbers and context surrounding the environmental footprint of AI. She argues that AI's environmental footprint matters, but that guilt is often placed on individual users instead of on the systemic changes that are actually needed.

Public discourse around artificial intelligence (AI) is increasingly focused on one alarmist theme: its energy consumption. Headlines warn of AI “guzzling electricity like there’s no tomorrow,” conjuring images of data centers overheating the planet. 

"AI already uses as much energy as a small country." (Vox, 2024) 

 

"Generating an image using a powerful AI model takes as much energy as fully charging your smartphone." (MIT Technology Review, 2024) 

 

"A single ChatGPT prompt consumes nearly 10 times more energy than a Google search." (IEA, 2023) 

Reading these headlines, it's no wonder environmentally conscious users feel uneasy, even guilty, about engaging with these tools. But these factoids, while often rooted in reality, typically:

  • Leave out the wider energy consumption context;
  • Ignore efficiency improvements;
  • Overlook the share of renewables;
  • And refer to peak scenarios (e.g., training the largest models).

Such headlines feed into a powerful cognitive bias: if something seems too good to be true, there must be a catch. The risk? Environmentally conscious users may shy away from using a technology that has tremendous potential to improve efficiency, productivity, and even sustainability across sectors. 

Echoes of Past Tech Backlashes

If this narrative of “New Tech X is an energy hog, avoid it to save the planet” sounds familiar, it’s because we’ve seen similar waves of alarm with previous innovations. 

In the early 2010s, data centers themselves faced dire predictions. Some studies projected data centers would gobble double-digit percentages of global power by the 2020s. There were front-page stories warning that “the Internet is on track to cause as much CO2 as the airline industry.”  Sound familiar? In reality, those worst-case scenarios didn’t materialize – thanks to huge improvements in efficiency (and a lot of renewable energy, as we’ll see below). 

Blockchain and cryptocurrency are a prime example. A few years ago, headlines abounded about Bitcoin’s colossal electricity usage – sometimes rightly so. At its peak, the Bitcoin network consumed on the order of 100–120 TWh per year, roughly 0.4–0.6% of global electricity (comparable to a small country!). The criticism of Bitcoin’s proof-of-work mechanism was valid, and it spurred change: notably, Ethereum (the second-largest cryptocurrency) switched its design in 2022, cutting its electricity demand by >99%. But in the popular discourse, that nuance often got lost. All blockchain got tarred with the same brush. 

Fear of a tech-powered energy apocalypse has a way of lingering in the public imagination.

But history suggests that early fear-based narratives often oversimplify. Yes, Bitcoin uses a lot of electricity, but that doesn't mean all digital ledgers do. Yes, some AI models are energy-intensive, but that doesn't mean all AI usage will overwhelm the grid.

Importantly, fear can swing public behavior. During the height of the crypto backlash, some environmentally conscious firms avoided blockchain projects altogether, even where those projects could have added value (or used efficient alternatives), simply because of the negative perception. There's a concern that, similarly, AI could be shunned by some on environmental grounds – even in cases where it could increase efficiency or help solve climate challenges (for example, AI optimizing energy grids). We should be careful not to let headline hysteria overshadow the nuanced reality.

AI’s Energy Use in Context – the Full Picture

Let’s step back and look at hard data on data centers and AI in the broader context of global energy use. First, how big is the footprint of all data centers (the backbone of cloud computing and AI) relative to world electricity consumption? According to the International Energy Agency: 

Data centers (including AI) account "only" for roughly 1–1.5% of global electricity use. 
 

This is about 300–500 TWh out of roughly 26,000 TWh worldwide. That's not trivial, but it's far from a dominant share. For comparison, residential air conditioning and cooling devices used about 2,020 TWh in 2016 (over 4× all data centers). Put another way, the energy used to cool our homes is several times larger than the energy used to run all the servers powering the internet, AI included. And unlike data center energy (which has been growing only modestly), cooling demand is soaring: the IEA projects that by 2035, air conditioning will require an extra 700 TWh globally.
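As a quick back-of-the-envelope check, the shares quoted above can be recomputed from the raw TWh figures. (All values are the approximate, IEA-based numbers cited in this article, not precise measurements.)

```python
# Rough check of the shares quoted above. Figures are TWh per year,
# taken from the article's approximate IEA-based numbers.
GLOBAL_ELECTRICITY = 26_000                       # total world electricity use
DATA_CENTERS_LOW, DATA_CENTERS_HIGH = 300, 500    # all data centers, AI included
RESIDENTIAL_COOLING = 2_020                       # residential air conditioning (2016)

share_low = DATA_CENTERS_LOW / GLOBAL_ELECTRICITY * 100    # ~1.2%
share_high = DATA_CENTERS_HIGH / GLOBAL_ELECTRICITY * 100  # ~1.9%
cooling_ratio = RESIDENTIAL_COOLING / DATA_CENTERS_HIGH    # ~4x the upper estimate

print(f"Data centers: {share_low:.1f}%–{share_high:.1f}% of global electricity")
print(f"Residential cooling: {cooling_ratio:.1f}× the upper data-center estimate")
```

Even at the high end of the estimate, data centers stay under 2% of global electricity, while cooling alone is four times the upper bound.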

Comparisons That Provide Perspective

Beyond data centers' low overall share of energy consumption, it is also helpful to compare which online activities actually consume the most energy.

[Table: energy consumption of common internet activities, comparing video streaming and cloud gaming with AI queries]

As this table shows, video streaming, which accounts for over 60% of internet traffic, consumes far more energy than AI queries. But the public rarely questions the carbon footprint of watching Netflix or YouTube. In fact, AI currently makes up only a fraction of total internet-related energy use. A few AI prompts or image generations consume dramatically less electricity than activities such as streaming video or cloud gaming. And while AI demand is growing, its relative share remains modest. Without this broader comparison, it's easy to overestimate AI's footprint and underestimate the impact of more commonplace digital behaviors, though in my opinion such comparisons should not be used to argue for less digital usage overall.

The Efficiency Curve: Why Technology Uses Less Energy Over Time

Another critical piece of context is how efficient digital technology has become. The reason data center energy has stayed at roughly 1% of world power for the past decade, even as internet traffic exploded 20-fold, is continuous efficiency gains. Hardware improvements in particular are astonishing: the energy needed per computation has plummeted year after year. In the realm of AI, specialised chips (GPUs, TPUs, etc.) have dramatically improved performance-per-watt. In fact, a modern AI accelerator chip in 2023 uses 99% less energy to perform the same number of computations as a chip from 15 years ago – an almost 100× efficiency improvement since 2008.
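It is worth noting that "99% less energy per computation" and "a 100× efficiency gain" are the same claim expressed two ways, as a couple of lines of arithmetic confirm (the 99% figure is the one cited above):

```python
# "99% less energy per computation" restated as performance-per-watt.
# Normalized units: the 2008 chip spends 1.0 unit of energy per computation.
energy_2008 = 1.0
energy_2023 = energy_2008 * (1 - 0.99)  # 99% less energy for the same work

# Computations per unit of energy is the reciprocal of energy per computation.
gain = energy_2008 / energy_2023
print(f"Performance-per-watt improvement: ~{round(gain)}x")
# prints "Performance-per-watt improvement: ~100x"
```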

“More Isn’t Always More: 550% Surge in Data Centers, Only 6% Rise in Energy Use” 


Thanks to such gains, the overall energy trend for data centers has been surprisingly flat over the last decade, considering the surge in digital services. In other words, more computing output does not necessarily mean more emissions. Of course, the recent AI boom is adding new demand – we are seeing an uptick as companies race to build AI capacity. Projections vary on how steep the growth will be. The IEA's latest estimates suggest data center electricity use could double by 2030 in a high-demand scenario, with AI as a key driver. Even so, that would put data centers at around 3–4% of global power (still far below, say, the industrial sector or transportation). The key point is that AI's energy trajectory is not set in stone: it depends on policy, efficiency tech, and usage patterns.

The share of renewables: greening AI

It’s worth remembering: using more energy doesn’t automatically mean causing more harm to the environment. What matters most is where that energy comes from — and in the case of AI, the growing share of renewables is the real headline. Many hyperscale AI data centers are already running on a high share of renewables. Large cloud providers and AI companies have been among the biggest corporate buyers of renewable energy worldwide. 

According to the IEA, Amazon, Microsoft, Meta, and Google are the four largest corporate purchasers of renewable power, together contracting almost 50 GW of clean energy capacity – equivalent to the entire generation capacity of Sweden! By 2021, Google, Meta, Microsoft, and Apple had each purchased or produced enough renewable electricity to match 100% of their data center operations' needs. Amazon was slightly behind, at 85% in 2021 (30.9 TWh of renewable supply), with a goal of 100% by 2025. In other words, AI's big players are largely running on green power, not coal.

The graphic below shows the share of renewable energy in data centers as of 2018; that share has improved drastically since then.

[Chart: share of renewable energy in data centers, 2018]

Europe in particular has taken a leadership role in greening data center operations. The European Union supports a Climate Neutral Data Centre Pact – an industry pledge to achieve net-zero emissions for data centers by 2030. This pact includes measurable targets for energy efficiency and a commitment to 100% carbon-free energy purchasing within the next few years. In practical terms, a data center in Scandinavia might be powered largely by hydroelectricity or wind, and even ones in fossil-heavy grids often purchase renewable credits or build solar/wind farms to offset their consumption. 

All these factors put AI’s energy use in context: yes, AI training and inference consume significant electricity, but relative to other sectors, it’s still a small piece of the pie – and a piece that is actively being decarbonized and made more efficient each year. 

Shifting Blame vs. Solving the Problem

Why is it important to counter these fear-driven narratives? Because misplaced focus on individual use of technology can divert us from the bigger picture and from real solutions.

There’s a well-known playbook in corporate PR: when facing criticism over environmental harm, shift the attention to consumer behavior. The classic example is the “Crying Indian” public service announcement from 1971, funded by beverage and packaging companies. That ad showed a tearful Indigenous man lamenting pollution and famously stated, “People start pollution, people can stop it.” The message, while encouraging personal responsibility, subtly blamed individual litterbugs instead of the corporations producing single-use plastics and bottles. It deflected pressure away from industry (no talk of requiring recyclable packaging or producer take-back schemes) and onto consumers’ consciences. Big Oil applied a similar tactic in the climate arena – as highlighted by a New York Times piece titled “Worrying About Your Carbon Footprint Is Exactly What Big Oil Wants You to Do.” In the mid-2000s, BP popularized the concept of the “carbon footprint” calculator, implicitly telling us that climate change is due to our individual choices (driving, flying, heating). Meanwhile, the fossil fuel companies hoped to avoid scrutiny of their own massive contributions. 

We should ask: Is something analogous happening with AI and its energy story? It might not be an intentional campaign by any industry, but the effect is similar. The narrative that "AI is terribly energy-inefficient – think twice before using it" places the burden of guilt on end users. It implies that if you care about the climate, maybe you should stick to traditional (less AI-driven) software, or not use AI features too often. But in reality, whether AI's growth will be sustainable is a question of infrastructure, innovation, and policy, not individual abstinence. If ChatGPT (or any AI service) is run in a data center powered 100% by wind energy, then a million queries have effectively zero carbon emissions – far different from the same queries in a coal-powered data center. The focus should thus be on greening the compute, not halting the computation.

Fear-driven discourse can also become a convenient scapegoat for the major sources of emissions. It’s easier, for instance, for a government or company to blame rising electricity demand on trendy things like AI or crypto than to confront the less sexy truth that, say, inefficient industrial processes are the bigger culprits. 

To be clear, this is not to absolve AI developers, data center operators, and policy makers of responsibility – on the contrary, they bear the greatest responsibility for ensuring these services are efficient and clean.

But as we’ve seen, many are stepping up, investing in renewables and new cooling tech, because energy costs money. The solution to AI’s energy challenge lies in innovation and policy: continued chip improvements, smarter algorithms that achieve the same results with less computation, waste-heat recycling in data centers, and strong government incentives for renewable energy. 

There’s also a flipside risk to the fear narrative: missing out on the positive environmental applications of AI. AI is not just an energy consumer; it’s also a powerful tool for efficiency. Machine learning models are being used to optimize energy grids (balancing load with renewable supply), to improve building energy management, to accelerate climate research, and to design better materials for energy storage – all reducing emissions in other sectors. The IEA has highlighted that AI can “transform energy systems,” from enhancing renewable integration to speeding up clean energy innovation. 

Conclusion: Productivity, Climate, and a Balanced Path Forward

In the end, being environmentally conscious in the age of AI means demanding facts and solutions, not shunning technology out of fear.

So, the next time you see a headline equating ChatGPT with a private jet in terms of carbon footprint, take a pause. Recognize the importance of scale and context: Who powered that computation, and with what energy? 

Rather than letting fear deter us from using AI where it can add value, we should focus on making AI part of the climate solution. This includes supporting policies for clean energy, rewarding companies that are transparent and sustainable, and yes, using AI in smart ways to increase productivity and efficiency in our lives. 

Technology and sustainability are not zero-sum enemies – provided we guide innovation with foresight. As I have argued, fear-based narratives that put the onus on individual restraint (“don’t use AI to save the planet!”) are often distraction tactics that let bigger players off the hook. Let’s not fall for it. By all means, stay informed and demand better from tech companies, but don’t throw away the immense positive potential of AI due to overhyped environmental fears. 

We can embrace impactful, productivity-enhancing tech like AI while also being good stewards of the environment – in fact, through efficiency and clean energy, we must do exactly that. 

About

This article was written with the assistance of ChatGPT 4o Deep Research; sources and information have been fact-checked.

Photo by Google DeepMind on Unsplash

About the author

Marlen Komorowski is a Professor and Senior Researcher at imec-SMIT, Vrije Universiteit Brussel. 


Disclaimer

The Knowledge Centre Data & Society is not responsible for the content of the blog posts that appear in our 'Summer' blog series and therefore will not correspond about the content. In case of questions, comments or concerns: contact the author(s)!
