Why is it important to counter these fear-driven narratives? Because misplaced focus on individual use of technology can divert us from the bigger picture and real solutions.
There’s a well-known playbook in corporate PR: when facing criticism over environmental harm, shift the attention to consumer behavior. The classic example is the “Crying Indian” public service announcement from 1971, funded by beverage and packaging companies. That ad showed a tearful Indigenous man lamenting pollution and famously stated, “People start pollution, people can stop it.” The message, while encouraging personal responsibility, subtly blamed individual litterbugs instead of the corporations producing single-use plastics and bottles. It deflected pressure away from industry (no talk of requiring recyclable packaging or producer take-back schemes) and onto consumers’ consciences. Big Oil applied a similar tactic in the climate arena – as highlighted by a New York Times piece titled “Worrying About Your Carbon Footprint Is Exactly What Big Oil Wants You to Do.” In the mid-2000s, BP popularized the concept of the “carbon footprint” calculator, implicitly telling us that climate change is due to our individual choices (driving, flying, heating). Meanwhile, the fossil fuel companies hoped to avoid scrutiny of their own massive contributions.
We should ask: Is something analogous happening with AI and its energy story? It may not be an intentional campaign by any industry, but the effect is similar. The narrative that “AI is terribly energy-inefficient – think twice before using it” places the burden of guilt on end users. It implies that if you care about the climate, you should stick to traditional (less AI-driven) software, or avoid using AI features too often. But in reality, whether AI’s growth will be sustainable is a question of infrastructure, innovation, and policy, not individual abstinence. If ChatGPT (or any AI service) runs in a data center powered entirely by wind energy, a million queries produce effectively zero operational carbon emissions – a far cry from the same queries served by a coal-powered data center. The focus, then, should be on greening the compute, not halting the computation.
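The point that the grid matters far more than the query can be made concrete with back-of-the-envelope arithmetic. The per-query energy figure below is an illustrative assumption (not a measurement of ChatGPT or any specific service), and the grid carbon intensities are ballpark lifecycle estimates in the range commonly reported for wind and coal generation:

```python
# Sketch: operational CO2 from AI queries depends almost entirely on the
# carbon intensity of the grid powering the data center.

WH_PER_QUERY = 0.3   # assumed energy per query in watt-hours (illustrative)
QUERIES = 1_000_000

# Ballpark lifecycle carbon intensities, in grams CO2 per kWh.
GRID_INTENSITY = {
    "wind": 11,      # rough lifecycle estimate for wind power
    "coal": 820,     # rough lifecycle estimate for coal power
}

def emissions_kg(queries: int, wh_per_query: float, g_co2_per_kwh: float) -> float:
    """Total CO2 in kilograms for a batch of queries on a given grid."""
    kwh = queries * wh_per_query / 1000   # Wh -> kWh
    return kwh * g_co2_per_kwh / 1000     # g -> kg

for grid, intensity in GRID_INTENSITY.items():
    print(f"{grid}: {emissions_kg(QUERIES, WH_PER_QUERY, intensity):.1f} kg CO2")
```

Under these assumptions, a million queries consume about 300 kWh, yielding roughly 3 kg of CO2 on a wind-powered grid versus roughly 250 kg on a coal-powered one – the same computation, two orders of magnitude apart in emissions, with the user's behavior unchanged.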
Fear-driven discourse can also become a convenient scapegoat for the major sources of emissions. It’s easier, for instance, for a government or company to blame rising electricity demand on trendy things like AI or crypto than to confront the less sexy truth that, say, inefficient industrial processes are the bigger culprits.
To be clear, this is not to absolve AI developers, data center operators, and policymakers of responsibility – on the contrary, they bear the greatest responsibility for ensuring these services are efficient and clean.
But as we’ve seen, many are stepping up, investing in renewables and new cooling tech, because energy costs money. The solution to AI’s energy challenge lies in innovation and policy: continued chip improvements, smarter algorithms that achieve the same results with less computation, waste-heat recycling in data centers, and strong government incentives for renewable energy.
There’s also a flipside risk to the fear narrative: missing out on the positive environmental applications of AI. AI is not just an energy consumer; it’s also a powerful tool for efficiency. Machine learning models are being used to optimize energy grids (balancing load with renewable supply), to improve building energy management, to accelerate climate research, and to design better materials for energy storage – all reducing emissions in other sectors. The IEA has highlighted that AI can “transform energy systems,” from enhancing renewable integration to speeding up clean energy innovation.