
Academics should not feel guilty about AI use’s environmental impact

Let’s focus concern about AI’s energy and water use where it can make the biggest difference and demand accountability from developers, says Cal Innes
July 31, 2025
Smoke from power station chimneys, illustrating the environmental impact of AI
Source: SteveAllenPhoto/iStock

Lately, I’ve been hearing the same anxious question again and again from university staff: “Is my use of AI contributing to climate change?”

Tools like ChatGPT, Copilot and Gemini are being widely used to get through mounting workloads. Academics are using AI to prepare teaching materials. Professional services staff are using it to manage their inboxes. Researchers are using it to summarise journal articles. Yet the growing wave of headlines about AI’s energy demands and water consumption is making some wonder whether all that labour-saving comes at too high an environmental cost.

I work in digital sustainability for Jisc, a non-profit organisation supporting UK further and higher education. My job is to help colleges and universities understand the environmental impacts of digital tools – the good, the bad and the inconveniently complex.

Let’s start with the good. When used wisely, digital technology can reduce environmental impact. Cloud-based systems, when thoughtfully deployed, are often more energy-efficient than ageing on-site servers. Online collaboration tools can cut unnecessary travel. Smart systems for heating, cooling and lighting have helped institutions measurably reduce energy waste.


But here’s the uncomfortable reality: our digital lives come with a rising environmental footprint. Every streamed video, email or notification consumes electricity. The recent boom in AI-powered tools adds a new layer to this, embedding additional high-energy processes into everyday routines and placing increased pressure on the infrastructure that powers them.

But this is where the guilt starts to get misplaced.


An academic using ChatGPT to structure a lecture outline or refine a coursework brief does consume energy. One commonly cited estimate puts it at about 3 watt-hours per query – though experts caution that such figures are based only on rough estimates, largely due to a lack of transparency in the data provided by AI companies. Still, in the context of everyday digital activity, the energy use is likely modest – certainly no more than the energy used for a short video call or a few minutes of video streaming. For reference, it takes far more energy to boil a full kettle.
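To make that kettle comparison concrete, here is a back-of-envelope sketch. It assumes the roughly 3 watt-hour per query figure cited above (itself only a rough estimate) and a hypothetical 1.5-litre kettle heated from 20°C to 100°C with no heat losses – the physics of heating water, not any measured data from an AI provider.

```python
# Back-of-envelope comparison: AI queries vs. boiling a kettle.
# Assumptions (not measured figures): ~3 Wh per query, a 1.5-litre
# kettle, water heated from 20 C to 100 C, 100% heating efficiency.

QUERY_WH = 3.0        # commonly cited rough estimate, watt-hours per query
SPECIFIC_HEAT = 4186  # joules per kg per degree C, for water
MASS_KG = 1.5         # 1.5 litres of water is about 1.5 kg
DELTA_T = 80          # degrees C, heating from 20 C to 100 C

# Energy to boil the kettle, in joules, then converted to watt-hours.
kettle_joules = MASS_KG * SPECIFIC_HEAT * DELTA_T
kettle_wh = kettle_joules / 3600  # 1 Wh = 3600 J

print(f"One kettle boil: about {kettle_wh:.0f} Wh")
print(f"Queries per kettle boil: about {kettle_wh / QUERY_WH:.0f}")
```

On these assumptions a single kettle boil comes out at roughly 140 Wh – on the order of 40 to 50 queries – which is the scale of difference the paragraph above is gesturing at.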

Let me be clear: this isn’t whataboutism, and it’s not about saying “AI is fine because Netflix and Nespresso exist.” I’m not suggesting that individual use of generative AI tools is environmentally irrelevant. But we do need to weigh those tools’ impact proportionally. If we’re not routinely questioning the footprint of watching YouTube videos, doomscrolling social media, or leaving webcams on in online conferences – all of which rely on the same energy-intensive infrastructure – then is it fair to single out those using AI tools simply to manage an overwhelming workload?

And the pressure on those in higher education is very real. A recent sector-wide survey found that less than half of academic staff felt able to comfortably manage their workload, and just 44 per cent felt that their well-being was adequately supported at work. In that context, turning to AI tools to save time isn’t indulgent – it’s a practical response to relentless pressure.

This is where the concept of green shifting comes in – a cousin of greenwashing. Instead of exaggerating an organisation’s environmental claims, green shifting subtly redirects responsibility downwards: from corporations to consumers, from systems to individuals. When the climate burden lands on the shoulders of the individuals using AI – rather than the tech giants building the infrastructure or the governments regulating it – our focus is in the wrong place.


We should absolutely be concerned about AI’s energy and water use – but let’s focus our concern where it can make the biggest difference. That means demanding accountability from AI developers: transparent energy reporting, cleaner infrastructure and real investment in renewable power. It means governments stepping up with regulation, standards and incentives for greener data centre operations.

It’s also worth recognising that not all AI is created equal. Generative chatbots like ChatGPT represent only a small fraction of total AI-related energy use. The real heavy energy consumption is happening elsewhere: in video generation and analysis, targeted advertising, recommendation engines and training new AI models – not just chatbots but also the large, general-purpose AI systems that are on the horizon.

The substantial environmental threat, then, isn’t the individual user relying on ChatGPT to summarise meeting notes or using Copilot to draft an email. It’s the unchecked expansion of opaque infrastructure, the systemic absence of emissions data and the lack of guard rails to ensure sustainable deployment.

Individual awareness is good. It helps drive informed choices, build pressure for change and hold institutions to account. But individual blame? That misses the point entirely. Guilt won’t fix the problem – but regulation, transparency and system-level accountability just might.


Cal Innes is a digital sustainability specialist at Jisc.

Reader's comments (5)
"Feel guilty"? How can you miss all relevant points? What does "feeling guilty? have to do with anything? All academics [sic] have multiple responsibilities including to the environment! We cannot ignore realties. Or responsibilities. Or .....
"Lately, I’ve been hearing the same anxious question again and again from university staff: “Is my use of AI contributing to climate change?” Funnily enough I have not heard this "anxious" question posed by anyone, let alone "from university staff". I also doubt that the person who wrote this has heard such questioning either. Maybe we can push this one onto to the students, they are supposed to be environmentally conscious these days.
Perhaps it is due to the nature of my role, but I’ve actually had quite a few of these conversations - at talks, workshops, and in more informal chats with university and college staff who have spoken about the climate anxiety they are feeling around their use of AI and tech in general. The point wasn’t to suggest everyone’s worried - just that it’s something I’ve genuinely heard, and that we need to be careful not to shift responsibility away from the systems and infrastructure where the real change needs to happen.
Well yes that may be so if your actual job is "to help colleges and universities understand the environmental impacts of digital tools – the good, the bad and the inconveniently complex". But I would submit to you that this is rather a niche role and, if I am being honest, not the most urgent issue on the minds of academic colleagues who are rather more concerned about whether or not they will have a job next year. KBO.
This response simply does not follow from the essay itself. Why?