Are we ignoring the environmental cost of AI because the outputs are so convenient?
I don’t see this discussed enough outside technical circles.
Everyone talks about what AI can do, but not nearly enough about what it costs to run at scale: massive training runs, constant inference, data centres, cooling, hardware churn, growing power demand. It all adds up.
I’m not trying to be dramatic or anti-tech. I’m genuinely trying to understand the trade-off.
A few things I’d love clarity on:
- How energy-intensive is model training really?
- Is inference at a global scale becoming a bigger issue than training itself?
- Are newer models becoming more efficient, or are gains just being eaten by larger usage?
- And do users or companies actually have enough visibility into this to make informed decisions?
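On the first bullet, the rough arithmetic I've been using to sanity-check claims is just GPU count × per-GPU power × training hours × datacentre PUE (power usage effectiveness, the overhead multiplier for cooling etc.). Every number below is a hypothetical illustration, not a measured figure for any real model:

```python
# Back-of-envelope training-energy estimate.
# All inputs are illustrative assumptions, not real figures for any model.

def training_energy_mwh(num_gpus: int, gpu_watts: float,
                        hours: float, pue: float = 1.2) -> float:
    """Estimate total facility energy in MWh: IT load scaled by PUE."""
    it_energy_wh = num_gpus * gpu_watts * hours  # watt-hours at the GPUs
    return it_energy_wh * pue / 1_000_000        # apply PUE, convert Wh -> MWh

# Hypothetical run: 10,000 GPUs at 700 W each for 90 days (2,160 h), PUE 1.2
print(training_energy_mwh(10_000, 700, 2_160))  # estimated MWh
```

Even with made-up inputs, the exercise shows why I'd like real disclosure: the answer swings by orders of magnitude depending on cluster size and duration, which are exactly the numbers companies rarely publish.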
I work in sustainability, so maybe I’m more tuned into this than the average user, but I’m noticing a weird disconnect: people are happy to criticise crypto or aviation on environmental grounds, yet AI often gets a pass because it feels productive and futuristic.
Would love any informed takes, good resources, or even counterarguments if you think the concern is overstated.
SuzanPegs