AI has a real environmental footprint — but it can also be the most powerful tool against the climate crisis. The data is more nuanced than it appears.
According to BCG and Google, every tonne of CO₂ emitted by AI could avoid between 3 and 10 tonnes in other sectors. The ratio depends on how and where it's applied.
AI will emit somewhere between ~100 and ~500 MtCO₂ by 2030, but could avoid between 2.6 and 5.3 Gt — a net ratio of at least x4 to x10, and far higher in the optimistic scenarios. That potential is already backed by real cases: DeepMind reduced data center cooling consumption by 40%, and wind prediction models increase the value of renewable energy by 20%. The ratio only materializes under three conditions: application in high-impact sectors, clean energy for the AI systems themselves, and governance that prevents the rebound effect.
[Chart: estimated emissions reduction from applying AI, by sector, through 2030 — electrical energy leads with 1,200-1,800 MtCO₂]

Key figures:

- +20%: additional value of wind energy with AI prediction (DeepMind)
- 40%: reduction in cooling energy in Google's data centers
- x4: ratio of avoided emissions to AI's own emissions (BCG estimate)
- 345 MtCO₂: reduction already documented in the energy sector in 2024
Google's data centers drove the company's CO₂ emissions 48% higher in 2023 than in 2019.
That’s a real figure. It’s in their own sustainability report. And it’s the kind of number that fuels headlines about AI as a climate threat.
But there’s another side to that equation — one that almost never appears in the same article.
In 2024, BCG published one of the most rigorous analyses of the net climate impact of artificial intelligence. The conclusion was striking in its magnitude: AI could reduce between 2.6 and 5.3 gigatonnes of CO₂ annually by 2030.
How much will AI itself emit by that year? The most optimistic estimates point to ~100 megatonnes; the most pessimistic, 300-500 Mt if the growth in compute demand is not matched by renewable energy.
Let’s do the math:
| Scenario | AI Emissions (2030) | Potential Reduction | Net Ratio |
|---|---|---|---|
| Pessimistic | 500 MtCO₂ | 2,600 MtCO₂ | x5 |
| Central | 300 MtCO₂ | 3,900 MtCO₂ | x13 |
| Optimistic | 100 MtCO₂ | 5,300 MtCO₂ | x53 |
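The ratios are easy to sanity-check with a minimal script (the figures are the ones from the table above):

```python
# Sanity check of the net-impact ratios in the table above.
scenarios = {
    "pessimistic": {"ai_mt": 500, "avoided_mt": 2_600},
    "central":     {"ai_mt": 300, "avoided_mt": 3_900},
    "optimistic":  {"ai_mt": 100, "avoided_mt": 5_300},
}

for name, s in scenarios.items():
    ratio = s["avoided_mt"] / s["ai_mt"]
    net = s["avoided_mt"] - s["ai_mt"]
    print(f"{name:>11}: ratio x{ratio:.0f}, net {net:,} MtCO2 avoided")
```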
The ratio is not x4 or x10 in absolute terms: it depends critically on where the AI runs and what it’s applied to.
That’s the sentence missing from most debates.
The energy sector accounts for between 1,200 and 1,800 MtCO₂ in potential annual reduction, according to estimates from BCG and studies in Nature Climate Change. It is, by far, the category with the greatest leverage.
And this is no longer just projection; there are measured results.
In 2016, DeepMind trained a reinforcement learning system to control the cooling of Google’s data centers. The system took hundreds of variables — temperature, pressure, flow rates, weather — and learned to optimize them in real time.
Result: 40% reduction in cooling energy consumption.
This is not a marketing estimate: it is a result Google has reported publicly and kept in production ever since. Cooling typically accounts for 30% to 40% of a data center's total consumption, so a 40% cut in cooling works out to roughly a 15% reduction in the facility's overall consumption.
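To make the mechanics concrete, here is a minimal sketch of the kind of closed loop such a system runs: read telemetry, query a learned model for the cheapest safe setpoint, apply it. The toy surrogate model, the safety bounds, and the variable names are illustrative assumptions, not DeepMind's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for a learned model that predicts cooling energy from a
# control setpoint and ambient conditions. In the real system this is a
# neural network trained on historical plant telemetry.
def predicted_cooling_kw(setpoint_c: float, ambient_c: float) -> float:
    return 50.0 + 8.0 * max(ambient_c - setpoint_c, 0.0) ** 1.3

SAFE_MIN_C, SAFE_MAX_C = 18.0, 27.0  # hard operational bounds (assumed)

def choose_setpoint(ambient_c: float) -> float:
    # Search the safe range for the lowest predicted cooling energy.
    candidates = np.linspace(SAFE_MIN_C, SAFE_MAX_C, 50)
    costs = [predicted_cooling_kw(c, ambient_c) for c in candidates]
    return float(candidates[int(np.argmin(costs))])

# One step of the loop per hour: read telemetry, pick a setpoint, apply.
for hour in range(3):
    ambient = 22.0 + 6.0 * rng.random()      # stand-in for a weather feed
    sp = choose_setpoint(ambient)
    print(f"hour {hour}: ambient {ambient:.1f}C -> setpoint {sp:.1f}C, "
          f"predicted load {predicted_cooling_kw(sp, ambient):.0f} kW")
```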
The fundamental problem with wind energy isn’t generating it — it’s predicting it. A turbine that produces variable energy has much less market value than one that can commit to delivering a specific volume within a specific time window.
DeepMind developed a prediction model for 700 MW of wind capacity in the U.S. The system predicts production 36 hours in advance, allowing operators to commit to firm delivery contracts.
Result: +20% economic value for the same energy generated. No additional panels, no additional turbines. Just better-managed information.
The implication is direct: if wind energy is worth more, more wind energy gets built. AI as a profitability multiplier for renewables has systemic effects that go beyond the individual use case.
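As a stylized illustration of where that extra value comes from, the sketch below compares revenue from as-available spot sales against revenue from firm commitments made 36 hours ahead on the basis of a forecast. The prices, the under-delivery penalty, and the 10% forecast error are illustrative assumptions, not figures from DeepMind's system.

```python
import numpy as np

rng = np.random.default_rng(1)

HOURS = 24 * 30                  # one month, hourly resolution
CAPACITY_MW = 700.0
actual = CAPACITY_MW * rng.beta(2, 3, HOURS)     # toy wind output

FIRM_PRICE = 50.0     # $/MWh for energy committed 36 h ahead (assumed)
SPOT_PRICE = 38.0     # $/MWh for uncommitted, as-available energy (assumed)
PENALTY = 20.0        # $/MWh charged on under-delivery (assumed)
FORECAST_ERR = 0.10   # assumed relative error of the 36 h forecast

forecast = actual * (1 + FORECAST_ERR * rng.standard_normal(HOURS))
committed = np.clip(forecast, 0.0, CAPACITY_MW)

delivered_firm = np.minimum(actual, committed)   # firm sales, capped by output
surplus = np.maximum(actual - committed, 0.0)    # extra sold at spot
shortfall = np.maximum(committed - actual, 0.0)  # under-delivery, penalized

rev_forecast = (delivered_firm * FIRM_PRICE
                + surplus * SPOT_PRICE
                - shortfall * PENALTY).sum()
rev_spot_only = (actual * SPOT_PRICE).sum()

print(f"value uplift from forecasting: {rev_forecast / rev_spot_only - 1:+.1%}")
```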
Transportation represents approximately 16% of global emissions. And unlike the energy sector, AI use cases in transportation have been producing measurable results for over a decade.
UPS Project ORION is perhaps the most cited and most verifiable example. Before the era of large language models, UPS developed a route optimization system that analyzed millions of possible combinations for each driver.
The result, documented in their annual reports: savings of over 100 million miles driven per year. That’s equivalent to approximately 100,000 tonnes of CO₂ avoided annually — just from UPS, just from route optimization.
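The arithmetic behind that equivalence is a one-liner; the emissions factor is an assumed round number in the plausible range for a diesel delivery truck, not a figure from UPS.

```python
miles_saved = 100_000_000     # miles/year, from UPS annual reports
KG_CO2_PER_MILE = 1.0         # assumed factor for a diesel delivery truck

tonnes_avoided = miles_saved * KG_CO2_PER_MILE / 1_000
print(f"~{tonnes_avoided:,.0f} t CO2 avoided per year")   # ~100,000
```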
Route optimization is only one of several documented impact vectors in transportation.
Steel, cement, chemicals. These are the three industrial sectors that concentrate the largest share of emissions that are difficult to eliminate with simple renewable energy sources. They require very high-temperature heat, chemical processes that emit CO₂ as an intrinsic byproduct, or both.
AI doesn’t solve these problems at their root — but it can compress them.
In November 2023, Google DeepMind published the results of the GNoME project (Graph Networks for Materials Exploration): 2.2 million new stable materials discovered through machine learning on molecular graphs.
For context: materials science had catalogued approximately 50,000 stable materials over the past 50 years of experimental research. GNoME multiplied that catalogue by 44 in a single project.
Among those materials are candidates for next-generation batteries, superconductors, and catalysts for low-emission industrial processes. Not all will reach production — but even if 0.1% are useful, that represents 2,200 new materials with industrial potential.
An electric arc furnace in a steel mill consumes between 350 and 400 kWh per tonne of steel. When it fails unexpectedly, the restart process consumes 2 to 4 times more energy than a normal cycle during the first hours.
AI-based predictive maintenance systems — which analyze vibration, temperature, current, and usage patterns — can reduce unplanned shutdowns by 30-50%. In a mid-sized steel mill producing 500,000 tonnes per year, that can be equivalent to 10,000-20,000 tonnes of CO₂ avoided per year.
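A parametrized back-of-envelope shows how those figures hang together. Every input below is an assumption to be replaced with real plant data, and the output is highly sensitive to the restart-penalty and shutdown-frequency estimates.

```python
# Back-of-envelope for CO2 avoided by cutting unplanned EAF shutdowns.
# Every value below is an assumption to be replaced with real plant data.
annual_steel_t = 500_000      # tonnes of steel per year
kwh_per_tonne = 375           # mid-range of the 350-400 kWh/t figure
shutdowns_per_year = 60       # unplanned events before predictive maintenance
reduction = 0.40              # within the claimed 30-50% range
penalty_mwh = 700             # extra energy per event: restarts at 2-4x
                              # normal cycles, scrapped heats, reheating
grid_kg_per_kwh = 0.6         # assumed grid carbon intensity

avoided = shutdowns_per_year * reduction
saved_mwh = avoided * penalty_mwh
saved_t_co2 = saved_mwh * 1_000 * grid_kg_per_kwh / 1_000
baseline_mwh = annual_steel_t * kwh_per_tonne / 1_000

print(f"~{saved_t_co2:,.0f} t CO2 avoided/yr "
      f"({saved_mwh / baseline_mwh:.1%} of furnace energy)")
```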
Agriculture and land use represent approximately 22% of global emissions when deforestation is included. And it’s one of the sectors where AI has the most diverse vectors of action.
Irrigation optimization: AI systems that combine weather data, soil moisture sensors, and satellite imagery are demonstrating reductions of 30-50% in water use in pilot projects in California, Spain, and Israel. The climate benefit is twofold: less energy for pumping, and less stress on aquifers that regulate the local hydrological cycle.
Early pest detection: Continuous crop monitoring with high-resolution satellite imagery, combined with computer vision models, enables detecting infestations in their early stages — when the treatment needed is a fraction of what a full-blown infestation would require. Pilots in coffee plantations in Colombia and maize fields in Kenya have demonstrated reductions of 40-60% in pesticide use.
Carbon sequestration measurement: One of the structural problems in the forest carbon credit market is the lack of objective verification. ML models trained on multispectral satellite imagery can estimate forest biomass — and therefore stored carbon — with accuracy comparable to field inventories, but at global scale and near-zero marginal cost.
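As an illustration of the last point, here is a minimal sketch of how such an estimator is typically structured: spectral features in, biomass out, then a carbon-fraction conversion. The synthetic training data and the band set are placeholders, not a production pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)

# Placeholder training set: per-plot reflectance features from
# multispectral imagery, with field-inventory biomass as ground truth.
# Real pipelines train on thousands of inventoried plots.
n = 500
red, nir, swir = rng.random(n), rng.random(n), rng.random(n)
ndvi = (nir - red) / (nir + red + 1e-9)           # standard vegetation index
X = np.column_stack([red, nir, swir, ndvi])
biomass_t_ha = 80 * ndvi + 20 * swir + rng.normal(0, 5, n)  # synthetic truth

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(X, biomass_t_ha)

# Predict biomass for new pixels, then convert to stored carbon.
CARBON_FRACTION = 0.47        # typical carbon share of dry biomass
carbon_t_ha = model.predict(X[:3]) * CARBON_FRACTION
print(carbon_t_ha)            # tonnes of carbon per hectare
```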
The best-known case of AI accelerating climate-relevant science is AlphaFold.
In 2022, DeepMind published the predicted structures of 200 million proteins — practically the entire known proteome of life on Earth. Work that would have required decades of X-ray crystallography became available in a public, free database in 18 months of compute.
The implications for climate are indirect but real: research into fourth-generation biofuels, enzymes that degrade plastics, and biological catalysts for industrial processes has accelerated measurably. Discovery cycles that used to take 10-15 years are now, in some cases, compressed to 3-5.
GraphCast, Google’s weather prediction model, operates at 10,000 times the speed of the traditional numerical models from the European Centre for Medium-Range Weather Forecasts (ECMWF), with comparable or superior accuracy on most parameters. More accurate weather forecasts don’t just save lives during extreme events — they also optimize the management of electrical grids with high renewable penetration.
Here the article shifts in tone. Because there’s an enormous gap between the documented potential and what’s actually happening.
“AI for climate” has become a marketing argument.
80% of projects presented as “green AI” in the ESG reports of large corporations are, in reality, internal process optimization — operational cost reductions that also reduce consumption, but that would have happened anyway with or without an explicit climate agenda.
The x4 or x10 ratio is only real under three conditions that are rarely met simultaneously:
1. AI is applied to genuinely high-impact use cases, not cosmetic optimizations.
There’s a difference between training a model to predict when to turn on office lights (marginal impact) and training a model to optimize energy dispatch on a regional electrical grid with 40% wind penetration (systemic impact; a toy sketch of that dispatch problem follows this list). Both count the same in many reports.
2. The energy powering the AI comes from renewable sources.
An AI trained on coal to then optimize a renewable grid has a worse net ratio than what appears in standard accounting. Compute has a geography — and that geography matters.
3. The savings are not neutralized by the rebound effect.
If AI reduces the cost of producing a good by 30%, demand for that good may increase enough that total consumption rises. This is the Jevons paradox applied to AI-driven efficiency. It’s not hypothetical: there’s preliminary evidence that efficiency gains in AI data center energy use have been partially offset by the explosion in demand for AI services.
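To make the first condition concrete, here is the toy dispatch sketch referenced above: one day of merit-order dispatch in which zero-marginal-cost wind is used first and gas fills the gap. The demand and wind profiles and the gas emissions factor are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

H = 24
demand = 1_000 + 200 * np.sin(np.linspace(0, 2 * np.pi, H))  # MW, toy profile
wind = 400 * rng.beta(2, 2, H)                               # MW available

GAS_T_CO2_PER_MWH = 0.4   # assumed emissions factor for gas backup

# Merit-order dispatch: zero-marginal-cost wind first, gas covers the rest.
wind_used = np.minimum(wind, demand)
gas_used = demand - wind_used
emissions = gas_used.sum() * GAS_T_CO2_PER_MWH

print(f"wind share: {wind_used.sum() / demand.sum():.0%}, "
      f"gas emissions: {emissions:,.0f} t CO2 over 24 h")
```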
Google’s 2024 environmental report says it plainly: emissions grew 48% relative to 2019, driven by the energy consumption of AI infrastructure — while their own optimization systems were already in production. The benefit and the cost coexist within the same company.
There’s a structural paradox in the current debate.
AI can be the most powerful tool for accelerating the energy transition. But if that same AI is powered by coal or gas, its own footprint grows faster than the benefits it generates in other sectors.
The big tech companies know this, and their response has been to contract renewable energy at unprecedented scale.
But renewable energy contracts are not the same as real-time renewable energy. A data center that signs a PPA (Power Purchase Agreement) with a wind farm still consumes the grid mix during moments when the wind isn’t blowing. Decarbonizing AI compute requires storage and dispatch solutions that don’t yet exist at the necessary scale.
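The gap between contracted and real-time renewable coverage is easy to quantify. In the toy example below, a PPA that matches 100% of a data center's load on an annual basis covers substantially less of it hour by hour; the wind profile is entirely illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

H = 24 * 365
load = np.full(H, 100.0)               # MW, flat data-center load
wind = 250.0 * rng.beta(2, 3, H)       # MW, toy PPA wind profile (mean ~100)

annual_match = wind.sum() / load.sum()   # what annual PPA accounting shows
hourly_match = np.minimum(wind, load).sum() / load.sum()  # real-time coverage

print(f"annual-matched: {annual_match:.0%}  hourly-matched: {hourly_match:.0%}")
```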
AI is neither inherently green nor inherently destructive. It’s an amplification tool. It amplifies what we’re already doing: if we point it at the climate problem with rigor, it can be the most powerful tool we have. If we use it to do more of the same, it only accelerates the problem.
If you’re a company or run operations: The financial and climate arguments converge: optimizing industrial consumption with AI reduces costs and emissions simultaneously. The first step isn’t an AI project — it’s having the data. Investment in instrumentation and telemetry is the prerequisite for any optimization system to have something to learn from.
If you work in energy or industry: The sectors with the greatest potential are electrical generation, heavy industry, and freight transportation. The use cases are documented and the models are accessible. The real barrier usually isn’t the technology — it’s data governance and organizational resistance to algorithmic control of critical systems.
If you’re an investor or ESG fund manager: The emissions reduction ratio per euro invested in AI applied to energy or industry exceeds other green investment categories. But it requires specific due diligence: what is the application sector? What is the reduction baseline? Is the AI infrastructure itself powered by renewable energy?
If you’re a citizen or climate activist: Both narratives — “AI is destroying the climate” and “AI will save the climate” — are equally incomplete. Demand the conditional version: AI can reduce emissions x4 if it’s applied in high-impact sectors, if it uses clean energy, if it’s governed to prevent the rebound effect.
Google’s 48% emissions increase since 2019 is real. And so is the 40% reduction in cooling consumption that DeepMind achieved with reinforcement learning. Both data points coexist. Understanding why — and under what conditions one can outweigh the other — is exactly the kind of thinking the climate crisis demands.
The data doesn’t say AI is good or bad for the climate. It says it can be decisive if we know where to point it.