[Cover illustration: a semi-open corporate safe with energy data hidden inside]

Why Don't AI Companies Want You to Know How Much Energy You Consume?

The 6 economic incentives that explain why energy opacity isn't an oversight — it's a strategy

By AISHA · March 5, 2026 · 7 min read


AI companies don't publish consumption data because doing so goes against their economic interests. There are 6 structural reasons: trade secrecy, regulatory evasion, ESG risk, narrative control, unfavorable comparisons, and legal liability for Scope 3 emissions. It's not an oversight — it's a Nash equilibrium where no player has an incentive to move first.

Key figures:

  • €151M: annual tech lobbying in Brussels
  • +55%: increase in lobbying spending since 2021
  • 890: full-time tech industry lobbyists in the EU
  • 6: economic incentives to maintain opacity

Reported emissions increase by major AI providers:

  • Google (2019-2023): +48%
  • Microsoft (since 2020): +23%
  • Anthropic: no data published
  • OpenAI: no data published
  • xAI: no data published

151 million euros per year. That’s what the tech industry spends on lobbying in the European Union — 55% more than in 2021. With 890 full-time lobbyists pressing in Brussels, the priority is clear: prevent any energy transparency regulation from advancing before the market consolidates.

This isn’t a conspiracy theory. It’s incentive economics. And it explains why, in an industry that moves more than half a trillion dollars annually in infrastructure, almost no one publishes how much energy what they sell actually consumes.


It’s not an oversight. It’s a Nash equilibrium

AI’s energy opacity isn’t a slip-up, nor a matter of “we’ll get to it when we have the data.” Providers already measure their consumption internally — they need it to size infrastructure, negotiate electricity contracts, and optimize costs.

The question isn’t whether they can publish the data. It’s why they choose not to.

The answer has six layers, and each one reinforces the others.


Incentive 1: Trade secrecy

Energy consumption data indirectly reveals model architecture and efficiency. If OpenAI published the real Wh of GPT-5.4 with reasoning, competitors could infer how many internal tokens it generates, how many layers it activates, and how its system scales.

The law protects this silence: 18 U.S.C. § 1905 in the U.S. penalizes the disclosure of trade secrets by public officials, and companies argue that their efficiency metrics are intellectual property. California tried to change this with AB-2013, but the scope remained limited.

The result: every company has a legal and competitive incentive not to be the first to publish.


Incentive 2: Regulatory evasion

If consumption data were public, regulation would be inevitable. And regulation has a cost.

The EU AI Act already includes energy transparency requirements in its Article 40, but the standards won’t be binding until August 2028. The European Digital Omnibus — presented as “reducing administrative burden” — has managed to delay the application of high-risk provisions until December 2027 and marginalize energy metrics.

In the United States, the situation is worse: more than 300 state legislative proposals on AI and no federal law. The “America’s AI Action Plan” prioritizes commercial and military competitiveness, not environmental impact.

While European regulation is delayed and American regulation doesn’t exist, companies operate in a perfect vacuum: no obligation to measure and no consequences for not doing so.


Incentive 3: ESG risk

There are more than 30 trillion dollars managed under ESG (Environmental, Social, Governance) criteria worldwide. Rating agencies like MSCI and Sustainalytics already evaluate tech companies on their emissions.

Publishing detailed consumption data per service would expose AI companies to scrutiny they’d prefer to avoid:

  • Google acknowledged that its total emissions grew 48% between 2019 and 2023, but attributes much of it to market-based accounting that uses clean energy certificates.

  • Microsoft admitted a 23.4% increase in emissions since 2020, with 97% concentrated in Scope 3 — the supply chain and use of its products.

  • OpenAI scored 23 out of 100 on the DitchCarbon environmental index.

If providers published real Wh per query, per image, per generated video, the “green AI” narrative would crumble within weeks. And with it, the valuation of companies that are publicly traded or aspire to be.


Incentive 4: Narrative control

The industry has learned from other sectors: once a visible consumption label exists, the consumer compares. And comparison changes purchasing behavior.

Imagine that every time you generated an image with Midjourney you saw: “This image consumed 3.2 Wh — equivalent to having an LED bulb on for 19 minutes.” Or that when generating a video with Runway you saw: “This 10-second clip consumed 260 Wh — equivalent to charging your smartphone 18 times.”
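The equivalences in those hypothetical labels are simple arithmetic. A minimal sketch, assuming a 10 W LED bulb and a roughly 14.4 Wh smartphone battery (both illustrative reference values, not provider figures):

```python
# Convert a task's energy cost (Wh) into everyday equivalents.
# Reference values are illustrative assumptions, not published figures.
LED_BULB_W = 10          # assumed draw of a typical LED bulb, watts
PHONE_BATTERY_WH = 14.4  # assumed smartphone battery capacity, Wh

def led_minutes(task_wh: float) -> float:
    """Minutes an LED bulb could stay on with the same energy."""
    return task_wh / LED_BULB_W * 60

def phone_charges(task_wh: float) -> float:
    """Full smartphone charges the same energy represents."""
    return task_wh / PHONE_BATTERY_WH

print(f"Image (3.2 Wh): {led_minutes(3.2):.0f} min of LED light")   # → 19 min
print(f"Video (260 Wh): {phone_charges(260):.0f} phone charges")    # → 18 charges
```

With these assumed reference values, the label figures in the examples above fall out directly.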

The calorie analogy is apt. When the food industry was required to label calories, the composition of food didn't change immediately. But what people chose to buy did. And that, in turn, changed what manufacturers produced.

AI needs its calorie label. And the companies know it.


Incentive 5: Unfavorable comparisons

If everyone published data, it would become clear who is efficient and who isn’t. And the difference is not marginal.

The data that does exist already shows stark disparities:

  • Gemini 2.5 Flash-Lite can respond to a text query with 0.05–0.12 Wh
  • GPT-5.4 with active reasoning consumes 4–18 Wh for the same task
  • DeepSeek-V3.2 operates at 0.08–0.18 Wh thanks to its MoE architecture

The difference between the most efficient and least efficient models on equivalent tasks can be 100x or more. Publishing this data would force less efficient providers to justify their prices, or to improve.

DeepSeek demonstrated that a competitive model can be offered at $0.028 per million tokens. If it were also public that it consumes a fraction of its competitors' energy, the competitive pressure would be unsustainable for those charging 50x more.
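A quick sanity check of that spread, restating the per-query ranges cited above:

```python
# Per-query energy ranges (Wh) as cited in this article.
ranges = {
    "Gemini 2.5 Flash-Lite": (0.05, 0.12),
    "GPT-5.4 (reasoning)":   (4.0, 18.0),
    "DeepSeek-V3.2":         (0.08, 0.18),
}

# Compare the best low end against the worst high end.
best_low = min(lo for lo, _ in ranges.values())
worst_high = max(hi for _, hi in ranges.values())
print(f"Worst-to-best spread: {worst_high / best_low:.0f}x")  # → 360x
```

Even comparing range midpoints rather than extremes, the gap stays well above two orders of magnitude, which is what makes a comparable public metric so uncomfortable for the laggards.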


Incentive 6: Legal liability for Scope 3 emissions

This is the incentive that worries legal departments the most. The European CSRD (Corporate Sustainability Reporting Directive) requires companies to report Scope 3 emissions — those generated in their value chain, including the services they contract.

If a European company uses ChatGPT for its daily operations and OpenAI published how many Wh each query consumes, that company would have to include that consumption in its sustainability report. And if that consumption turns out to be significant, it would face pressure to seek more efficient alternatives.
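The order of magnitude at stake is easy to sketch. A back-of-the-envelope Scope 3 estimate in which every input (headcount, query volume, per-query Wh, grid intensity) is an illustrative assumption, not a reported figure:

```python
# Back-of-the-envelope Scope 3 estimate for a company's AI usage.
# All inputs are illustrative assumptions for the sketch.
EMPLOYEES = 1_000
QUERIES_PER_DAY = 20        # per employee, assumed
WORK_DAYS = 220             # working days per year
WH_PER_QUERY = 5.0          # assumed mid-range for a reasoning model
GRID_KG_CO2_PER_KWH = 0.4   # rough grid carbon intensity, assumed

annual_kwh = EMPLOYEES * QUERIES_PER_DAY * WORK_DAYS * WH_PER_QUERY / 1000
annual_tco2 = annual_kwh * GRID_KG_CO2_PER_KWH / 1000
print(f"{annual_kwh:,.0f} kWh/year, roughly {annual_tco2:.1f} tCO2e")
```

Under these assumptions the result is on the order of 22,000 kWh and about 9 tCO2e per year: small per query, but a line item that would have to appear in a CSRD sustainability report once the per-query figure were public.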

AI providers know this. Publishing consumption data would trigger a cascade of legal liability along the entire value chain. Keeping the vacuum in place is more convenient.


The Memphis case: opacity with real consequences

If the economic incentives seem abstract, Memphis is the concrete case.

xAI — Elon Musk’s AI company — installed its Colossus campus in Memphis, Tennessee, with 35 gas turbines operating without the required environmental permits. The result:

  • 1,200–2,000 tons of NOx emitted per year
  • 1 million gallons of water consumed per day
  • $30–44 million in estimated annual health damages to the Boxtown community
  • The NAACP and the Southern Environmental Law Center have filed a lawsuit

The EPA intervened in January 2026, but the turbines continue operating. The affected community is predominantly African American and low-income.

Energy opacity isn’t just a data problem. It’s an environmental justice problem. When there’s no transparency, the consequences are paid by those with the least capacity to demand accountability.


Google proved it can be done

It’s worth repeating: Google published its figure of 0.24 Wh per median Gemini query. It did so with reviewable methodology, in a public paper, with replicable data.

It didn’t lose market share. It didn’t lose investors. It didn’t trigger a regulatory crisis.

What it demonstrated is that transparency is viable and that the argument that publishing data would harm business is false. What harms is opacity — because it erodes trust and delays regulation that will come eventually anyway.

The rest of the industry can do the same. It chooses not to.


What can I do?

  • If you’re a user: Choose providers that publish consumption data. Google and open source models measured by Hugging Face are the most transparent options today. When you use opaque services, do so knowing that you’re consuming an amount of energy that no one wants you to know about.

  • If you run a company: Your obligation under CSRD includes Scope 3. Demand an energy consumption datasheet per service from your AI providers. If they don’t provide one, document it — it will be relevant in your ESG audit.

  • If you’re a developer: Prioritize models with public efficiency data. Use flash/mini/lite for routine tasks. Every call to a frontier model without necessity is an unjustified energy cost that you can’t measure.

  • If you work in regulation: Article 40 of the EU AI Act is a start, but 2028 is too late. The measurement technology exists today. What’s missing is political will — and the pressure from 890 lobbyists working to ensure that will never arrives.
