Some facts about AI energy consumption
AI is all about the GPU, so it's worth knowing roughly how much power (electricity) is required to keep this technology alive and to fuel the curiosity and needs of the users of platforms such as ChatGPT, i.e. people like you. We've tried to break this down into the simplest terms possible, with equivalent amounts of electricity for comparison.
- NVIDIA's GB300 GPU draws approx. 1.4 kW, i.e. roughly 12.26 MWh per year of continuous running.
- By the end of 2025, 150,000 GB300 GPUs had (allegedly) been purchased.
- The power required to run these GPUs is around 210 MW continuously, roughly 1,840 GWh per year.
- The average UK household consumes 2.7 MWh per annum.
- That's roughly 681,333 households, more than a city the size of Stoke-on-Trent.
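The fleet arithmetic above can be checked in a few lines of Python. The constants are simply the figures quoted above (not independently sourced), and the calculation assumes every GPU runs flat out all year:

```python
# Back-of-envelope check of the GB300 fleet figures.
# All inputs are the figures quoted in the text, assumed 24/7 operation.
GPU_POWER_KW = 1.4          # approx. draw of one GB300 GPU
HOURS_PER_YEAR = 24 * 365   # 8,760
FLEET_SIZE = 150_000        # GPUs reportedly purchased by end of 2025
UK_HOUSEHOLD_MWH = 2.7      # average UK household consumption per annum

gpu_mwh_year = GPU_POWER_KW * HOURS_PER_YEAR / 1_000    # MWh per GPU per year
fleet_mw = GPU_POWER_KW * FLEET_SIZE / 1_000            # continuous fleet draw
fleet_gwh_year = fleet_mw * HOURS_PER_YEAR / 1_000      # annual fleet energy
households = fleet_gwh_year * 1_000 / UK_HOUSEHOLD_MWH  # household equivalent

print(f"{gpu_mwh_year:.2f} MWh per GPU per year")   # ~12.26
print(f"{fleet_mw:.0f} MW continuous fleet draw")   # ~210
print(f"{fleet_gwh_year:,.0f} GWh per year")        # ~1,840
print(f"{households:,.0f} UK households")           # ~681,333
```

Swapping in your own GPU count and power draw gives the same household-equivalent figure for any deployment you're sizing.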

Another way to look at this is the power required to answer user prompts, i.e. each query entered into ChatGPT, Gemini etc. Research from engineers at Google Brain calculated the following:
- The average prompt consumes 6.79 Wh.
- There are now 800 million active ChatGPT users per week.
- At one prompt per user per week, this equates to roughly 282.5 GWh per annum.
- Enough to power 104,616 households per annum, e.g. the city of Reading in the UK.
- And of course, there's also Gemini, Claude, Copilot, Apple Intelligence, Meta etc.
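The prompt-energy figures reproduce the same way. Note the implied assumption (one prompt per weekly active user) that is needed to reach 282.5 GWh from the quoted inputs:

```python
# Reproducing the prompt-energy arithmetic from the text.
# Assumes one prompt per weekly active user, which is what
# the quoted 282.5 GWh figure implies.
WH_PER_PROMPT = 6.79          # average energy per prompt
WEEKLY_USERS = 800_000_000    # active ChatGPT users per week
WEEKS_PER_YEAR = 52
UK_HOUSEHOLD_MWH = 2.7        # average UK household, per annum

gwh_year = WH_PER_PROMPT * WEEKLY_USERS * WEEKS_PER_YEAR / 1e9  # Wh -> GWh
households = gwh_year * 1_000 / UK_HOUSEHOLD_MWH

print(f"{gwh_year:.1f} GWh per year")      # ~282.5
print(f"{households:,.0f} UK households")  # ~104,616
```

Heavy users obviously send far more than one prompt a week, so treat this as a floor rather than an estimate of total usage.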

We hope these statistics help you gauge the energy impact you need to consider when utilising AI in your own organisation. Will you buy the compute power as an asset and deploy it in your own data centre, or rent it from a cloud service provider? Either way, you need to do the sums.