Some facts about GenAI energy consumption
Generative AI runs on the GPU, so it's worth knowing exactly how much power (electricity) is required to keep this technology alive and to serve the curiosity and needs of users of platforms such as ChatGPT, i.e. people like you. We've tried to break this down into the simplest terms possible, to better illustrate the equivalent amounts of electricity used.
- The expected number of NVIDIA H100 GPUs to be sold in 2024 is estimated at 3.5 million.
- Each GPU, at 61% utilisation, uses 3,740 kWh per annum.
- Therefore, the total annual electricity consumption for all GPUs sold in 2024 is:
  - 13,090,000,000 kWh
  - 13,090,000 MWh
  - 13,090 GWh
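If you want to check the arithmetic yourself, the fleet-wide total can be reproduced in a few lines of Python. This is a minimal sketch using the two estimates above; the variable names are our own.

```python
# Fleet-wide annual consumption of H100s sold in 2024, from the estimates above.
H100_UNITS_2024 = 3_500_000      # estimated NVIDIA H100 GPUs sold in 2024
KWH_PER_GPU_YEAR = 3_740         # annual consumption per GPU at 61% utilisation

total_kwh = H100_UNITS_2024 * KWH_PER_GPU_YEAR
print(f"{total_kwh:,.0f} kWh")              # 13,090,000,000 kWh
print(f"{total_kwh / 1e3:,.0f} MWh")        # 13,090,000 MWh
print(f"{total_kwh / 1e6:,.0f} GWh")        # 13,090 GWh
```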
- The average household (2.36 people) consumes 12,000 kWh per annum.
- The electricity required to power all GPUs sold in 2024, utilised at just under two-thirds capacity, equates to 1,090,833 households.
- That is a population of 2,574,367, the same as Chicago, IL: a scale that is admittedly difficult to comprehend.
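The household and population equivalents follow the same way. This sketch assumes the fleet-wide total from the previous step and the household averages quoted above.

```python
# Household and population equivalents of the fleet-wide total.
TOTAL_KWH = 13_090_000_000       # annual consumption of all H100s sold in 2024
KWH_PER_HOUSEHOLD = 12_000       # average annual household consumption
PEOPLE_PER_HOUSEHOLD = 2.36      # average household size

households = TOTAL_KWH / KWH_PER_HOUSEHOLD
people = households * PEOPLE_PER_HOUSEHOLD
print(f"{households:,.0f} households")      # 1,090,833 households
print(f"{people:,.0f} people")              # 2,574,367 people
```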
Another way to look at this is the energy required per prompt, i.e. each query entered into ChatGPT, Gemini and the like. Research from engineers at Google Brain produced the figures below:
- The average prompt consumes 6.79 Wh.
- There are 180.5 million users of ChatGPT alone.
- If each of these users entered one prompt per day, that would equate to 446,103,000 kWh per annum (see the sketch after this list).
- That is enough to power 37,175 households per annum (87,734 people).
- And that's just a single application using GenAI compute power. There are many future releases on a similar scale or even larger, e.g. Meta's General Intelligence.
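The per-prompt sums can be sanity-checked the same way. This is a minimal sketch using the bullet figures above; any small difference from the quoted annual total is down to rounding of the user count.

```python
# Annual energy for one prompt per ChatGPT user per day, from the figures above.
WH_PER_PROMPT = 6.79             # average energy per prompt (Wh)
USERS = 180_500_000              # ChatGPT user base
PROMPTS_PER_USER_PER_DAY = 1
DAYS_PER_YEAR = 365
KWH_PER_HOUSEHOLD = 12_000       # average annual household consumption
PEOPLE_PER_HOUSEHOLD = 2.36      # average household size

kwh_per_year = WH_PER_PROMPT * USERS * PROMPTS_PER_USER_PER_DAY * DAYS_PER_YEAR / 1_000
households = kwh_per_year / KWH_PER_HOUSEHOLD
print(f"{kwh_per_year:,.0f} kWh per annum")                 # ~447,342,175 kWh
print(f"{households:,.0f} households")                      # ~37,279 households
print(f"{households * PEOPLE_PER_HOUSEHOLD:,.0f} people")   # ~87,977 people
```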
We hope some of these statistics help you weigh up the energy impact of utilising Generative AI in your own organisation. Will you buy the compute power as an asset and deploy it in your own data centre? Or will you rent it from a Cloud Service Provider? Either way, you need to do the sums.