Reducing the footprint

If you’re looking to reduce your energy footprint, here are some tips.

Use specialized models

If you need a simple answer, use a web search engine like Google rather than ChatGPT. To translate text, use a dedicated translation app such as Google Translate or DeepL Translator. And instead of generating an image, which uses about 60 times more energy than text, use existing stock images.

Prefer traditional web search engines

They consume less energy per query: a Google search consumes about 0.3 Wh¹, roughly 10 times less than a ChatGPT query. In addition, traditional search engines like Bing or Google provide many more results to compare. This helps you stay critical of AI-generated answers, since an AI typically limits its external references to three or four links.

Go directly to reliable sources

If you need an explanation or definition of a concept, term, or idea, Wikipedia may have a ready-made answer. At the bottom of each Wikipedia page, you can find links to sources and more detailed information about the concept being described. Often, you’ll see that ChatGPT, Gemini, or others use Wikipedia as a source for definitions. So why not go directly there, thus minimizing energy consumption!

Some figures

A single query made with ChatGPT version 4 consumes an estimated 2.9 Wh². With an average of 10 million daily queries³, this represents an annual consumption of 10,585 MWh, equivalent to the annual electricity consumption of approximately 3,024 Belgian households (assuming an average consumption of 3.5 MWh/year per Belgian household⁴).
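The arithmetic behind these figures can be checked with a short calculation; the per-query energy, daily query count, and household consumption are the estimates cited in this article, not measured values:

```python
# Estimated annual energy use of ChatGPT text queries (figures from the article).
WH_PER_QUERY = 2.9             # Wh per ChatGPT 4 query (external estimate)
QUERIES_PER_DAY = 10_000_000   # assumed average daily queries
HOUSEHOLD_MWH_PER_YEAR = 3.5   # average Belgian household (CREG data)

annual_mwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1_000_000  # Wh -> MWh
households = annual_mwh / HOUSEHOLD_MWH_PER_YEAR

print(f"{annual_mwh:,.0f} MWh/year")              # 10,585 MWh/year
print(f"≈ {households:,.0f} Belgian households")  # ≈ 3,024 households
```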

Please note that the above figures concern only chat with a textual output. When you ask one of these AI tools to generate an image, it can consume up to 60 times more energy than a “simple” text chat.

But worse are the figures related to energy consumption during the training of Large Language Models (LLMs), the heart of AI software. An estimated 1,287 MWh was needed to train ChatGPT 3, roughly an eighth of the annual consumption from user queries, or the annual consumption of 368 Belgian households. Training is carried out several times a year; this varies from company to company, but is generally done about twice a year. Current LLM and neural-network technology does not allow incremental updates: retraining a model means discarding all previous training and starting from scratch.
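The same per-household conversion applies to the training figure; both input numbers below are the estimates cited in this article:

```python
# Training energy vs. yearly query consumption, using the article's figures.
TRAINING_MWH = 1287            # estimated energy to train ChatGPT 3
ANNUAL_QUERY_MWH = 10585       # the article's annual query-consumption figure
HOUSEHOLD_MWH_PER_YEAR = 3.5   # average Belgian household (CREG data)

training_households = TRAINING_MWH / HOUSEHOLD_MWH_PER_YEAR
ratio = ANNUAL_QUERY_MWH / TRAINING_MWH

print(f"≈ {training_households:.0f} households")  # ≈ 368 households
print(f"queries/training ratio ≈ {ratio:.1f}")    # ≈ 8.2
```

The ratio shows why "slightly less than the annual query consumption" would be misleading: one training run costs about an eighth of a year's worth of queries under these estimates.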

The future

However, these figures should be taken with a grain of salt. They are essentially estimates made outside the companies producing AI applications, and such estimates necessarily rest on assumptions. Those assumptions are often maximalist, for example that servers run at full capacity all the time on power-hungry Nvidia A100 processors. Studies of this kind arrive at the figure of 2.9 Wh per ChatGPT 4 query mentioned at the beginning of this article.

On the other hand, a recent study based on different assumptions proposes a value and ratio quite different from those indicated above, which nevertheless remain the most commonly accepted. This study, conducted by Epoch AI⁵, assumes the use of Nvidia H100 processors, which are more efficient, and also takes into account servers running not at full capacity but at only about 70% of it. In this evaluation, the consumption induced by a ChatGPT 4 query falls to 0.3 Wh.
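Reworking the earlier back-of-the-envelope estimate with the Epoch AI figure, and keeping the article's assumed 10 million daily queries, shows how far the total drops:

```python
# Revised annual estimate under the Epoch AI figure of 0.3 Wh per query,
# keeping the article's assumed daily query count and household average.
WH_PER_QUERY = 0.3             # Wh per ChatGPT 4 query (Epoch AI estimate)
QUERIES_PER_DAY = 10_000_000   # assumed average daily queries
HOUSEHOLD_MWH_PER_YEAR = 3.5   # average Belgian household (CREG data)

annual_mwh = WH_PER_QUERY * QUERIES_PER_DAY * 365 / 1_000_000  # Wh -> MWh
households = annual_mwh / HOUSEHOLD_MWH_PER_YEAR

print(f"{annual_mwh:,.0f} MWh/year")   # 1,095 MWh/year
print(f"≈ {households:.0f} households")  # ≈ 313 households
```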

And suddenly, the 10-to-1 ratio in favor of Google, which was accepted in almost all publications until now, disappears.

But let’s be honest: the figure for the energy consumption of a Google search dates back to 2009. Since then, Google has loudly proclaimed that it improves its energy efficiency day after day. It is a safe bet that a Google search consumes much less energy today.

Conclusions

With the launch of DeepSeek R1 and its low-cost, energy-efficient model, we are likely to see a race to reduce energy consumption. Current projections suggest a downward trend in energy consumption.

Even so, these are still just hypotheses, or even hypotheses about what hypotheses to use!

Since the AI tech giants don’t publish actual consumption figures themselves, we have only estimates from external experts. And it is only on the basis of these estimates that we can judge the impact on the planet of our increasingly frequent use of AI tools.

  1. https://www.rwdigital.ca/blog/how-much-energy-do-google-search-and-chatgpt-use/
  2. https://www.rwdigital.ca/blog/how-much-energy-do-google-search-and-chatgpt-use/
  3. ChatGPT general facts
  4. CREG data, via Wikipower (in French)
  5. https://epoch.ai/gradient-updates/how-much-energy-does-chatgpt-use