Misconception About ChatGPT's Energy Consumption Debunked: ChatGPT Does Not Use 10 Times More Energy Per Query Than a Google Search, as Initially Believed
In the realm of AI, one claim keeps making the rounds in headlines, reports, and social media: that ChatGPT guzzles ten times more electricity per question than a Google search. Seems plausible, right? I mean, it can write sonnets, debug code, and explain quantum mechanics, all in a friendly conversation!
But hold your horses, partner. Let's take a closer look.
According to both independent research and the big cheese himself, OpenAI CEO Sam Altman, the latest ChatGPT models burn through roughly 0.3 watt-hours per query. That's the same figure Google reported for its average search back in 2009, the last time the company shared such data.
Brush Up on the Past
The 10-to-1 energy comparison between ChatGPT and Google searches likely traces back to a 2023 estimate by data scientist Alex de Vries, who put a single ChatGPT query at roughly 3 watt-hours of electricity. The typical energy cost of a Google search, meanwhile, was usually cited as 0.3 watt-hours, a figure Google published in 2009.
This comparison clearly deserves a second look, because both numbers rest on shaky data.
Google's 2009 estimate came from an era before smartphones were everywhere; the internet was a different beast then. And while Google's data centers have become far more efficient over the years, Google now runs AI-generated answers on almost every query, bumping organic search results to the side.
As for ChatGPT, the models, hardware, and deployment systems have all evolved rapidly in the last year.
Time for a New Standard
Recent work by researchers at Epoch.ai suggests that the average ChatGPT query on OpenAI's GPT-4o model requires only about 0.3 watt-hours of energy. That's roughly a tenth of the widely cited 3 watt-hour estimate!
Even Sam Altman, OpenAI's CEO, has echoed this value in his recent essay, "The Gentle Singularity," noting that the average query uses about 0.34 watt-hours, comparable to what an oven draws in a little over a second or an efficient LED bulb in a couple of minutes.
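The oven and lightbulb comparisons are easy to sanity-check with back-of-the-envelope arithmetic. The sketch below assumes a typical 1,200 W oven element and a 10 W LED bulb; those wattages are illustrative assumptions, not figures from the essay.

```python
QUERY_WH = 0.34  # watt-hours per ChatGPT query, per Altman's essay

# Assumed appliance power draws (illustrative, not from the essay).
OVEN_WATTS = 1200
LED_WATTS = 10

# Energy (Wh) = power (W) x time (h), so time = energy / power.
oven_seconds = QUERY_WH / OVEN_WATTS * 3600  # convert hours to seconds
led_minutes = QUERY_WH / LED_WATTS * 60      # convert hours to minutes

print(f"A {OVEN_WATTS} W oven uses 0.34 Wh in about {oven_seconds:.1f} seconds")
print(f"A {LED_WATTS} W LED uses 0.34 Wh in about {led_minutes:.1f} minutes")
```

With these assumed wattages, the query works out to about one second of oven use or about two minutes of LED use, which matches the essay's framing.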
It's well past time for a revised benchmark. Let's get this show on the road!
Key Takeaways
- The 10-to-1 energy comparison likely originates from a 2023 estimate by data scientist Alex de Vries, who put a ChatGPT query at 3 watt-hours, versus the commonly cited 0.3 watt-hours for a Google search.
- The data behind that comparison is questionable: Google's 0.3 watt-hour figure dates to 2009, before smartphones were everywhere, and while its data centers have grown more efficient, Google now runs AI-generated answers on almost every query, pushing organic results aside.
- More recent research by Epoch.ai puts the average ChatGPT query on the GPT-4o model at only about 0.3 watt-hours, roughly a tenth of the widely cited 3 watt-hour estimate. This value has also been reported by OpenAI's CEO, Sam Altman, in his essay "The Gentle Singularity."