The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could push energy usage as high as 45 GWh per day.
A daily energy use of 45 GWh is enormous. Spread over 24 hours, that is an average draw of roughly 1.9 GW. A typical modern nuclear power plant produces between 1 and 1.6 GW of electricity per reactor, so data centers running OpenAI’s GPT-5 at 18 Wh per query could require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
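The arithmetic above is easy to sanity-check. A minimal sketch, using only the figures quoted in this thread (18 Wh per query, 2.5 billion queries per day, 1 to 1.6 GW per reactor); everything else is plain unit conversion:

```python
# Figures quoted in the thread (not independently verified here)
WH_PER_QUERY = 18            # estimated GPT-5 energy per query, in Wh
QUERIES_PER_DAY = 2.5e9      # reported daily ChatGPT requests
REACTOR_GW_LOW = 1.0         # typical modern reactor output range, in GW
REACTOR_GW_HIGH = 1.6

# Total daily energy: Wh -> GWh (divide by 1e9)
daily_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9

# Average continuous power draw: GWh/day -> GW
avg_gw = daily_gwh / 24

# Reactor equivalents at the low and high ends of reactor output
reactors_min = avg_gw / REACTOR_GW_HIGH
reactors_max = avg_gw / REACTOR_GW_LOW

print(f"{daily_gwh:.1f} GWh/day, {avg_gw:.3f} GW average draw")
print(f"equivalent to {reactors_min:.1f} to {reactors_max:.1f} reactors")
```

Running it gives 45.0 GWh/day and about 1.9 GW of average draw, which is where the "two to three reactors" claim comes from.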
Response type is decided by ChatGPT’s new routing function based on your input. So yes: asking it to “think long and hard”, which I have seen people advocating recently to get better results, will trigger the thinking model and waste more resources.
So instead of just saying “thank you” I now have to say “think long and hard about how much this means to me”?
Once it generates a response, there is a button you can click to regenerate it with the reasoning model.
Why they did it that way instead of giving users the option to set the model they want ahead of time boggles the mind. Surely it would be more efficient to let users choose a model up front, rather than generating a response that is going to be thrown away and regenerated with the desired model.
FFS, I have been using Claude to code. Not only do you have to tell Claude to fix compilation errors, you also have to point out when Claude says “it’s fixed”: “No, it’s not, the function you said you added is STILL missing.”
You’re absolutely right!
If you want it to really use a lot of energy on receiving your gratitude, sure I guess^^