The University of Rhode Island’s AI lab estimates that GPT-5 averages just over 18 Wh per query, so putting all of ChatGPT’s reported 2.5 billion requests a day through the model could push energy usage as high as 45 GWh per day.

A daily energy use of 45 GWh is enormous. A typical modern nuclear power plant produces between 1 and 1.6 GW of electrical power per reactor, and 45 GWh spread over 24 hours works out to an average draw of roughly 1.9 GW. Data centers running OpenAI’s GPT-5 at 18 Wh per query could therefore require the output of two to three nuclear reactors, an amount that could be enough to power a small country.
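The arithmetic above can be checked with a quick back-of-the-envelope sketch. The 18 Wh/query and 2.5 billion queries/day figures are the article’s; everything else is unit conversion:

```python
# Figures from the article (estimates, not measured values)
WH_PER_QUERY = 18            # URI AI lab estimate for GPT-5, watt-hours per query
QUERIES_PER_DAY = 2.5e9      # ChatGPT's reported daily request volume

# Total daily energy in gigawatt-hours
daily_energy_gwh = WH_PER_QUERY * QUERIES_PER_DAY / 1e9
print(f"Daily energy: {daily_energy_gwh:.0f} GWh")        # 45 GWh

# Average continuous power draw implied by that daily energy
avg_power_gw = daily_energy_gwh / 24
print(f"Average draw: {avg_power_gw:.2f} GW")             # 1.88 GW

# Equivalent number of 1-1.6 GW nuclear reactors
print(f"Reactors: {avg_power_gw / 1.6:.1f} to {avg_power_gw / 1.0:.1f}")
```

Note that this assumes the 18 Wh figure holds for every query and ignores datacenter overhead (cooling, networking), so it is an upper-bound sketch rather than a measurement.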

  • joonazan@discuss.tchncs.de · 2 days ago

    My guess would be that the desktop computer used to make the queries and read the results consumes more power than the LLM itself, at least for models that answer quickly.

    The expensive part is training a model, but usage is most likely not sold at a loss, so it can’t be consuming an unreasonable amount of energy.

    Instead of this ridiculous energy argument, we should focus on the fact that AI (and other products that money is thrown at) isn’t actually that useful, but companies control the narrative. AI is particularly successful here, with every CEO wanting in on it and people afraid it’s so good it will end the world.