
"The University of Rhode Island based its report on its estimates that producing a medium-length, 1,000-token GPT-5 response can consume up to 40 watt-hours (Wh) of electricity, with an average just over 18.35 Wh, up from 2.12 Wh for GPT-4. This was higher than all other tested models, except for OpenAI's o3 (25.35 Wh) and Deepseek's R1 (20.90 Wh)."

https://www.tomshardware.com/tech-industry/artificial-intell...

https://app.powerbi.com/view?r=eyJrIjoiZjVmOTI0MmMtY2U2Mi00Z...

https://blog.samaltman.com/the-gentle-singularity



These numbers don't pass a sanity check for me. With 4x300 W cards you can get a 1,000-token DeepSeek R1 output in about 10 seconds. That's just 3.3 Wh, right? And that's before you even consider batching.
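The comment's arithmetic can be checked in a few lines. All figures here are the comment's own assumptions (4 GPUs at 300 W each, ~10 seconds to generate a 1,000-token response), not measured values:

```python
# Back-of-envelope energy check, using the assumptions stated in the comment:
# 4 GPUs drawing 300 W each, ~10 s to produce a 1,000-token output.
num_gpus = 4
watts_per_gpu = 300          # W, assumed board power under load
seconds_per_response = 10    # s, assumed generation time for 1,000 tokens

total_watts = num_gpus * watts_per_gpu                      # 1200 W
watt_hours = total_watts * seconds_per_response / 3600      # W·s -> Wh
print(f"{watt_hours:.2f} Wh per response")                  # ~3.33 Wh
```

At roughly 3.3 Wh per response, and divided further across concurrent requests when batching, the figure sits well below the 20.90 Wh the report attributes to DeepSeek R1.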



