In a recent blog post, OpenAI CEO Sam Altman shed light on a fascinating yet often overlooked topic: the actual energy and water consumption of AI models like ChatGPT.
🧠 AI Intelligence vs. Energy Consumption
Altman shared that each ChatGPT query consumes about 0.34 watt-hours of electricity. To give you some perspective, that’s about the same amount of energy your oven would use in just over one second, or what a modern LED bulb consumes in a couple of minutes. Pretty efficient when you think about the scale of information processing happening behind the scenes.
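Those comparisons are easy to sanity-check yourself. Here is a quick back-of-the-envelope calculation in Python; the 1,200 W oven and 10 W LED figures are assumed typical appliance wattages, not numbers from Altman's post:

```python
# Sanity-check the per-query energy figure (0.34 Wh) against
# everyday appliances. Assumed wattages: 1,200 W oven, 10 W LED bulb.
QUERY_WH = 0.34
query_joules = QUERY_WH * 3600           # 1 Wh = 3,600 J -> 1,224 J

oven_watts = 1200
oven_seconds = query_joules / oven_watts     # ~1.02 s of oven use

led_watts = 10
led_minutes = QUERY_WH / led_watts * 60      # ~2.04 min of LED light

print(f"Oven equivalent: {oven_seconds:.2f} s")
print(f"LED equivalent:  {led_minutes:.2f} min")
```

With those assumptions, one query really does equal about one second of oven time or roughly two minutes of LED light, matching the comparison above.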
💧 What About Water?
Surprisingly, every ChatGPT query uses about one-fifteenth of a teaspoon of water. That’s roughly 0.000085 gallons per query, just a few drops, used mostly to cool the powerful data centers that keep these models running.
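The gallons figure follows from a simple unit conversion (a US teaspoon is about 4.93 mL and a US gallon about 3,785 mL):

```python
# Convert one-fifteenth of a US teaspoon into gallons.
ML_PER_TSP = 4.92892       # millilitres in one US teaspoon
ML_PER_GALLON = 3785.41    # millilitres in one US gallon

water_ml = ML_PER_TSP / 15            # ~0.33 mL per query
water_gallons = water_ml / ML_PER_GALLON
print(f"{water_gallons:.6f} gallons per query")  # ~0.000087
```

The result (about 0.000087 gallons) lands within rounding distance of the 0.000085 figure quoted above.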
⚡ AI Getting Cheaper and Greener
Sam Altman also made an interesting prediction: the cost of artificial intelligence will eventually fall close to the cost of electricity itself. This could mean a future where AI tools become more accessible, affordable, and eco-friendly.
🌍 As the world moves towards sustainable technology, these developments could play a vital role in shaping how we use AI in everyday life—from generating creative content to solving complex global challenges.
✅ Stay tuned with Saatpro for more insightful updates on AI, tech trends, and future innovations.