- AI chatbots and video generators consume large amounts of energy and water
- Generating a five-second AI video uses as much energy as running a microwave for an hour or more
- Data-center energy use has doubled since 2017, and AI could account for half of it by 2028
It takes only a few minutes in a microwave to explode an unvented potato, but for an AI model to produce a five-second video of a potato exploding, it needs as much energy as that microwave running for over an hour: enough for more than a dozen potato explosions.
A new study covered by MIT Technology Review lays out just how energy-hungry popular AI models are. A basic chatbot response might use as little as 114 joules or as much as 6,700 joules, between half a second and eight seconds in a standard microwave. But it's when things go multimodal that energy costs soar, to more than an hour in the microwave, or 3.4 million joules.
It is not a new revelation that AI uses a lot of energy, but the MIT work puts the math in stark terms. The researchers sketched out what a typical session with an AI chatbot might look like: asking 15 questions, requesting 10 AI-generated images, and adding requests for three different five-second videos.
You can watch a realistic fantasy film scene that looks like it was shot in your backyard a minute after asking for it, but you will never notice the enormous amount of electricity it took to produce. That session consumed roughly 2.9 kilowatt-hours, or three and a half hours of microwave time.
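The "microwave time" comparisons above are just energy divided by power. Here is a minimal sketch of that conversion; the ~830 W microwave power is an assumption I chose so that the article's 2.9 kWh session lands near its stated "three and a half hours" (the article does not specify the wattage it used):

```python
MICROWAVE_WATTS = 830  # assumed microwave power draw; not stated in the article
KWH_TO_J = 3.6e6       # 1 kilowatt-hour = 3.6 million joules

def microwave_seconds(joules: float) -> float:
    """Seconds a microwave would need to run to consume the same energy."""
    return joules / MICROWAVE_WATTS

chat_high = microwave_seconds(6_700)           # high-end chatbot reply
video = microwave_seconds(3.4e6)               # one five-second AI video
session = microwave_seconds(2.9 * KWH_TO_J)    # the full 15-question session

print(f"chatbot reply: {chat_high:.1f} s")     # ~8 seconds
print(f"5 s AI video:  {video / 60:.0f} min")  # over an hour
print(f"full session:  {session / 3600:.1f} h")
```

Under that assumed wattage, the single video works out to roughly 68 minutes of microwave time and the session to about 3.5 hours, consistent with the figures quoted above.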
What makes the costs of AI stand out is how painless it all feels from the user's perspective. You are not budgeting prompts the way we all budgeted text messages 20 years ago.
AI energy rethink
Of course, you are not mining Bitcoin, and your video at least has some real-world value, but that is a very low bar to clear when it comes to ethical energy use. The energy demands of data centers are also rising at a ridiculous pace.
Data-center energy use had plateaued before the recent AI boom, thanks to efficiency gains. Since 2017, however, it has doubled, and AI will account for around half of it by 2028, according to the report.
This is not a guilt trip, by the way. I can claim professional demands for some of my AI use, but I have also used it for all kinds of recreational fun and to help with personal tasks. I would write an apology to the people who work in data centers, but I would need it translated into the languages spoken where some of those data centers are located. And I don't want to sound heated, or at least not as heated as those same servers. Some of the largest data centers use millions of gallons of water daily just to stay cool.
The developers behind AI infrastructure understand what is happening. Some are pursuing cleaner energy options; Microsoft, for instance, is looking to strike deals with nuclear plants. AI may or may not be an integral part of our future, but I would like that future not to be full of extension cords and boiling rivers.
At the individual level, your use or avoidance of AI will not make a big difference, but pushing data-center owners toward better energy solutions could. The most optimistic path forward involves more energy-efficient chips, better cooling systems, and greener energy sources. And perhaps AI's carbon footprint should be discussed like any other energy-hungry infrastructure, such as transport or food systems. If we are willing to debate the sustainability of almond milk, we can surely spare a thought for the 3.4 million joules it takes to make a five-second video of a cartoon almond.
As tools such as ChatGPT, Gemini, and Claude become smarter, faster, and more embedded in our lives, the pressure on energy infrastructure will only grow. If that growth happens without planning, we will end up trying to cool a supercomputer with a paper fan while chewing on a raw potato.