The potential of artificial intelligence is immense, but so is its energy consumption, which needs to be curbed, and asking shorter questions is one way to achieve that, according to a UNESCO study presented on Tuesday.
A combination of shorter queries and the use of more specialized models could cut AI's energy consumption by up to 90% without sacrificing performance, UNESCO said in a report published to mark the AI for Good Global Summit in Geneva.
OpenAI CEO Sam Altman recently revealed that each request sent to its popular ChatGPT application consumes an average of 0.34 Wh of electricity, between 10 and 70 times as much as a Google search.
With ChatGPT receiving around one billion requests per day, that amounts to 310 GWh per year, equivalent to the annual electricity consumption of three million people in Ethiopia, for example.
In addition, UNESCO calculated that AI's energy demand is doubling every 100 days as generative tools become embedded in everyday life.
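As a rough back-of-the-envelope sketch (not part of the UNESCO report), the figures quoted above can be combined as follows; note that simply multiplying the 0.34 Wh figure by one billion daily requests gives a lower annual total than the 310 GWh cited, so the report's headline number presumably rests on additional assumptions not detailed here.

    # Back-of-the-envelope sketch using only the figures quoted above.
    # The report's own 310 GWh annual total is higher than this simple
    # multiplication, so it presumably includes overheads not detailed here.
    WH_PER_REQUEST = 0.34             # average energy per ChatGPT request (Altman's figure)
    REQUESTS_PER_DAY = 1_000_000_000  # roughly one billion requests per day
    DOUBLING_PERIOD_DAYS = 100        # UNESCO: AI energy demand doubles every 100 days

    daily_mwh = WH_PER_REQUEST * REQUESTS_PER_DAY / 1e6  # Wh per day -> MWh per day
    annual_gwh = daily_mwh * 365 / 1e3                   # MWh per day -> GWh per year
    annual_growth = 2 ** (365 / DOUBLING_PERIOD_DAYS)    # doubling every 100 days

    print(f"Daily consumption:  {daily_mwh:.0f} MWh")
    print(f"Annual consumption: {annual_gwh:.0f} GWh")
    print(f"Implied growth over one year: about {annual_growth:.1f}x")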
“The exponential growth in the computational power needed to run these models is placing growing strain on global energy systems, water resources and critical minerals, raising concerns about environmental sustainability, equitable access and competition over limited resources,” the UNESCO report warned.
However, the report found that electricity use could be cut by almost 90% by shortening the query, or prompt, and by using a smaller AI model, without a drop in performance.
Many AI models such as ChatGPT are general-purpose models designed to respond on a wide variety of topics, which means they must sift through an immense volume of information to formulate and evaluate answers.
Using smaller, specialized models offers substantial reductions in the electricity needed to produce an answer.
So did cutting prompts from 300 to 150 words.
Aware of the energy problem, tech giants now offer miniature versions of their large language models with fewer parameters.
For example, Google sells Gemma, Microsoft has Phi-3 and OpenAI has GPT-4o mini. French companies have done the same; Mistral AI, for instance, has introduced its Ministral model.
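To give a sense of what running one of these smaller models looks like in practice, here is a minimal sketch assuming the Hugging Face transformers library and the publicly released microsoft/Phi-3-mini-4k-instruct checkpoint (roughly 3.8 billion parameters); the prompt and generation settings are purely illustrative, not drawn from the UNESCO report.

    # Minimal sketch: querying a small instruction-tuned model locally via
    # Hugging Face transformers instead of a large general-purpose model.
    # The model name and prompt are illustrative; any small checkpoint would do.
    from transformers import pipeline

    generator = pipeline(
        "text-generation",
        model="microsoft/Phi-3-mini-4k-instruct",  # ~3.8B parameters, far smaller than frontier models
    )

    # A short, focused prompt, in the spirit of the 300-to-150-word reduction above.
    prompt = "Summarize in two sentences why shorter prompts reduce AI energy use."

    result = generator(prompt, max_new_tokens=80, do_sample=False)
    print(result[0]["generated_text"])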