ChatGPT may not be as power-hungry as once assumed

ChatGPT, the OpenAI chatbot platform, may not be as power-hungry as once assumed. But its appetite largely depends on how ChatGPT is being used and on the AI models answering the queries, according to a new study.

A recent analysis by Epoch AI, a nonprofit AI research institute, attempted to calculate how much energy a typical ChatGPT query consumes. A commonly cited statistic is that ChatGPT requires around 3 watt-hours of power to answer a single question, or 10 times as much as a Google search.

Epoch believes that's an overestimate.

Using OpenAI's latest default model for ChatGPT, GPT-4o, as a reference, Epoch found that the average ChatGPT query consumes around 0.3 watt-hours, less than many household appliances.
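To put the 0.3 watt-hours figure in perspective, here is a quick arithmetic sketch. The appliance wattages below are typical illustrative values chosen for this comparison, not numbers from the article:

```python
# Rough comparison of Epoch's 0.3 Wh-per-query estimate against everyday
# energy uses. Appliance figures are typical values, not from the study.

QUERY_WH = 0.3  # Epoch AI's estimate for an average ChatGPT (GPT-4o) query

everyday_wh = {
    "LED bulb for 1 hour (10 W)": 10.0,
    "Microwave for 1 minute (1000 W)": 1000.0 / 60,
    "Laptop for 1 hour (50 W)": 50.0,
}

for name, wh in everyday_wh.items():
    # How many ChatGPT queries each everyday activity is "worth" in energy
    print(f"{name}: ~{wh / QUERY_WH:.0f} queries' worth of energy")
```

Under these assumptions, running a laptop for an hour uses more energy than a hundred queries.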

“The energy use is really not a big deal compared to using normal appliances, heating or cooling your home, or driving a car,” Joshua You, the Epoch data analyst who conducted the analysis, told TechCrunch.

AI's energy usage, and its environmental impact broadly speaking, is the subject of contentious debate as AI companies look to rapidly expand their infrastructure footprints. Just last week, a group of more than 100 organizations published an open letter calling on the AI industry and regulators to ensure that new AI data centers don't deplete natural resources or force utilities to rely on nonrenewable energy sources.

You told TechCrunch that his analysis was spurred by what he characterized as outdated prior research. He pointed out, for example, that the author of the report that arrived at the 3 watt-hours estimate assumed OpenAI used older, less efficient chips to run its models.

Epoch AI's ChatGPT energy consumption estimate
Image credits: Epoch AI

“I've seen a lot of public discourse that correctly recognized that AI was going to consume a lot of energy in the coming years, but didn't really describe the energy that was going to AI today,” You said. “Also, some of my colleagues noticed that the most widely reported estimate of 3 watt-hours per query was based on fairly old research, and based on some napkin math seemed to be too high.”

Granted, Epoch's 0.3 watt-hours figure is an approximation as well; OpenAI hasn't published the details needed to make a precise calculation.
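For intuition, this is the kind of napkin math such a per-query estimate involves. Every parameter in this sketch is an illustrative assumption, not a figure published by Epoch AI or OpenAI:

```python
# Napkin-math sketch of deriving an energy-per-query figure.
# All parameters are illustrative assumptions, not published data.

gpu_power_w = 700         # assumed draw of one datacenter-class GPU, in watts
gpus_serving = 8          # assumed GPUs serving one model replica
utilization = 0.1         # assumed fraction of the replica devoted to one query
seconds_per_query = 2.0   # assumed wall-clock time to generate a response
overhead = 1.5            # assumed multiplier for cooling and power delivery

# Energy in joules, then converted to watt-hours (1 Wh = 3600 J)
joules = gpu_power_w * gpus_serving * utilization * seconds_per_query * overhead
watt_hours = joules / 3600
print(f"~{watt_hours:.2f} Wh per query under these assumptions")
```

Changing any one assumption (chip efficiency, response length, utilization) moves the result by a large factor, which is why estimates based on older hardware can come out an order of magnitude higher.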

The analysis also doesn't account for the additional energy costs incurred by ChatGPT features such as image generation or input processing. You acknowledged that “long input” ChatGPT queries (queries with long files attached, for example) likely consume considerably more electricity than a typical question.

You said he does expect ChatGPT's baseline energy consumption to rise, however.

“[The] AI will get more advanced, and this AI will likely require more energy, and this future AI may be used more intensively, handling many more tasks, and more complex tasks, than how people use ChatGPT today,” You said.

While there have been remarkable breakthroughs in AI efficiency in recent months, the scale at which AI is being deployed is expected to drive an enormous expansion of power-hungry infrastructure. In the next two years, AI data centers may need close to all of California's 2022 power capacity (68 GW), according to a Rand report. By 2030, training a frontier model could require power output equivalent to that of eight nuclear reactors (8 GW), the report predicted.
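A rough scaling sketch shows why aggregate demand matters even when a single query is cheap. The daily query volume below is an assumed round number for illustration, not a figure from the report:

```python
# Scale sketch: what 0.3 Wh per query implies at large query volumes.
# The 1-billion-queries-per-day figure is an assumption for illustration.

WH_PER_QUERY = 0.3
QUERIES_PER_DAY = 1e9  # assumed daily query volume, chosen for illustration

daily_kwh = WH_PER_QUERY * QUERIES_PER_DAY / 1000  # Wh -> kWh
annual_gwh = daily_kwh * 365 / 1e6                 # kWh/day -> GWh/year
print(f"~{daily_kwh:,.0f} kWh/day, ~{annual_gwh:.0f} GWh/year")
```

As a sanity check on the Rand figures, 8 GW spread across eight reactors is about 1 GW per reactor, in line with the output of a typical large nuclear unit.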

ChatGPT alone reaches an enormous, and expanding, number of people, making its server demands similarly massive. OpenAI, along with several investment partners, plans to spend billions of dollars on new AI data center projects over the next few years.

OpenAI's attention, along with the rest of the AI industry's, is also shifting to reasoning models, which are generally more capable in terms of the tasks they can accomplish but require more computing. Unlike models such as GPT-4o, which respond to queries almost instantaneously, reasoning models “think” for seconds to minutes before answering, a process that consumes more computing, and thus more power.

“Reasoning models will increasingly take on tasks that older models can't, and generate more [data] to do so, and both require more data centers,” You said.

OpenAI has begun releasing more power-efficient reasoning models such as o3-mini. But it seems unlikely, at least at this juncture, that the efficiency gains will offset the increased power demands of reasoning models' “thinking” process and growing AI use around the world.

You suggested that people worried about their AI energy footprint use apps such as ChatGPT sparingly, or select models that minimize the computing required, to the extent that's realistic.

“You could try using smaller AI models like [OpenAI's] GPT-4o mini,” You said, “and sparingly use them in a way that requires processing or generating a lot of data.”
