Insatiable Intelligence: How Much Electricity Do Neural Networks Consume

Сергей Мацера Exclusive

The development of ever larger and more complex artificial intelligence models has run up against their enormous demand for electricity, which threatens the ability to supply them with the necessary resources, media report.

The Gluttony of AI

As neural networks grow in size, so does their "intelligence." During training they process colossal amounts of data, adjusting billions or even trillions of internal parameters. The more parameters a model has, the more easily it identifies patterns. For instance, GPT-1, released in 2018, used 117 million parameters, GPT-2 had about 1.5 billion, and GPT-4 reportedly has over 1.7 trillion. As models have grown, however, their electricity consumption has risen sharply: training GPT-2 cost OpenAI $50,000, while training GPT-4 cost more than $100 million.

Leading AI companies are therefore investing in energy infrastructure. Google and Amazon Web Services, for example, are backing projects to build nuclear reactors dedicated to powering their servers, while Microsoft has announced plans for a Helion fusion power plant in Washington State to supply its computing capacity.

Keeping data centers running requires significant funds, including for powering and cooling vast numbers of graphics processors, and both energy and financial costs rise every year. According to experts, training GPT-4 consumed 51–62 GWh of electricity, roughly what the city of San Francisco uses in several days.

According to the International Energy Agency (IEA), computing and data storage currently account for up to 1.5% of total global electricity consumption. In the U.S., the figure has reached 4.4% and is expected to grow to 12% by 2028. New large data centers could worsen the situation: a data center planned in Wyoming is expected to consume up to 87.6 TWh per year, five times the total consumption of all the state's residents and double its current generating capacity.
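A quick back-of-the-envelope check puts the Wyoming figure in perspective: 87.6 TWh spread evenly over a year works out to a constant 10 GW draw (the conversion below uses only the annual figure cited above; the reactor comparison is an illustrative rule of thumb, assuming roughly 1 GW per large reactor).

```python
# Back-of-the-envelope check: what continuous power draw does
# 87.6 TWh per year correspond to?
HOURS_PER_YEAR = 365 * 24                 # 8760 h in a non-leap year

annual_twh = 87.6                         # figure cited for the planned site
avg_power_gw = annual_twh * 1e12 / HOURS_PER_YEAR / 1e9  # TWh -> Wh -> W -> GW

print(f"Average continuous load: {avg_power_gw:.1f} GW")
# -> 10.0 GW, i.e. on the order of ten large nuclear reactors running nonstop
```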

“The energy appetite of AI exceeds the pace of energy system development, creating a critical 'bottleneck,'” specialists note.

Costs of Queries

Energy is consumed not only for training neural networks but also for their use, that is, for processing queries. Although each individual task requires a negligible amount of resources, with millions of queries, the figures become quite significant.

Researchers at Carnegie Mellon University found that a typical text query requires only 0.47 Wh; at about 1.2 billion such queries a month, however, GPT-4 consumes over 550 MWh. Image generation is even more costly: producing a single image takes roughly as much energy as charging a smartphone. Precise figures for the electricity consumption of various language models remain unavailable, though, as developers are reluctant to share this information.
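The monthly figure follows directly from the two cited estimates; a minimal sketch of the arithmetic, using only the per-query energy and query volume given above:

```python
# Scaling the Carnegie Mellon per-query estimate to monthly volume
wh_per_query = 0.47                 # Wh per typical text query (cited estimate)
queries_per_month = 1.2e9           # cited monthly query volume for GPT-4

monthly_mwh = wh_per_query * queries_per_month / 1e6   # Wh -> MWh
print(f"Monthly consumption: {monthly_mwh:.0f} MWh")
# -> 564 MWh, consistent with the "over 550 MWh" figure in the text
```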

As of today, there are no official statistics on electricity consumption by AI systems in Russia. Experts at Gbig Holdings, however, predict that in 2025 it will amount to 8–12 TWh, less than 1% of total generation. In the most likely scenario, neural networks will consume 30–50 TWh by 2030, and with active adoption of AI in industry and government, up to 80–100 TWh. These figures are not yet critical for the country's energy system and do not require significant investment in its expansion.

In the summer of 2025, OpenAI CEO Sam Altman said that a single GPT-4 query requires 0.34 Wh, while Google put each request to its Gemini model at 0.24 Wh. Although these values are small, users have already begun to face limits on neural network use or fees for access. On the other hand, the situation pushes developers to weigh another important factor: the "energy cost" of models. The Chinese company DeepSeek, for example, has become a notable player by offering capabilities similar to those of large neural networks at 2–2.5 times lower consumption.
