Europe faces a dilemma as AI growth drives demand for data centers and conflicts with environmental objectives.
- Rising demand for AI could hinder Europe's efforts to cut carbon emissions, as the energy-intensive chips made by companies like Nvidia are expected to push up the energy consumption of already power-hungry data centers.
- The extreme computing power packed into high-powered AI chips generates more heat, so data centers need colder water to cool them reliably.
- Michael Winterson, chair of the European Data Centre Association (EUDCA), warned that lowering water temperatures would push data centers back toward an unsustainable situation like the one the industry faced 25 years ago.
The rise of artificial intelligence is testing Europe's environmentally conscious data center operations, with developers under pressure to lower water temperatures in their energy-intensive facilities to accommodate the higher-powered chips of companies like tech giant Nvidia.
AI is expected to drive a 160% increase in data center power demand by 2030, according to Goldman Sachs research. That growth could set back Europe's efforts to reduce carbon emissions, as the energy-intensive chips required for AI drive up data centers' energy consumption.
High-powered chips known as graphics processing units (GPUs) are crucial for training and deploying large language models, a type of AI. These GPUs are deployed at high density, delivering more computing power and generating more heat, which calls for colder water to cool them reliably.
With the deployment of Nvidia's Blackwell GB200 chip, a single square meter of data center space consumes, and dissipates as heat, as much power as 15 to 25 houses, according to Andrey Korolenko, chief product and infrastructure officer at Nebius.
The hardware is extremely dense and requires different cooling solutions, he said.
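For a rough sense of scale, the sketch below converts the "15 to 25 houses per square meter" comparison into an implied power density. The average-household figure is an illustrative assumption, not a number from the article.

```python
# Back-of-the-envelope check of the "15 to 25 houses per square meter" comparison.
# Assumption (not from the article): an average European household draws roughly
# 0.3-0.5 kW on a continuous basis (about 2,600-4,400 kWh per year).
house_kw_low, house_kw_high = 0.3, 0.5
houses_low, houses_high = 15, 25

low_density = houses_low * house_kw_low      # ~4.5 kW per square meter
high_density = houses_high * house_kw_high   # ~12.5 kW per square meter

print(f"Implied density: {low_density:.1f}-{high_density:.1f} kW per square meter")
```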
Michael Winterson, chair of the European Data Centre Association (EUDCA), warned that lowering water temperatures would push data centers back toward an unsustainable situation like the one the industry faced 25 years ago.
Winterson told CNBC that the problem with chipmakers is that AI has become a space race driven by the American market, where land rights, energy access and sustainability count for little and market domination is the key priority.
Herbert Radlinger, managing director of NDC-GARBE, said U.S. chip designers are approaching major equipment suppliers in Europe to lower their water temperatures in order to cope with the increasing heat generated by AI chips.
He said the news was surprising, because the engineering team had initially expected liquid cooling to be used to achieve higher water temperatures, whereas air cooling had proved more efficient.
'Evolution discussion'
The European Commission aims to cut the bloc's energy consumption by 11.7% by 2030, yet data centers' energy use is forecast to rise 28% over the same period, and the advent of AI is expected to push that figure significantly higher in some countries.
Winterson said lowering water temperatures is incompatible with the EU's Energy Efficiency Directive, which requires data centers above a certain size to publicly report their power consumption. The EUDCA has been lobbying in Brussels on sustainability concerns.
Schneider Electric frequently works with the EU on energy management, and recent discussions have focused on sourcing "prime power" for AI data centers and on closer cooperation with utilities, according to Steven Carlini, vice president and chief advocate of AI and data centers at Schneider Electric.
Nvidia has also held discussions with European Commission energy officials about data centers' energy consumption and usage, as well as power usage effectiveness and chipsets.
CNBC has approached Nvidia and the Commission for comment.
"According to Carlini, cooling is the second-largest consumer of energy in data centers, after the IT load. Although the energy use will increase, the PUE (Power Usage Effectiveness) may not increase with lower water temperatures, despite the chillers having to work harder."
Customers of Schneider Electric deploying Nvidia's Blackwell GB200 super chip are requesting water temperatures between 68 and 75 degrees Fahrenheit or 20-24 degrees Celsius, according to Carlini.
That compares with typical liquid-cooling water temperatures of around 32 degrees Celsius, or the roughly 30 degrees Celsius Meta has suggested for its hardware, he said.
Ferhan Gunen, Equinix's vice president of data center operations for the U.K., told CNBC that the company has been discussing a number of AI-related concerns with its customers.
"She stated that they desire to increase server density by using higher-power-consuming chips or adding more servers, but the shift is not straightforward."
Gunen said the discussion is more about evolution than anything else.
Nvidia declined to comment on the cooling requirements of its chips. When it unveiled the Blackwell platform, however, the company said the architecture would allow organizations to run real-time generative AI on large language models at up to 25 times lower cost and energy consumption than earlier technology.
Gunen said liquid cooling will require a "reconfiguration," though new data centers are already being built with the technology. Higher density means higher power usage and more cooling, she explained, but the technology and the approach to cooling are evolving, so there is a balance to be struck in all of this.
Race for efficiency
Nebius, formed after a split from Russia's Yandex, has said it will be among the first to bring Nvidia's Blackwell platform to customers in 2025, and it plans to invest more than $1 billion in AI infrastructure in Europe by the middle of next year.
Nebius' Korolenko said liquid cooling is a "first step": the initial cost of ownership will be higher, but it will improve over time.
There is strong demand to deliver, Korolenko said, but at scale it is crucial to have flexibility of choice while staying cost-effective, and power efficiency is essential to managing running costs. That is always a top priority, he added.
Even before the surge in AI applications, the European data center industry was struggling to keep up with demand from the growing digital sector.
Sicco Boomsma, managing director of ING's TMT team, said players in the market are highly attuned to power availability. While Europe's emphasis is on infrastructure development, the U.S. has prioritized expanding assets in regions with abundant power sources.
"A significant number of data center operators from the U.S. are aligning to meet the EU's goals, including carbon neutrality, efficiency, water utilization, and biodiversity conservation."
It is a race, he said, to showcase how their knowledge translates into highly efficient infrastructure.