The Complex Challenge of Powering AI Technologies

Artificial intelligence is becoming important in business and financial transactions, healthcare, technology development, research, and more. Although most consumers don’t realize it, they rely on AI whenever they stream video, bank online, or search the web. Behind these functions are more than 10,000 data centers around the world, each a giant warehouse containing thousands of computer servers and other infrastructure to store, manage, and process data. There are currently more than 5,000 data centers in the United States, and new ones are being built every day, both in the U.S. and around the world. Dozens of centers are clustered close to where people live, often attracted by policies offering tax breaks and other incentives, or by a perceived abundance of electricity.

And data centers consume huge amounts of electricity. According to the Electric Power Research Institute, U.S. data centers consumed more than 4% of the nation’s total electricity in 2023, and that share could rise to 9% by 2030. A single large data center can consume as much electricity as 50,000 homes.

The sudden demand for so many data centers poses a major challenge for the technology and energy industries, government policymakers, and consumers. Researchers and faculty at the MIT Energy Initiative (MITEI) are investigating many aspects of the problem, from power sources to grid improvements to analytical tools to increase efficiency. Data centers are quickly becoming the energy problem of our time.

Unanticipated needs bring unexpected solutions

Some companies that use data centers to provide cloud computing and data management services have announced surprising steps to secure all that power. Proposals include building their own small nuclear plants near data centers or even restarting one of the intact reactors at Three Mile Island, which has been closed since 2019. (Another reactor at the plant had a partial meltdown in 1979, causing the United States’ worst nuclear accident.) The need to power AI has already delayed the planned closure of several coal-fired power plants and raised prices for residential consumers. Meeting the demands of data centers threatens not only to strain the power grid but also to slow the transition to the clean energy needed to fight climate change.

From an energy perspective, there are many different aspects to the data center problem. Here are some that MIT researchers are focusing on and why they matter.

Unprecedented surge in electricity demand

“Historically, computing hasn’t consumed much electricity,” said William H. Green, director of MITEI and the Hoyt C. Hottel Professor of Chemical Engineering at MIT. “Electricity is used to run industrial processes, to power appliances like air conditioners and lights, and more recently to power heat pumps and charge electric vehicles. But now, suddenly, the power used by computing in general, and data centers in particular, is becoming a huge new demand that no one expected.”

Why the lack of foresight? Historically, demand for electricity has grown by about 0.5% each year, and utilities install new generators and make other investments as needed to meet that expected growth. But the data centers now coming online are creating an unprecedented surge in demand that operators didn’t anticipate, and new demand keeps emerging. Compounding the challenge, data centers must provide service 24 hours a day, every day: there can be no interruption in processing large datasets, accessing stored data, or running the cooling equipment that keeps all those computers from overheating.
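
To see how far outside historical experience that surge is, here is a rough back-of-envelope sketch in Python based on the EPRI figures cited above. It assumes total U.S. electricity demand keeps growing at the historical 0.5% per year; the numbers are illustrative, not a forecast.

```python
# Back-of-envelope: what growth rate do the EPRI figures imply for data centers?
# Assumptions (illustrative only): total U.S. demand grows 0.5% per year,
# and data centers go from 4% of the total in 2023 to 9% by 2030.

total_growth = 0.005                      # historical ~0.5% annual growth in total demand
share_2023, share_2030 = 0.04, 0.09       # data centers' share of total demand
years = 2030 - 2023

total_2030 = (1 + total_growth) ** years  # total demand relative to 2023 (~1.04x)
dc_2023 = share_2023 * 1.0
dc_2030 = share_2030 * total_2030

implied_cagr = (dc_2030 / dc_2023) ** (1 / years) - 1
print(f"Data-center demand multiple, 2023-2030: {dc_2030 / dc_2023:.1f}x")  # ~2.3x
print(f"Implied annual growth rate: {implied_cagr:.0%}")                    # ~13% per year
```

Under these assumptions, data-center demand would have to grow more than 20 times faster than overall electricity demand has grown historically, which is why utilities’ usual planning cycles are struggling to keep up.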

And even if enough power is generated, getting it to where it’s needed can be a problem, explains Deepjyoti Deka, a research scientist at MITEI. “The grid is a network-wide operation, and while the grid operator may have enough power in another place, or even in another part of the country, the wires may not have the capacity to get the power to where it’s needed.” So transmission capacity needs to be expanded, which Deka says is a slow process.

Then there is the “interconnection queue.” In some cases, adding a new user (a “load”) or a new generator to an existing grid can cause instability and other problems for everyone already on the grid, so connecting new data centers to the network can be delayed. When that happens, the new load or generator is placed in a queue and must wait its turn. Currently, the majority of the interconnection queue is made up of new solar and wind projects, and the typical wait is about five years. The challenge is to meet the demand from newly built data centers while ensuring that the quality of service elsewhere on the grid is not compromised.

In search of clean electricity

Further complicating the challenge, many companies, including so-called “hyperscalers” like Google, Microsoft, and Amazon, have committed to reaching net-zero carbon emissions within the next decade. Many have made great strides toward their clean-energy targets by signing power purchase agreements: they contract to buy electricity from a solar or wind farm, for example, and sometimes even finance construction of the facility. But this approach to clean energy has its limits when faced with the massive power demands of data centers.

Meanwhile, many states are delaying the closure of coal-fired power plants because of the surge in electricity consumption. There is not enough renewable energy to serve both the hyperscalers and existing users, including individual consumers. As a result, traditional power plants that burn fossil fuels such as coal are needed more than ever.

When hyperscale companies look for clean energy sources for their data centers, building their own wind or solar farms is one option. But such farms generate electricity only intermittently, and the need for uninterrupted power would force data centers to maintain costly energy storage. They could instead turn to natural gas or diesel generators for backup power, but keeping that backup clean requires pairing the generators with equipment to capture their carbon dioxide emissions, plus a nearby site where the captured carbon dioxide can be permanently stored.
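
To get a feel for why that storage is so costly, here is a rough illustrative calculation. Every number in it is an assumption chosen for the example (a 100 MW facility, 12 hours of backup, and roughly $400 per kWh of installed battery capacity), not a figure from any particular project.

```python
# Rough illustration: cost of battery storage to ride through a windless night.
# All numbers below are assumptions for the sake of the example.

facility_load_mw = 100          # assumed data-center load
backup_hours = 12               # assumed gap in wind/solar output to cover
battery_cost_per_kwh = 400      # assumed installed cost of utility-scale batteries, USD

energy_needed_mwh = facility_load_mw * backup_hours                   # 1,200 MWh
capital_cost = energy_needed_mwh * 1_000 * battery_cost_per_kwh       # convert MWh to kWh

print(f"Storage needed: {energy_needed_mwh:,.0f} MWh")
print(f"Ballpark capital cost: ${capital_cost / 1e6:,.0f} million")   # ~$480 million
```

Even with these round numbers, the batteries alone approach half a billion dollars before accounting for replacement over the facility’s life, which helps explain why operators keep looking for firm sources of power instead.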

These complexities have led some hyperscale companies to turn to nuclear power, Green said. Nuclear power plants can generate large amounts of electricity reliably without interruption, making them a good fit for the needs of data centers.

In September, Microsoft signed a 20-year power purchase agreement under which Constellation Energy will restart one of the undamaged reactors at the shuttered Three Mile Island nuclear plant, the site of the highly publicized 1979 accident. If regulators approve, Constellation plans to bring the reactor back online in 2028, and Microsoft will buy all the electricity it produces. Amazon has also signed an agreement to buy electricity from another nuclear plant that was in danger of closing for financial reasons. And in early December, Meta released a request for proposals to identify nuclear energy developers that could help it meet its AI needs and sustainability goals.

Other nuclear news has focused on small modular reactors (SMRs), compact power plants built in factories that can be installed close to data centers, potentially avoiding the cost overruns and delays that often plague large-scale plants. Google recently ordered a fleet of SMRs to generate the energy needed for its data centers, with the first to come online in 2030 and the rest by 2035.

Some hyperscale companies are betting on new technologies: Google, for example, is moving forward with a next-generation geothermal project, and Microsoft has signed a deal to buy power from a startup’s fusion power plant starting in 2028, even though the technology has yet to be proven.

Reducing electricity demand

Other approaches focus on the demand side: making data centers and the operations within them more energy efficient, so that the same computing tasks are performed using less power. Faster computer chips and algorithms optimized to consume less energy could reduce both the electrical load and the heat generated.

Another idea being tested is shifting computing tasks to times and places where zero-carbon energy is available on the grid. “If a task doesn’t need to be done right away, but needs to be completed by a certain deadline, can it be postponed or moved to other data centers in the U.S. or overseas where power is more plentiful, cheaper, and cleaner? This approach is called ‘carbon-aware computing,’” Deka explains. He says it’s still unclear whether every task can easily be moved or postponed. “If you think about the tasks that AI generates, can you easily split them into smaller tasks, send them to different parts of the country to run on clean energy, and then put the results back together? How much would that splitting cost?”
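
As a concrete illustration of the scheduling side of this idea, below is a minimal sketch in Python. The `get_carbon_intensity` function is a hypothetical stand-in for whatever grid-carbon data feed an operator actually uses, and the threshold and polling interval are arbitrary; real carbon-aware schedulers also weigh price, transmission, and data-movement costs.

```python
import time
from datetime import datetime, timedelta

# Illustrative settings; both values are assumptions, not recommendations.
CARBON_THRESHOLD = 200.0   # gCO2/kWh below which we treat the grid as "clean enough"
CHECK_INTERVAL = 15 * 60   # seconds between checks of the grid's carbon intensity

def get_carbon_intensity(region: str) -> float:
    """Hypothetical stand-in for a grid carbon-intensity feed (gCO2/kWh)."""
    raise NotImplementedError("plug in a real data source here")

def run_carbon_aware(job, region: str, deadline: datetime):
    """Defer a flexible job until the grid is clean enough or its deadline arrives."""
    while datetime.now() < deadline:
        if get_carbon_intensity(region) <= CARBON_THRESHOLD:
            return job()                              # clean power available: run now
        # Not clean enough yet; wait, but never sleep past the deadline.
        remaining = (deadline - datetime.now()).total_seconds()
        time.sleep(min(CHECK_INTERVAL, max(remaining, 0)))
    return job()                                      # deadline reached: run regardless

# Example usage: a batch job that must finish within the next 12 hours.
# run_carbon_aware(lambda: print("running batch job..."),
#                  region="US-NE",
#                  deadline=datetime.now() + timedelta(hours=12))
```

The same logic extends across space as well as time: a scheduler could query several regions and dispatch the job to whichever data center currently has the cleanest power, subject to the splitting and transfer costs Deka describes.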

Of course, this approach runs into the same interconnection-queue issues: it’s hard to access clean energy in another region or state. But efforts are underway to ease the regulatory framework so that these vital connections can be made more quickly and easily.

What about your neighbors?

A major concern with all data center power options is the impact on residential energy consumers. When a data center is located in a residential area, there are aesthetic concerns as well as more practical ones. Will the local power supply be less reliable? Where will the new transmission lines be installed? And who will pay for things like new generators and upgrades to existing equipment? When a new manufacturing facility or industrial plant is built in a residential area, the disadvantage is often offset by the creation of new jobs. But that’s not the case with a data center, which only requires a few dozen employees.

There are standard rules for how maintenance and upgrade costs are shared and allocated. But the arrival of new data centers has changed the landscape. As a result, utilities are having to rethink traditional pricing structures to ensure residents aren’t overly burdened with the infrastructure changes required to host data centers.

MIT’s contribution

At MIT, researchers are brainstorming and exploring different options to solve the problem of providing clean energy to data centers. For example, they are studying architectural designs that use natural ventilation to boost cooling, equipment layouts that improve airflow and power distribution, and energy-efficient air conditioning systems based on new materials. They are creating new analytical tools to assess the impact of data center deployment on the U.S. power system and find the most efficient way to provide clean electricity to the facilities. Other research is looking at how to combine the power of small nuclear reactors with the needs of data centers and how to accelerate the construction of such reactors.

The MIT team is also working on identifying optimal sources of backup power and long-term storage, and on developing a decision-support system for siting proposed new data centers that accounts for power and water availability, regulatory considerations, and the potential to use the large amounts of waste heat they produce, for example to warm nearby buildings. Technology development projects include designing faster, more efficient computer chips and more energy-efficient computing algorithms.

In addition to providing leadership and funding for many research projects, MITEI is also acting as a convener to bring companies and stakeholders together to address this issue: At MITEI’s 2024 annual research conference, representatives from two hyperscalers and two data center design and construction companies came together to discuss challenges, possible solutions, and areas where MIT research can best help.

As data centers are built out and computing continues to create unprecedented demand for electricity, scientists and engineers are racing to provide ideas, innovations, and technologies that can meet this demand while continuing to accelerate the transition to a carbon-free energy system, Green said.
