MIT News is investigating the environmental impact of generative AI in a two-part series. In this article, we look at why the technology is so resource-intensive. In part two, we’ll look at the efforts experts are making to reduce generative AI’s carbon footprint and other impacts.

From improving workforce productivity to advancing scientific research, generative AI’s potential benefits are hard to ignore. The explosive growth of this new technology has enabled the rapid deployment of powerful models across many industries, but the environmental impacts of this generative AI “gold rush” remain difficult to determine, let alone mitigate.

Training generative AI models such as OpenAI’s GPT-4, which often have billions of parameters, can consume vast amounts of electricity, increasing carbon emissions and straining power grids.

Moreover, once the models are developed, deploying them in real-world applications, enabling millions of people to use generative AI in their daily lives, and then fine-tuning them to improve their performance continues to draw significant amounts of energy.

In addition to electricity demands, large amounts of water are required to cool the hardware used to train, deploy, and fine-tune generative AI models, which can strain municipal water supplies and disrupt local ecosystems. The rise of generative AI applications will also drive demand for high-performance computing hardware, bringing indirect environmental impacts from its manufacture and transport.

“When we think about the environmental impact of generative AI, it’s not just the electricity that a computer uses when you plug it in,” says Elsa A. Olivetti, a professor in MIT’s Department of Materials Science and Engineering and lead of the Decarbonization Mission of MIT’s new Climate Project. “There are broader consequences at the system level that arise from the actions we take.”

Olivetti is senior author of a 2024 paper, “The Climate and Sustainability Implications of Generative AI,” co-written with MIT colleagues in response to an Institute-wide call for papers exploring the transformative potential of generative AI, in both positive and negative directions for society.

Data centers in high demand

Data centers are used to train and run the deep learning models behind popular tools such as ChatGPT and DALL-E, so their power demands are one of the main factors contributing to AI’s environmental impact.

A data center is a temperature-controlled building that houses computing infrastructure such as servers, data storage drives, and network equipment. For example, Amazon owns more than 100 data centers worldwide, each containing about 50,000 servers that support its cloud computing services.

Although data centers have existed since the 1940s (the first was built at the University of Pennsylvania in 1945 to support ENIAC, the first general-purpose digital computer), the development of generative AI has dramatically accelerated the pace of data center construction.

“What makes generative AI different is the power density it requires. Fundamentally, it is just computing, but a generative AI training cluster might consume seven or eight times more energy than a typical computing workload,” says Noman Bashir, lead author of the impact paper, who is the Computing and Climate Impact Fellow at the MIT Climate and Sustainability Consortium (MCSC) and a postdoc at the Computer Science and Artificial Intelligence Laboratory (CSAIL).

In North America, data center power requirements grew from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, driven in part by the demands of generative AI. Globally, data centers consumed an estimated 460 terawatt-hours of electricity in 2022. That would make data centers the 11th-largest electricity consumer in the world, between Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organisation for Economic Co-operation and Development (OECD).

By 2026, data center electricity consumption is expected to reach 1,050 terawatt-hours, which would move data centers up to fifth place on the global list, between Japan and Russia.
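
As a rough check on the scale these figures imply, the two global numbers above work out to compound growth of roughly 23 percent per year. Here is a minimal sketch in Python; the only inputs are the figures quoted above, and the formula is the standard compound-rate calculation:

```python
# Implied growth in global data center electricity consumption,
# using the figures reported above (in terawatt-hours, TWh).
consumption_2022 = 460.0    # TWh consumed in 2022
consumption_2026 = 1050.0   # TWh projected for 2026

# Standard compound annual growth rate: (end / start) ** (1 / years) - 1
years = 2026 - 2022
cagr = (consumption_2026 / consumption_2022) ** (1 / years) - 1
print(f"Implied compound annual growth: {cagr:.1%}")  # ~22.9% per year
```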

While not all data center computing incorporates generative AI, the technology remains a major driver of growing energy demand.

“The demand for new data centers cannot be met in a sustainable way,” Bashir says. “Given the pace at which companies are building new data centers, the majority of the electricity to power them will have to come from fossil fuel-fired power plants.”

The amount of electricity required to train and deploy a model such as OpenAI’s GPT-3 is difficult to pin down. In a 2021 research paper, scientists from Google and the University of California at Berkeley estimated that the training process alone consumed 1,287 megawatt-hours of electricity (enough to power about 120 average U.S. homes for a year) and generated about 552 tons of carbon dioxide.
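
That household comparison is easy to verify. Assuming an average U.S. home uses roughly 10,700 kilowatt-hours per year (an assumed figure; the article does not state it), the arithmetic comes out to about 120 homes:

```python
# Sanity check on the household comparison quoted above.
# The per-home figure is an assumption (roughly the average annual
# U.S. household consumption); it does not appear in the article.
training_mwh = 1287            # estimated energy to train GPT-3 (MWh)
home_kwh_per_year = 10_700     # assumed average U.S. home usage (kWh/year)

homes = (training_mwh * 1_000) / home_kwh_per_year  # MWh -> kWh, then divide
print(f"Enough to power about {homes:.0f} homes for a year")  # ~120
```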

Bashir explains that all machine learning models need to be trained, but a particular problem with generative AI is that energy usage fluctuates dramatically during different stages of the training process.

Grid operators need to have a way to absorb these fluctuations to protect the grid, and they often use diesel generators to do this.

The growing impact of inference

Once a generative AI model is trained, its energy demands don’t disappear.

Each time the model is used, such as when an individual asks ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers have estimated that a ChatGPT query consumes about five times more electricity than a simple web search.
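
To see what that multiplier means in aggregate, consider a back-of-envelope sketch. The 0.3 watt-hour baseline per web search and the query volume below are illustrative assumptions, not figures from the article:

```python
# Back-of-envelope scaling of the "five times a web search" estimate.
# The 0.3 Wh baseline per web search is an assumed ballpark figure,
# not a number from the article.
search_wh = 0.3                    # assumed energy per simple web search (Wh)
chatgpt_wh = 5 * search_wh         # implied energy per ChatGPT query (Wh)

queries_per_day = 100_000_000      # hypothetical daily query volume
daily_mwh = queries_per_day * chatgpt_wh / 1_000_000  # Wh -> MWh
print(f"{chatgpt_wh:.1f} Wh per query -> ~{daily_mwh:.0f} MWh per day")
```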

“But everyday users don’t think much about it,” Bashir says. “The ease of use of generative AI interfaces, and the lack of information about the environmental impact of their actions, means that users have little incentive to cut back on their use of generative AI.”

In traditional AI, energy usage is split evenly between data processing, model training, and inference (the process of using a trained model to make predictions on new data). But Bashir expects that as these models become more prevalent in many applications, the power demands of generative AI inference will eventually dominate, and that as future versions of models become larger and more complex, the power required for inference will increase.
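
A toy model makes that crossover concrete. With a fixed one-time training cost and a steady per-query inference cost (both illustrative assumptions, reusing the GPT-3 training estimate above), cumulative inference energy overtakes training energy within a few months:

```python
# Toy model: when does cumulative inference energy pass the one-time
# training energy? Every input here is an illustrative assumption.
training_mwh = 1287.0           # one-time training energy (GPT-3 estimate above)
query_wh = 1.5                  # assumed energy per query (see earlier sketch)
queries_per_day = 10_000_000    # hypothetical daily usage

inference_mwh_per_day = queries_per_day * query_wh / 1_000_000
crossover_days = training_mwh / inference_mwh_per_day
print(f"Inference passes training energy after ~{crossover_days:.0f} days")
```

Under these assumptions, inference catches up with training in under three months; larger query volumes or heavier models shorten that window further.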

Moreover, generative AI models have an especially short shelf life, driven by rising demand for new AI applications. Companies release new models every few weeks, so the energy spent training prior versions goes to waste, Bashir adds. New models often have more parameters than their predecessors, so they tend to consume more energy to train.

While the electricity demands of data centers may receive the most attention in the research literature, the water consumption of these facilities also has an environmental impact.

Chilled water is used to cool data centers by absorbing heat from computing equipment. Bashir estimates that, for each kilowatt-hour of energy a data center consumes, it needs two liters of water for cooling.

“Just because it’s called ‘cloud computing’ doesn’t mean the hardware is in the cloud. Data centers exist in our physical world and have direct and indirect impacts on biodiversity through their water use,” he says.
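
Taking the two-liters-per-kilowatt-hour estimate at face value, a short calculation shows the scale for a single large facility; the 100-megawatt size below is a hypothetical assumption:

```python
# Cooling-water demand implied by the two-liters-per-kWh estimate above.
# The 100-megawatt facility size is a hypothetical assumption.
liters_per_kwh = 2.0     # Bashir's estimate quoted above
facility_mw = 100        # assumed continuous power draw of one data center

kwh_per_day = facility_mw * 1_000 * 24         # MW -> kW, times 24 hours
liters_per_day = kwh_per_day * liters_per_kwh
print(f"~{liters_per_day / 1e6:.1f} million liters of cooling water per day")
```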

The computing hardware inside data centers brings its own, less direct environmental impacts.

It is difficult to estimate how much electricity is needed to manufacture a GPU, the powerful type of processor that handles intensive generative AI workloads, but it is more than what is needed to produce a simpler CPU, because the fabrication process is more complex. A GPU’s carbon footprint is compounded by the emissions associated with transporting materials and products.

Environmental impacts also occur during the mining of the raw materials used to manufacture GPUs, including polluting mining processes and the use of toxic chemicals for processing.

Market research firm TechInsights estimates that the three major producers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022. That number is expected to have grown at an even greater rate in 2024.

Bashir says the industry is on an unsustainable path, but there are ways to foster responsible AI development that supports environmental goals.

That will require a comprehensive consideration of all the environmental and societal costs of generative AI, as well as a detailed assessment of the value of its perceived benefits, he and his MIT colleagues argue.

“We need a more contextual way of systematically and comprehensively understanding the implications of new developments in this space,” Olivetti says. “Because improvements have occurred so rapidly, we haven’t had a chance to catch up with our abilities to measure and understand the tradeoffs.”
