How the rise in AI impacts data centers and the environment

Since OpenAI launched ChatGPT in late 2022, an AI boom has swept across the tech industry, sharply increasing data center electricity consumption and demand.

Generative AI (GenAI) chatbots, like ChatGPT, use natural language processing technology to interpret prompts conversationally, which greatly lowers the user adoption barrier. This led to ChatGPT becoming an instant viral sensation, and in only two months, the software gained more than 100 million users.

This rapid growth is often credited as the acceleration point for public-facing AI projects. Since ChatGPT’s launch, other tech giants, including Google, Microsoft and Meta, have launched their own large language model chatbots, garnering even more users on a global scale. The electricity consumption of these technologies is extremely high, raising concerns about the environmental impact of AI and the overall energy use in data centers.

Take a deeper look at how the AI boom affects the environment, including how AI consumes energy, the real-world effects of that consumption and potential ways data centers can balance AI workloads while mitigating climate impact.

How AI uses so much power within the data center

The International Energy Agency (IEA) found that data centers and data transmission networks each account for 1% to 1.5% of global electricity consumption, and together they are responsible for around 1% of energy-related greenhouse gas emissions. This energy demand strains electricity grids in many regions, and the resulting emissions harm the environment in various ways.

According to a report published in May 2024 by the Electric Power Research Institute (EPRI), electricity consumption by large data centers more than doubled between 2017 and 2021 — before the AI boom. Much of this growth was driven by commercially available digital services, like video streaming and communications applications. Now, the proliferation of AI is further fueling data center load growth.

AI workloads are more energy-intensive than other digital technologies. For example, the EPRI report estimated that a traditional Google query uses only about 0.3 watt-hours, while a ChatGPT request consumes around 2.9 watt-hours, nearly 10 times as much electricity. GenAI models that create images, audio and video consume even more electricity per request.
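
To see what that per-request gap means at scale, consider the rough Python sketch below. The per-query figures are the EPRI estimates cited above; the daily request volume is a hypothetical assumption added for illustration.

    # Compare per-query energy use, based on the EPRI estimates cited above.
    # The daily request volume is a hypothetical assumption for illustration.
    TRADITIONAL_SEARCH_WH = 0.3   # watt-hours per traditional search (EPRI estimate)
    CHATGPT_REQUEST_WH = 2.9      # watt-hours per ChatGPT request (EPRI estimate)
    DAILY_REQUESTS = 100_000_000  # assumed daily request volume (hypothetical)

    ratio = CHATGPT_REQUEST_WH / TRADITIONAL_SEARCH_WH
    daily_mwh = CHATGPT_REQUEST_WH * DAILY_REQUESTS / 1_000_000  # Wh -> MWh

    print(f"A ChatGPT request uses ~{ratio:.1f}x the energy of a traditional search")
    print(f"At {DAILY_REQUESTS:,} requests per day, that is ~{daily_mwh:,.0f} MWh per day")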

AI must process vast volumes of data and conduct complex computational workloads, which is why it consumes so much more electricity than other digital technologies.

According to EPRI estimates, AI workloads use 10% to 20% of data center electricity. These statistics raise concerns as AI rapidly grows and expands in business and commercial sectors. EPRI industry analysts developed future scenarios for data center load growth with this in mind. They projected that data centers may consume 4.6% to 9.1% of U.S. electricity generation annually by 2030 versus an estimated 4% as of 2024.
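
Translating those percentages into rough absolute figures helps put them in context. The sketch below assumes annual U.S. electricity generation of about 4,200 terawatt-hours, which is a rough approximation and not a figure from the EPRI report.

    # Convert EPRI's projected shares into rough absolute consumption figures.
    # Assumes ~4,200 TWh of annual U.S. generation (an approximation, not an
    # EPRI figure) and that total generation stays flat through 2030.
    US_GENERATION_TWH = 4200

    scenarios = [("2024 estimate", 0.04),
                 ("2030 low scenario", 0.046),
                 ("2030 high scenario", 0.091)]
    for label, share in scenarios:
        print(f"{label}: {share:.1%} of generation is roughly {US_GENERATION_TWH * share:,.0f} TWh")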

To put this into perspective, developers are currently building new data centers with capacities reaching up to 1,000 megawatts, enough to power 800,000 homes, according to the EPRI report. EPRI identified three main factors that contribute to the high energy consumption of AI workloads:

  1. Model development. An AI model must be designed and refined before training. This process accounts for approximately 10% of its energy footprint, according to EPRI.
  2. Model training. An AI algorithm must process large amounts of data to train a model. This process requires “substantial computation efforts and high energy expenditure for extended periods,” using about 30% of the energy footprint, according to EPRI.
  3. Utilization. Deploying and using a fully developed and trained AI model in real-world applications requires intensive computation, which, according to EPRI, accounts for around 60% of the energy footprint. The short sketch after this list illustrates how these shares break down.
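
To make those shares concrete, here is a minimal Python sketch that splits a hypothetical total lifecycle energy figure across the three phases. The 1,000 MWh total is an assumption chosen purely for illustration; it is not an EPRI number.

    # Split a model's lifecycle energy across the three phases EPRI identifies.
    # The 1,000 MWh total is a hypothetical figure, not an EPRI estimate.
    TOTAL_LIFECYCLE_MWH = 1000

    phases = {"model development": 0.10, "model training": 0.30, "utilization": 0.60}
    for phase, share in phases.items():
        print(f"{phase}: {share:.0%} of footprint -> {TOTAL_LIFECYCLE_MWH * share:.0f} MWh")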

As these technologies mature and proliferate, they'll likely grow more complex and demand more energy.

Examples of AI’s impact on the environment

The IEA created the Net Zero Emissions by 2050 Scenario, which details a pathway for a global transition to clean energy intended to limit global warming to 1.5 degrees Celsius.

The Intergovernmental Panel on Climate Change’s “Sixth Assessment Report” outlines the risks of exceeding that threshold. According to the IPCC, these risks include more frequent extreme weather events, the loss of entire ecosystems, exceptional heatwaves and more intense tropical cyclones. More severe weather will also bring extreme droughts, greater flood hazards, and reduced water and resource availability.

Increase in carbon emissions

A study by researchers at the University of Massachusetts Amherst estimated that training a single large AI model could produce more than 626,000 pounds of carbon dioxide equivalent. According to the researchers, that is nearly five times the lifetime emissions of an average American car, including its manufacture.
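
A quick back-of-the-envelope check of that comparison, assuming the roughly 126,000 pounds of lifetime CO2 equivalent the researchers attributed to an average American car (a figure from the study's framing, treated here as an assumption):

    # Compare the training-emissions estimate with an average car's lifetime
    # emissions. The car figure (~126,000 lbs CO2e, including manufacture) is
    # the baseline the UMass Amherst researchers used; treat it as an assumption.
    TRAINING_LBS_CO2E = 626_000
    CAR_LIFETIME_LBS_CO2E = 126_000

    ratio = TRAINING_LBS_CO2E / CAR_LIFETIME_LBS_CO2E
    print(f"Training emits ~{ratio:.1f}x an average car's lifetime emissions")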

Use of nonrenewable resources

According to a United Nations Environment Programme report, critical minerals and rare earth elements are used to create the microchips that power AI. These materials are finite, difficult to recycle and often mined in environmentally destructive ways. The electronic waste those chips eventually become may also contain hazardous substances.

Increase in water usage

Data centers consume water to liquid-cool the hardware that runs AI applications. According to an article in Yale Environment 360, every 10 to 50 ChatGPT prompts cause a data center to consume about half a liter of water. With millions of users, total water consumption can amount to hundreds of millions of gallons just to cool the equipment running AI.
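
Scaling that half-liter figure up shows how quickly the total mounts. In the sketch below, the prompts-per-half-liter midpoint comes from the range above, while the daily prompt volume is a hypothetical assumption rather than a figure from the article.

    # Rough water-use estimate based on half a liter per 10 to 50 prompts.
    # The daily prompt volume is a hypothetical assumption for illustration.
    LITERS_PER_PROMPT = 0.5 / 30      # midpoint: half a liter per ~30 prompts
    DAILY_PROMPTS = 200_000_000       # assumed daily prompts (hypothetical)
    LITERS_PER_GALLON = 3.785

    daily_gallons = DAILY_PROMPTS * LITERS_PER_PROMPT / LITERS_PER_GALLON
    annual_gallons = daily_gallons * 365
    print(f"~{daily_gallons:,.0f} gallons of cooling water per day")
    print(f"~{annual_gallons:,.0f} gallons per year")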

These are just a few examples of the strain AI is placing on the environment. A careful approach and thoughtful strategies are required to keep these environmental impacts in check and to build toward a more sustainable future in the industry.

Potential future stats and scenarios

EPRI created scenarios that project the potential growth rate of data center electricity consumption.

The first scenario starts with a baseline of the average data center load in 2023, which equates to a little more than 150,000,000 megawatt-hours (MWh). In the highest growth rate scenario, EPRI projected an average data center load of more than 400,000,000 MWh by 2030, a staggering 166% increase. Conversely, the lowest growth rate scenario projects an average data center load of less than 200,000,000 MWh by 2030.
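
Those figures can be sanity-checked with a few lines of Python. The 2023 and 2030 values below are the approximate numbers quoted above, so the computed growth differs slightly from EPRI's 166% due to rounding.

    # Check the growth in EPRI's high scenario and the implied compound annual
    # growth rate (CAGR) between 2023 and 2030, using the rounded figures above.
    baseline_mwh_2023 = 150_000_000   # approximate 2023 average data center load
    high_mwh_2030 = 400_000_000       # approximate high-scenario 2030 load

    total_growth = high_mwh_2030 / baseline_mwh_2023 - 1
    cagr = (high_mwh_2030 / baseline_mwh_2023) ** (1 / 7) - 1   # 7 years, 2023 to 2030

    print(f"Total growth: {total_growth:.0%}")    # ~167% with these rounded inputs
    print(f"Implied CAGR: {cagr:.1%} per year")   # ~15% per year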

While the higher growth rate scenario is intimidating, these are just projections, and much can change in the next decade. Some factors are out of our control, like consumer demand for AI technologies, but others can be controlled.

EPRI guidelines to mitigate AI’s negative impact

EPRI highlights a few areas data centers can focus on to curb rising energy usage, keep load levels toward the lower end of the projected growth scenarios and mitigate the environmental impact of AI workloads.

Operational efficiency and flexibility

A comprehensive strategy is necessary to meet rising energy demands and limit emissions growth. Strategies include investing in more energy-efficient processors and server architectures, leaning on virtualization to improve resource flexibility, adopting more effective cooling technologies, and using continuous monitoring and analytics to ensure optimal operational efficiency and better adaptability.
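
Continuous monitoring often centers on simple efficiency metrics such as power usage effectiveness (PUE), the ratio of total facility energy to IT equipment energy. The sketch below is a minimal, hypothetical illustration of tracking that ratio; the meter readings and the 1.5 alert threshold are assumptions, not references to any specific monitoring product.

    # Minimal PUE check: total facility energy divided by IT equipment energy.
    # Readings and the 1.5 alert threshold are hypothetical values.
    readings = [
        {"facility_kwh": 1450, "it_kwh": 1000},
        {"facility_kwh": 1520, "it_kwh": 1010},
        {"facility_kwh": 1380, "it_kwh": 980},
    ]

    for hour, r in enumerate(readings, start=1):
        pue = r["facility_kwh"] / r["it_kwh"]
        flag = "  <- investigate cooling and overhead" if pue > 1.5 else ""
        print(f"hour {hour}: PUE = {pue:.2f}{flag}")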

Collaboration through a shared energy economy model

Electric companies must balance resources between regular customers and data centers with accelerating and unpredictable load growth. To better handle this situation, data centers can collaborate more closely with electric companies to create a shared energy economy. For example, electric companies can use data center backup generators as a grid reliability resource, creating a more symbiotic relationship.

Load growth forecasting and modeling

With more accurate forecasting and modeling tools, data centers and electric companies can better collaborate to anticipate interconnection requests. This can help electric companies understand the full power demand that data centers require over time. By doing so, they can avoid stressing the energy grid and introduce flexibility into operational bandwidth and resource management.
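
As a minimal illustration of the kind of modeling involved, the Python sketch below fits a simple average growth rate to hypothetical annual load figures and extrapolates a few years ahead. All of the numbers are made up for illustration; real interconnection forecasting is far more involved.

    # Fit an average year-over-year growth rate to hypothetical annual load
    # data (in MWh) and extrapolate forward. All figures are illustrative.
    history = {2021: 210_000, 2022: 245_000, 2023: 290_000}

    years = sorted(history)
    growth_rates = [history[y] / history[y - 1] - 1 for y in years[1:]]
    avg_growth = sum(growth_rates) / len(growth_rates)

    load = history[years[-1]]
    for year in range(years[-1] + 1, years[-1] + 4):
        load *= 1 + avg_growth
        print(f"{year}: projected load ~{load:,.0f} MWh "
              f"(assuming {avg_growth:.1%} annual growth)")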

Upgrades to the data center

To address the increasing demands of AI in data centers, administrators should consider adopting more flexible computational strategies and efficient server management tools. It’s essential to utilize advanced computational hardware, such as tensor processing units, field-programmable gate arrays and GPUs.

Admins should implement resource-efficient algorithmic techniques, like pruning and quantization. It is also crucial to transition to carbon-free and low-carbon electricity sources and adopt cleaner power systems.
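
Quantization, for example, stores model weights at lower numeric precision so inference needs less memory and energy. The NumPy sketch below is a minimal illustration of symmetric 8-bit weight quantization on a toy matrix; it is not a production recipe or a reference to any particular framework's API.

    # Minimal symmetric int8 weight quantization: map float32 weights onto the
    # integer range [-127, 127] with one scale factor, then reconstruct them.
    import numpy as np

    weights = np.random.randn(4, 4).astype(np.float32)    # toy weight matrix

    scale = np.abs(weights).max() / 127.0                  # single per-tensor scale
    quantized = np.round(weights / scale).astype(np.int8)  # stored at 1/4 the size
    dequantized = quantized.astype(np.float32) * scale     # approximate originals

    max_error = np.abs(weights - dequantized).max()
    print(f"max reconstruction error: {max_error:.4f}")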

Achieving the Net Zero Emissions by 2050 Scenario is still possible despite the massive energy demand to fuel AI workloads. However, the path ahead is narrow and will require global cooperation to ensure data centers can limit energy usage and consume electricity more sustainably, while powering the evolving technological landscape.

Jacob Roundy is a freelance writer and editor with more than a decade of experience specializing in a variety of technology topics, such as data centers, business intelligence, AI/ML, climate change and sustainability. His writing focuses on demystifying tech, tracking trends in the industry, and providing practical guidance to IT leaders and administrators.

Source is ComputerWeekly.com
