The Environmental Impact of Artificial Intelligence
Photo: a small data center in a built-up neighborhood, a 1930s commercial building repurposed as a data center; while many large data centers are new construction, adaptive reuse of obsolete or under-utilized commercial and industrial buildings is common.
Ever since ChatGPT was released in late 2022, Artificial Intelligence (AI) has been top of mind. Most of the conversation has revolved around its capabilities and utility, and the ethical, legal and societal risks associated with its use. Something else gets far less attention, however: the environmental consequences of increased AI use.
The increased use of AI is directly related to the growth of data centers, specialized buildings housing networks of computing and data storage systems. While data centers have been around for decades, the development of generative AI by large tech companies for consumer use, along with its continued use for enterprise purposes, has caused demand for data centers to skyrocket. The largest data centers can use as much electricity as a mid-sized city, or enough to power approximately 750,000 homes.
Specialized chips for AI computing are designed to process vast quantities of data quickly, which requires a proportionately large amount of electricity and generates so much heat that additional resources (power and water) are needed to keep them cool.
It is no secret that an AI search uses far more energy than a standard Google search, yet since May 2024, a standard Google search automatically includes an AI-generated response as the top hit. It is ironic that Google, which vowed in 2018 to be net zero by 2030, has increased its greenhouse gas emissions nearly 50% since 2019 and consumed over six billion gallons of water in 2023. Google isn’t alone: Amazon, Microsoft and Meta also operate data centers that consume trillions of watt-hours of electricity annually.
According to GridClue, electricity demand in the United States is projected to double by 2050; while increased electrification to combat carbon emissions is part of the story, data centers used for AI (and for cryptocurrency mining) account for an increasing share of this demand. The worsening imbalance between electricity supply and demand is particularly acute in California, which is phasing out fossil fuels in favor of non-polluting energy sources (primarily solar, wind and geothermal). While nuclear power is sometimes offered as a clean energy alternative (Three Mile Island is expected to reopen in 2028 to power Microsoft data centers), constructing new nuclear plants is economically questionable due to high costs and long lead times, and nuclear power poses additional challenges from accidents (like the 1979 partial meltdown of one of the Three Mile Island reactors) and the disposal of radioactive waste.
In Santa Clara, the hub of Silicon Valley, data centers presently account for over 50% of the city’s electricity consumption. Nationwide, the power consumption of data centers is expected to more than double by 2030, and a recent report by the International Energy Agency (IEA) indicates that data centers have the potential to double their energy usage as soon as 2026. Some predictions suggest that 8-10% of worldwide electricity production will be needed to sustain the growth of data centers.
Against this background, DeepSeek, an AI company owned and funded by a Chinese hedge fund, was started as an artificial general intelligence lab in mid-2023. It developed DeepSeek as a large language model (LLM) and released a free chatbot app in late January 2025 that by all accounts performs as well as ChatGPT and other generative AI platforms. What is unique about DeepSeek?
It is open source
It was developed at a fraction of the cost of ChatGPT and other American AI platforms, and
Importantly, training innovations allow DeepSeek to operate with far less computing power, drastically reducing the resources needed to run it
This revelation caused a major shakeup in the U.S. stock market, particularly among major players in the growth of AI such as Nvidia (which manufactures chips used in AI servers) and Constellation Energy (which owns Three Mile Island); both stocks are still depressed two weeks after the public release of the DeepSeek chatbot.
While widespread use of DeepSeek might be limited by its compliance with the Chinese government's data collection policies, it seems to have proven that generative AI can operate far more efficiently. This may or may not reduce the resources devoted to AI going forward, as cheaper technology often results in wider adoption and greater usage (the Jevons Paradox). The "unimaginable amounts of compute" (a term used in a recent Planet Money podcast) required by generative AI have consequences beyond the obvious ethical, legal and societal risks, at a time when climate change continues to wreak havoc worldwide.