AI—Transformative Technology with an Outsized Footprint
- Rick Laezman
- Jun 20
- 8 min read
Can AI Change the World without Harming the Environment?
Artificial intelligence (AI) is taking the world by storm, and it is stirring up a storm of controversy over its gargantuan appetite for electricity and its insatiable thirst for the fresh water needed to cool the hardware that runs it. Observers are also alarmed by the pressure AI is putting on aging grid infrastructure and the increased carbon emissions its growth can entail.
This latest form of “supercomputing” has the potential to transform society and has already seen widespread adoption across many industries. According to a survey conducted by the market research firm McKinsey & Company, more than 70% of companies around the world reported using AI in at least one business function in 2024.
AI is not like typical computing. The term refers to digital processing that mimics human intelligence, performing tasks such as understanding language, recognizing images, and even solving problems and making decisions much as people do. Common applications include self-driving cars, virtual assistants such as Alexa, and chatbots such as ChatGPT.
Unfortunately, smarter and faster computing has a downside. The tremendous computational power of AI has an equally outsized need for resources, including electricity and water. The consumption of these resources—and the waste that consumption generates—pose a unique challenge to the evolution and use of this new technology.

Companies that provide AI computing are taking steps to address this quandary so that the world can benefit from widespread adoption of AI without experiencing equally widespread environmental harm.
Why Is AI So Ravenous?
There are a number of reasons why AI consumes so much energy and poses such an environmental risk. Those reasons correspond to different stages of its use.
The tremendous computational power of AI has an equally outsized need for resources, including electricity and water.
First, AI models must be trained before they can perform the tasks demanded of them. Teaching models to recognize patterns and respond to queries requires feeding them massive amounts of data, a process that consumes enormous amounts of energy.
The energy drain does not stop after AI programs have been trained. The second phase of AI is inference: its use to answer queries and deliver other kinds of results. This, too, consumes massive amounts of computational power.
To put the energy use into perspective, consider how much energy is consumed by a typical Google search. According to a report by the International Energy Agency (IEA), a typical Google search might require about 0.3 watt-hours of electricity. In contrast, OpenAI’s ChatGPT uses about 2.9 watt-hours per request. That’s a nearly 10-fold increase.
The difference is even more remarkable on a macro scale. The IEA notes that, at 9 billion searches daily, shifting those queries to AI would require almost 10 terawatt-hours of additional electricity in a given year.
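A quick back-of-the-envelope check of those IEA figures (a sketch using the article’s numbers; the 365-day year and the assumption that every search shifts to AI are simplifications of this sketch):

```python
# Sanity check of the per-query comparison and the annual total cited above.
GOOGLE_WH = 0.3         # watt-hours per typical Google search (IEA figure)
CHATGPT_WH = 2.9        # watt-hours per ChatGPT request (IEA figure)
SEARCHES_PER_DAY = 9e9  # daily searches in the IEA scenario

extra_wh_per_query = CHATGPT_WH - GOOGLE_WH
extra_twh_per_year = extra_wh_per_query * SEARCHES_PER_DAY * 365 / 1e12

print(f"Per-query ratio: {CHATGPT_WH / GOOGLE_WH:.1f}x")      # nearly 10-fold
print(f"Additional demand: {extra_twh_per_year:.1f} TWh/year")  # "almost 10 TWh"
```

The arithmetic lands at roughly 8.5 terawatt-hours per year, consistent with the IEA’s rounded “almost 10” figure.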
Hungry Hardware
AI’s energy appetite can also be traced to the hardware it requires. The massive computing tasks of AI can be done only with the use of specialized, power-hungry graphics processing units (GPUs) and tensor processing units (TPUs).

Traditional computers use central processing units (CPUs), which are not powerful enough for AI tasks. GPUs and TPUs are significantly more powerful, which makes them suited for the work, but they also use far more energy than their less powerful cousins. The commercial real estate services firm Newmark notes in a 2023 study that GPUs “require up to 15 times the energy of traditional CPUs.”
Furthermore, these AI-enabled computers are not solo desktops quietly humming in one’s home office. Instead, they sit in vast data centers filled with racks of hardware. A 4 million-square-foot hyperscale data center, for example, covers the equivalent of some 60 football fields. These centers have already grown in recent years to accommodate the increasing popularity of the internet and cloud computing. AI will fuel that growth even more.
Utilities and transmission grids will be challenged to meet the growing demand from data centers and AI.
Goldman Sachs Research, the research arm of the investment bank, forecasts global power demand from data centers to increase 50% by 2027 and by as much as 165% by the end of the decade, compared with 2023 levels.
That statistic carries even more weight when the resource footprint of that increased demand is taken into consideration.
For example, increased demand for electricity requires increased generation. Utilities and transmission grids will be challenged to meet the growing demand from data centers and AI, and to balance that with existing demand from other customers and other uses. Grid delivery systems that are already challenged by aging infrastructure, the addition of new renewable power sources, and the stresses imposed by global climate change will be stressed even more by the demands of AI.
The IEA projects electricity demand from data centers worldwide to more than double in five years to around 945 terawatt-hours in the year 2030. AI will be the most significant driver of this increase, with electricity demand from AI-optimized data centers projected to more than quadruple by 2030. The IEA notes that 945 terawatt-hours is slightly more than the current annual electricity consumption of the entire nation of Japan.
AI’s Footprint
Generating additional energy to accommodate data centers and AI growth poses challenges beyond the energy grid. The energy industry is a long way from transitioning completely away from fossil fuels. Consequently, a major increase in energy generation also implies a huge increase in carbon emissions.
For example, a recent study by the online hosting platform KnownHost found that ChatGPT, which has more than 164 million users a month, produces over 260,930 kilograms (287 tons) of carbon dioxide per month. That is equivalent to the emissions from 260 transatlantic flights between New York City and London.
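Those figures can be sanity-checked with a short calculation (a sketch; the short-ton conversion factor and the implied per-flight number are assumptions of this sketch, not values from the study):

```python
# Sanity check of the KnownHost figures cited above.
KG_PER_SHORT_TON = 907.185  # assumed US short-ton conversion factor

monthly_kg = 260_930                       # CO2 per month (article figure)
monthly_tons = monthly_kg / KG_PER_SHORT_TON
per_flight_kg = monthly_kg / 260           # implied by the 260-flight comparison

print(f"{monthly_tons:.1f} short tons per month")
print(f"Implied ~{per_flight_kg:,.0f} kg of CO2 per flight in the comparison")
```

The implied figure of roughly 1 metric ton per flight is closer to a typical per-passenger footprint for a transatlantic trip than to a whole aircraft’s emissions, which suggests the comparison is on a per-passenger basis.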
Finally, AI’s environmental footprint can be accounted for in another, less obvious way. All those data centers filled with racks of energy-consuming hyper-processors generate tremendous amounts of heat. For them to function properly, data centers also require vast quantities of cooling water.
Water in data centers is used for cooling through various devices, including cooling towers, chillers, heat exchangers, and condensers. The process is essential to keeping the facilities within operating temperatures and the computers they host running.
The amount of water used in a data center can vary significantly. According to digital-industry analysis firm Dgtl Infra, large facilities, like Google’s hyperscale data centers, use an average of 550,000 gallons (2.1 million liters) of water daily over the course of a year. That comes to approximately 200 million gallons (760 million liters) annually.
On the other end of the spectrum, according to the same source, smaller data centers in Prince William County, part of the Northern Virginia data center market, have an average daily water usage of about 18,000 gallons (68,100 liters), or 6.57 million gallons (24.9 million liters) annually.
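The annual totals follow directly from the daily figures. A minimal sketch of the arithmetic (the 3.785 liters-per-gallon factor and the 365-day year are assumptions of this sketch):

```python
# Annual water use implied by the two Dgtl Infra daily figures cited above.
LITERS_PER_GALLON = 3.785  # assumed US-gallon conversion

def annual_use(gallons_per_day, days=365):
    """Return (gallons, liters) of water used per year at a steady daily rate."""
    gallons = gallons_per_day * days
    return gallons, gallons * LITERS_PER_GALLON

hyperscale_gal, hyperscale_l = annual_use(550_000)  # Google-scale facility
small_gal, small_l = annual_use(18_000)             # smaller Virginia facility

print(f"Hyperscale: {hyperscale_gal / 1e6:.0f}M gal ({hyperscale_l / 1e6:.0f}M L) per year")
print(f"Smaller:    {small_gal / 1e6:.2f}M gal ({small_l / 1e6:.1f}M L) per year")
```

Both results match the article’s figures: roughly 200 million gallons (760 million liters) and 6.57 million gallons (24.9 million liters), respectively.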
Water consumption required by the [data center] industry … will put greater strain on the world’s already fragile freshwater resources.
Whether data centers are big or small, the additional water consumption required by the industry as it expands to meet the growing demands of AI will put greater strain on the world’s already fragile freshwater resources.
Making AI More Environmentally Friendly
Given the tremendous resource consumption and carbon output generated by the world’s increasing use of AI, what can be done to mitigate its impact?
Golestan (Sally) Radwan, the chief digital officer of the United Nations Environment Programme, says, “There is still much we don’t know about the environmental impact of AI, but some of the data we do have is concerning.” She adds a note of caution: “We need to make sure the net effect of AI on the planet is positive before we deploy the technology at scale.”
With that caution in mind, how are AI providers working to reduce the technology’s footprint?

The most obvious answer is to increase the use of clean power across the board and to make the energy grid more efficient. Building, energy, and utility managers can harness AI’s ability to process vast amounts of data to inform decisions about generation, transmission, and distribution. Similarly, AI can be used to analyze weather patterns more accurately.
AI users can also streamline the models they use to limit excess data processing and, in turn, power consumption.
Many of the largest players in the AI data center industry are taking innovative action to minimize the environmental harm of AI by making the data centers themselves more energy efficient and green.
The market analysis company Data Center Frontier notes that many data center operators are adopting an approach it calls “Bring Your Own Power” (BYOP), which emphasizes energy independence and flexibility. Under this model, operators manage their own electricity supply, often through renewable energy integration, microgrids, and other techniques.
For example, in Finland, data center providers are harnessing waste heat as a resource to help warm the homes of local residents. European AI provider Nebius Group has employed the technology at its flagship data center in Mäntsälä, Finland. In the 75-megawatt facility, waste heat generated by the stacks of GPUs is captured in water pipes and transferred to a facility that supports the local residential heating system. That heat is then transferred to individual homes.
A similar facility is being built in an area outside of Helsinki, Finland. It will help the nation meet its carbon reduction goals by reducing the need for fossil fuel–powered heating, although the contribution is relatively small, at 1% of the total goal.
AI and Renewables
Speaking to Forbes magazine, Jim Spencer, chief executive of Exus Renewables North America, says, “Renewables are essential for powering AI data centers.” He notes that “a massive increase in the adoption of renewables and utility-scale battery storage in the US” has already helped make the nation’s grid “better equipped” to handle the stresses of rising demand.
“Renewables are essential for powering AI data centers.”
Many companies are making that commitment. Last summer, for example, tech giant Google signed a deal with renewable energy provider Energix Renewables to supply Google with electricity and renewable energy credits generated from its solar projects.
Earlier this year, energy firm AES Corporation announced it had signed three agreements with tech superpower Microsoft for solar projects in the Midwest totaling a combined 475 megawatts.
Still other companies are way ahead of their peers. The data center company Switch, which operates facilities in the US, has run all of its data centers entirely on renewable power since 2016.

Renewable generation is most effective when coupled with battery energy storage systems (BESS), and several leading companies are already reaping the benefits of BESS in their data centers. Meta (formerly Facebook), for example, has partnered with the Salt River Project and Ørsted to integrate clean energy from the Eleven Mile Solar Center into its Mesa, Arizona, data center. The project pairs a 300-megawatt solar farm with a four-hour battery energy storage system.
AI’s Potential Amid Challenges
Like so many innovations in the technology sphere, AI holds great promise while posing significant challenges. It has vast potential to change society’s relationship with computing. On the other hand, it also has the potential to exacerbate the already vexing problem of resource management in an increasingly fragile environment.
AI can help society address these issues, and AI companies can take steps to mitigate the footprint of AI itself. Many companies are already accepting the challenge.
*Rick Laezman is a freelance writer in Los Angeles, California, US. He has a passion for energy efficiency and innovation. He has been covering renewable power and other related subjects for more than 10 years.