
Is AI Bad for the Environment? ChatGPT's Energy Use Explained

AI technology is everywhere, and it seems like it's here to stay. But what are the costs of the supercomputers behind it? When we think of AI's impact on the environment, electricity use and water use are two key factors. This article will dive into how and why AI has such a big impact on these resources and give you some examples to help make sense of it all.

(Image: close-up of a green circuit board with microchips and silver traces.)

Electricity & Data Centers

“When we think about the environmental impact of generative AI, it is not just the electricity you consume when you plug the computer in. There are much broader consequences that go out to a system level and persist based on actions that we take,” says Elsa A. Olivetti, professor in the Department of Materials Science and Engineering and the lead of the Decarbonization Mission of MIT's new Climate Project.


Data centers are essentially where AI is housed. They are big temperature-controlled buildings filled with computer servers, drives, network equipment and other hardware, and they are used to train and run the learning models behind AI platforms like ChatGPT.


There are other, lesser-known negative impacts at play here as well: precious metals and other raw materials are necessary components of this hardware. The rising demand for data centers further drives the mining of non-renewable resources, which could worsen humanitarian problems down the line.


The difference between running generative AI and other programs is the density, or amount, of power it requires. A training cluster for a generative AI model might consume over seven times more energy than a standard computing workload.


The power requirements of data centers in North America nearly doubled from 2022 to 2023, in part due to the growing demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022, making data centers the 11th-largest electricity consumer in the world. For comparison, that would put data centers somewhere between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.

The electricity consumption of data centers is expected to approach 1,050 terawatt-hours by 2026, which would move data centers up to fifth on that list, between Japan and Russia.


In a 2021 research paper, scientists from Google and the University of California at Berkeley estimated that the training process for GPT-3 alone consumed 1,287 megawatt-hours of electricity and generated about 552 tons of carbon dioxide. That is enough electricity to power over 100 average American homes for an entire year.
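To sanity-check that comparison, here is a rough sketch, assuming a commonly cited average of roughly 10,500 kilowatt-hours of electricity per American home per year (that figure is our assumption; it varies by source and year):

# Back-of-envelope check: how many average US homes could 1,287 MWh power for a year?
training_energy_kwh = 1_287 * 1_000          # 1,287 megawatt-hours expressed in kilowatt-hours
avg_home_kwh_per_year = 10_500               # assumed average annual US household electricity use
homes_powered = training_energy_kwh / avg_home_kwh_per_year
print(f"Roughly {homes_powered:.0f} average US homes powered for a year")   # ~123 homes, i.e. "over 100"

Running those numbers gives roughly 120 homes, which is where the "over 100 homes" comparison comes from.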


All machine learning models must be trained, so what makes generative AI so different?


Across the different phases of the training process, generative AI's energy use fluctuates rapidly. These fluctuations can strain the power grid, so grid operators need a way to absorb them, and diesel-based generators are usually used for this task.


For these reasons, it is estimated that a ChatGPT query consumes about five times more electricity than a simple web search.


And that is on the conservative end: plenty of articles estimate that an AI-powered search can use up to ten times as much energy as a standard Google search.
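As a rough sketch of where those multiples come from, assuming the frequently cited estimates of about 0.3 watt-hours per standard web search and roughly 3 watt-hours per ChatGPT query (both figures are assumptions on our part; published estimates vary):

# Rough per-query comparison using commonly cited estimates (assumptions, not measured values).
google_search_wh = 0.3      # ~0.3 Wh per standard web search (widely quoted estimate)
chatgpt_query_wh = 2.9      # ~2.9 Wh per ChatGPT query (often-quoted estimate; some studies put it lower)
ratio = chatgpt_query_wh / google_search_wh
print(f"A ChatGPT query uses roughly {ratio:.0f}x the electricity of a web search")   # ~10x
# With a lower-end estimate of about 1.5 Wh per query, the ratio drops to around 5x,
# which is where the more conservative "five times" figure above comes from.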


Water Use

Going back to those data centers for a minute, this is where water use also comes into play. Cold water is used to cool down the equipment, which might sound counterintuitive (spilling a glass of water on your laptop would end in disaster), but it is used because water is a much better conductor of heat than air alone. According to the American Society of Civil Engineers:


"That water is used in two primary ways: indirectly, to generate the electricity that the data centers need to operate, and directly, as a liquid coolant to dissipate the heat generated by the servers and other data center equipment,"


"According to the United Nations Environmental Report, nearly two-thirds of our world's population experiences severe water shortages for at least one month a year, and by 2030, this gap is predicted to become much worse, with almost half of the world's population facing severe water stress. Already AI's projected water usage could hit 6.6 billion m³ by 2027, signaling a need to tackle its water footprint,"


Business Energy UK has some great visuals to help make sense of all these massive numbers: "Using OpenAI’s ChatGPT-4 model to generate a 100-word email alone sweats off more than an Evian bottle’s worth of water (519 millilitres), according to a recent study by The Washington Post (WaPo) and the University of California. And prompting is also a massive drain on the national grid. According to WaPo, the electricity used to generate that 100-word email is equal to powering 14 LED light bulbs for an hour (0.14 kilowatt-hours (kWh))."
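The LED comparison is easy to reproduce. Here is a quick sketch, assuming a typical 10-watt LED bulb (the bulb wattage is our assumption; the 0.14 kWh figure is from the WaPo estimate quoted above):

# Check the "14 LED bulbs for an hour" comparison for a 100-word email.
email_energy_kwh = 0.14          # electricity per 100-word email, per the quoted WaPo figure
led_bulb_watts = 10              # assumed draw of a typical LED bulb
bulb_hours = email_energy_kwh * 1000 / led_bulb_watts   # convert kWh to Wh, then divide by bulb wattage
print(f"Equivalent to running {bulb_hours:.0f} ten-watt LED bulbs for one hour")   # 14 bulbs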


Sustainability Impact

As with anything related to the environment, there is another layer to this. We've talked about AI's direct electricity and water use; however, the demand for this technology is also having a massive impact on the emissions and sustainability commitments of companies like Google and Microsoft.


Because these data centers require such a vast amount of resources, maintaining them is causing companies to reconsider and adjust their environmental targets, which puts us another step backwards.


Google, for example, used to have a carbon-neutral goal, and the company's revised goal is now to be net-zero by 2030. Carbon neutral and net-zero are not the same thing. In its own sustainability report, Google acknowledges that this will be challenging due to the rising demand of AI. Also, let the record show that Google's emissions have increased by 48% since 2019, largely due to investments in AI. On the flip side, Bill Gates has been quoted saying he believes AI will result in more carbon reduction than energy demand.


Our Thoughts

AI is a miracle of the modern world; even the regular internet is an amazing resource we are privileged to have. But it makes us uncomfortable that this privilege comes at the expense of other people and our environment. So although we may not feel we have much choice about being exposed to it as it becomes so integrated with technology and our jobs, we do have some say in how we interact with it.


Maybe don’t use it to type up a grocery list for you; just go have a look in your pantry. Think about switching to a search engine like Ecosia that doesn't use AI to answer your search right away, if that's something you feel inclined to do. Or save ChatGPT for something you know would otherwise take you ten Google searches to figure out. It's okay if it takes you more than three seconds to find an answer to something, and it's okay to use your brain and figure some things out on your own.


While it is an amazing resource, it is our job as consumers to be responsible and sustainable in how we interact with it, as together we have the power to influence its supply and demand. There is no substitute for human interaction and connection.








