Last month at the LegalESG conference, I chatted with a couple of women from McKinsey & Co., one of whom specializes in AI. I learned from her that AI runs on a new generation of hardware that requires a new type of cooling system. A couple of weeks ago in rural West Texas, I drove past a brand-new, nondescript group of buildings surrounded by large cotton fields. At first glance, it looked like a public storage facility. I found out days later that it is actually an AI data center being filled with that new hardware. Then last week, MIT Sloan Management Review published an article about the sustainability impact of AI’s growth, an impact that is largely ignored.
“AI has a fast-growing carbon footprint, stemming from its voracious appetite for energy and the carbon costs of manufacturing the hardware it uses… Large language models (LLMs) require tens of thousands of cutting-edge high-performance chips for training and for responding to queries, leading to high energy consumption and carbon emissions. The greater the model’s complexity, the more task times increase, resulting in more energy consumption. LLMs like ChatGPT are among the most complex and computationally expensive AI models. The capabilities of OpenAI’s GPT-3 LLM are made possible by its 175 billion-parameter model, one of the largest when it was launched. Its training alone is estimated to have used 1.3 gigawatt-hours of energy (equivalent to 120 average U.S. households’ yearly consumption) and generated 552 tons in carbon emissions (equivalent to the yearly emissions of 120 U.S. cars). OpenAI’s latest model, GPT-4, is rumored to be 10 times larger…
Data storage, CPU operation, and chip operation consume most of the energy in data centers. Furthermore, around 40% of the electricity used in data centers is powering large air conditioners, which are necessary to keep servers cool and operating correctly. Falcon 180B, a recently launched open-access LLM, has 180 billion parameters (similar to GPT-3’s count) and was trained on a 3.5-trillion-token data set (compared with GPT-3’s 499 billion tokens). Training this model on such a large data set is estimated to have generated 1,870 tons of carbon emissions, equivalent to heating 350 U.S. households for a year, assuming a typical U.S. energy mix.”
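The household and car equivalences above are simple back-of-envelope conversions. The short Python sketch below reproduces them using ballpark reference values that are my own assumptions rather than figures from the article: roughly 10,800 kWh of electricity per average U.S. household per year and about 4.6 metric tons of CO2 per typical U.S. passenger car per year.

```python
# Back-of-envelope check of the equivalences quoted above.
# The reference values below are assumptions (rough public ballparks), not figures from the article.

GPT3_TRAINING_KWH = 1_300_000        # 1.3 GWh of training energy, as quoted
HOUSEHOLD_KWH_PER_YEAR = 10_800      # assumed average annual U.S. household electricity use
print(GPT3_TRAINING_KWH / HOUSEHOLD_KWH_PER_YEAR)        # ~120 households, matching the quote

GPT3_TRAINING_TONS_CO2 = 552         # tons of CO2 from GPT-3 training, as quoted
CAR_TONS_CO2_PER_YEAR = 4.6          # assumed annual emissions of a typical U.S. passenger car
print(GPT3_TRAINING_TONS_CO2 / CAR_TONS_CO2_PER_YEAR)    # ~120 cars, matching the quote

FALCON_TONS_CO2 = 1_870              # tons of CO2 from Falcon 180B training, as quoted
HOUSEHOLDS_HEATED_ONE_YEAR = 350     # households heated for a year, as quoted
print(FALCON_TONS_CO2 / HOUSEHOLDS_HEATED_ONE_YEAR)      # ~5.3 tons of CO2 implied per household-year of heating
```

Plugging a planned workload’s estimated energy use into this same kind of arithmetic is one quick way for a company to gauge whether its AI plans are material to its climate targets.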
The article touches on water consumption and e-waste as well. AI’s uptake and growth appear unstoppable at the moment. While there may be benefits to the technology, there are also costs, not all of which are readily apparent. Companies should take a broad view of their planned AI use in relation to climate and other sustainability goals to prevent surprises.
If you aren’t already subscribed to our complimentary ESG blog, sign up here: https://practicalesg.com/subscribe/ for daily updates delivered right to you.