Energy and Environment in the Context of AI Development
- Thomas Yin
- Feb 13
- 4 min read
Updated: Mar 20

In each of the two industrial revolutions the world has experienced, great leaps in technological progress have been juxtaposed with similarly grand shifts in the world’s biggest problems, from workers’ rights and colonial sovereignty to whether the world’s richest man should be allowed to buy other companies. Oh wait, maybe the world hasn’t changed that much after all.
Jokes aside, several such questions arise in global discussions about where humanity’s resources should be focused. Consistently, climate change is cited as one of the leading concerns faced by workers, governments, and scientists alike. Historically, our rapidly advancing technologies, starting in the mid-19th century, have had the inconvenient side effect of accelerating global warming through the release of greenhouse gases and pollutants. Yet AI may be an exception to this pattern – specifically, the high-performance computing (HPC) facilities necessary for AI development have prompted discussions about whether AI, through raw efficiency, can offset the massive environmental footprint required to maintain it.
Energy Sink, Pollutant Faucet
Note: this section, as well as the succeeding section, contains information paraphrased from the Berkeley Lab 2024 United States Data Center Energy Usage Report and a Nature report on the environmental impact of large language models.
Measuring how much energy an HPC facility uses can be complicated. While traditional server rooms, like the one hosting this website, are essentially racks of Central Processing Units (CPUs) much like the one your computer uses, AI-specialized facilities differ in a few significant ways:
HPC facilities often use Graphics Processing Units (GPUs) instead of CPUs, mostly due to GPUs’ higher performance and energy efficiency in parallel processing.
HPC facilities use higher-bandwidth storage and transfer methods due to the massive demands of AI data processing.
HPC facilities require special cooling systems as a result of the heat generated by massive arrays of GPUs.
HPC facilities size their computing power around fixed demands (completing a data-processing task) rather than variable ones (anywhere from 500 to 20,000 users trying to access a website at once).
In effect, these differences mean that state-of-the-art AI computing centers are more expensive, more energy-demanding, and harder to maintain. AI data centers use massive amounts of electricity: in just under a decade, the aggregate power used by data centers roughly tripled – from 60 TWh to 176 TWh as of 2024, representing about 4.4% of total electricity consumption in the United States. This figure is only projected to increase, with conservative estimates reaching 320 TWh by 2028, at which point HPC facilities would account for about 7% of U.S. electricity consumption.
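The figures above can be sanity-checked with simple arithmetic. The sketch below takes only the numbers cited in this section (176 TWh and a 4.4% share in 2024, 320 TWh in 2028) and derives the total consumption and growth rate they imply; nothing here is new data.

```python
# Back-of-the-envelope check of the consumption figures cited above.
# Inputs (176 TWh, 4.4%, 320 TWh) come from the text; everything
# derived here is arithmetic, not additional data.

def implied_total_consumption(dc_twh: float, dc_share: float) -> float:
    """Total U.S. electricity consumption implied by a data-center share."""
    return dc_twh / dc_share

def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values."""
    return (end / start) ** (1 / years) - 1

total_2024 = implied_total_consumption(176, 0.044)  # ~4000 TWh
growth = cagr(176, 320, 2028 - 2024)                # ~16% per year

print(f"Implied total U.S. consumption (2024): {total_2024:.0f} TWh")
print(f"Implied annual growth of data-center demand: {growth:.1%}")
```

Interestingly, growing from 176 TWh to 320 TWh requires data-center demand to compound at roughly 16% per year, which is why even the "conservative" projection nearly doubles the sector's share of the grid.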
Where is all this energy going? In a typical data center, the energy actually used to power the GPU arrays accounts for a bit less than two-thirds of the facility’s total power usage; the rest goes into the systems that keep the hardware accelerators functioning – cooling, lighting, and temperature-control systems are all necessary to keep an HPC center running. Even though this overhead constitutes an undesirable inefficiency in many U.S. data centers, scientists and engineers are constantly finding ways to reduce it, in some cases bringing the additional usage down to just 17%.
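These overhead figures map onto the industry’s standard Power Usage Effectiveness (PUE) metric: total facility power divided by IT equipment power, where 1.0 would mean every watt reaches the accelerators. A minimal sketch, assuming we read "a bit less than two-thirds" as an IT-power fraction and "17%" as overhead relative to the IT load:

```python
# Power Usage Effectiveness (PUE) = total facility power / IT equipment power.
# A PUE of 1.0 would mean every watt goes to the accelerators themselves.

def pue_from_it_fraction(it_fraction: float) -> float:
    """PUE when IT equipment draws a given fraction of total facility power."""
    return 1 / it_fraction

def pue_from_overhead(overhead: float) -> float:
    """PUE when non-IT overhead is expressed as a fraction of IT power."""
    return 1 + overhead

typical = pue_from_it_fraction(2 / 3)  # two-thirds of power to GPUs -> PUE 1.5
best_case = pue_from_overhead(0.17)    # 17% extra usage -> PUE 1.17

print(f"Typical PUE: {typical:.2f}")
print(f"Best-case PUE: {best_case:.2f}")
```

Under that reading, the typical facility described above runs at a PUE of about 1.5, while the best-case 17% overhead corresponds to a PUE of 1.17.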
Relative Efficiency?
AI (when used responsibly, of course) has a lot to contribute to our society. While many companies and workers have reported working faster and more efficiently with AI, Nature researchers wanted to estimate just how much work AI could save us by comparing the hypothetical costs of writing a 500-word page using AI tools versus human labor. Ignoring the qualitative aspects of writing and accounting for time, economic cost, carbon emissions, and water usage, the researchers found that state-of-the-art Large Language Models (LLMs) similar to Meta’s Llama-3 achieved an efficiency equivalent to that of between 40 and 150 Americans, and that smaller, more energy-efficient models similar to Google’s Gemma-2B produced work at roughly the efficiency of 130 to 1,100 Americans.
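At its core, this kind of comparison reduces to a ratio of per-page footprints. The sketch below shows the shape of the calculation with purely illustrative placeholder numbers – they are not values from the Nature study:

```python
# "Human-equivalent efficiency" on one resource dimension boils down to:
#   ratio = human cost per 500-word page / model cost per page.
# Both figures below are hypothetical placeholders for illustration only.

def human_equivalents(human_cost_per_page: float, model_cost_per_page: float) -> float:
    """How many humans' per-page footprint one model's page is equivalent to."""
    return human_cost_per_page / model_cost_per_page

hypothetical_human_g_co2 = 1000.0  # grams CO2 per page (placeholder)
hypothetical_llm_g_co2 = 10.0      # grams CO2 per page (placeholder)

print(human_equivalents(hypothetical_human_g_co2, hypothetical_llm_g_co2))  # 100.0
```

The study repeats this ratio across several dimensions (time, money, carbon, water), which is why the results come out as ranges rather than single numbers.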
So does this mean that we should start replacing human workers with AI? Certainly not, concluded the researchers after further consideration, noting the obvious ethical concerns raised by the implications of their figures. Notwithstanding the towering gravity of worker displacement for the global society and economy, widespread adoption of AI, as it stands, faces the inherent challenge of being unreliable when unsupervised. One of the most pressing flaws in the researchers’ model is its inability to account for the quality of the work produced by humans and by AI. Breaking this down, we see that, while AI has been shown to increase the yield of human workers, the boost in efficiency stops there – the combination of AI systems and human ingenuity may produce desirable work, but standalone LLMs cannot hope to produce reliable work unless, at a very minimum, directed by the thoughtful oversight of a human.
For a Greener Future
Regardless of the validity of current quantitative research about the short-run costs of developing AI, I am of the opinion that we must continue down the path of making better and better AI no matter what. Reiterating the point that this must be done safely and with the intention of benefitting a reasonably large portion of society, I contend that technological advancement is often the only way to solve the world’s problems. After all, it was crop science that planted the seed for the Green Revolution, which mitigated major food shortages in Southeast Asia, and vaccines that brought dozens of prevalent, pestilent diseases to their knees in a matter of years. Likewise, while current government and social movements to reduce major emission sources such as commercial agriculture, energy production, and transportation haven’t yet produced a significant breakthrough, future AI technologies will doubtless help us improve the efficiency of our current efforts, whether by accelerating work on methane-capture technology or by enhancing the efficiency of renewable power sources. The possibilities are in the hands of humanity, and no one else.