
Generative AI

  • Writer: Tom Vermolen
  • 2 days ago
  • 4 min read

Generative AI’s environmental impact


Although AI has been around for quite some time, a new branch of AI, generative AI, is now developing rapidly. Generative AI has a substantially larger footprint than traditional forms of AI, and not just a climate footprint: generative AI also scores worse on environmental and social justice issues.


Generative AI is a branch of AI that specializes in creating new content such as text, images, video, or sound. By being trained on large datasets, it produces seemingly original content in text or images.


It’s not to be confused with traditional AI, which is programmed with algorithms, or sets of rules, to accomplish specific tasks, such as playing chess, making recommendations, or generating search results.


It starts with manufacturing

The chips required for generative AI are more complex and require more resources than the regular chips used for traditional AI; one estimate is that a generative AI chip requires 50% more resources. Manufacturing chips is not an efficient process: producing 1 kilo of chips requires about 400 kilos of raw materials. These raw materials are tied to environmental and humanitarian issues, as most of them are toxic and are mined under deplorable working conditions, including child labor and even slavery. Deep-sea mining, if used to extract these raw materials, also threatens the oceans. During the manufacturing phase, many toxic chemicals are used to create the chips. It is estimated that the three major producers (NVIDIA, AMD, and Intel) shipped 3.85 million GPUs to data centers in 2023, up from about 2.67 million in 2022.
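To give a feel for what the 400-to-1 ratio implies at the scale of those GPU shipments, here is a minimal back-of-the-envelope sketch in Python. The per-GPU chip mass is a made-up placeholder; only the 400:1 ratio, the 50% overhead, and the 3.85 million GPUs come from the figures above.

```python
# Rough raw-material estimate for AI chip manufacturing.
# From the article: ~400 kg of raw materials per 1 kg of finished chips,
# and a generative-AI chip needing roughly 50% more resources.
RAW_MATERIAL_RATIO = 400          # kg of raw materials per kg of chips
GENAI_OVERHEAD = 1.5              # generative-AI chips: ~50% more resources

# ASSUMPTION (not from the article): processed chip/package mass per
# data-center GPU, in kilograms. Purely a placeholder for illustration.
CHIP_MASS_PER_GPU_KG = 0.1

def raw_materials_kg(chip_mass_kg: float, generative_ai: bool = True) -> float:
    """Estimate the raw-material mass needed to produce the given chip mass."""
    overhead = GENAI_OVERHEAD if generative_ai else 1.0
    return chip_mass_kg * RAW_MATERIAL_RATIO * overhead

# 3.85 million GPUs shipped to data centers in 2023 (figure from the article).
gpus_2023 = 3_850_000
total_kg = raw_materials_kg(gpus_2023 * CHIP_MASS_PER_GPU_KG)
print(f"Estimated raw materials: {total_kg / 1000:,.0f} tonnes")
```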


Training the model

Creating generative AI starts with training the model. It was long assumed that this would be the most energy-intensive phase of generative AI, but recent studies have shown that this is not the case. Nevertheless, training an AI model like GPT-3 required an estimated 1,287 megawatt-hours of electricity, generating about 552 tons of carbon dioxide. While all machine-learning models must be trained, one issue unique to generative AI is the rapid fluctuations in energy use that occur over different phases of the training process. Power grid operators must have a way to absorb those fluctuations to protect the grid, and they usually employ diesel-based generators for that task.
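As a quick sanity check, the implied carbon intensity of the electricity used for that training run can be derived directly from the two figures above (treating the 552 tons as metric tons):

```python
# Implied carbon intensity of the electricity used to train GPT-3,
# derived only from the figures quoted above.
training_energy_mwh = 1287            # megawatt-hours of electricity
training_emissions_tons = 552         # tons of CO2 (treated as metric tons)

kwh = training_energy_mwh * 1000      # 1 MWh = 1,000 kWh
kg_co2 = training_emissions_tons * 1000

intensity = kg_co2 / kwh              # kg of CO2 per kWh
print(f"Implied carbon intensity: {intensity:.2f} kg CO2 per kWh")
# ~0.43 kg CO2/kWh, roughly the footprint of a fossil-heavy grid mix.
```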


Some CO2 emission benchmarks: a flight from New York to San Francisco emits about 1,000 kilos (1 tonne) of CO2; the average person emits about 5 tonnes in a year; a car over its lifetime, including manufacturing, accounts for about 57 tonnes; and training a large AI model emits roughly 283 tonnes of CO2.

Electricity demand during lifetime

Once a generative AI model is trained, the energy demands don’t disappear. Each time a model is used, perhaps by an individual asking ChatGPT to summarize an email, the computing hardware that performs those operations consumes energy. Researchers have estimated that a ChatGPT query consumes about ten times more electricity than a simple web search. Generating an image can have an impact up to 530 times bigger than a text search.
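To make those ratios concrete, here is a small illustrative sketch. The 0.3 Wh baseline for a conventional web search is a commonly cited ballpark rather than a figure from this article, and the daily query volume is a made-up placeholder, so treat the absolute numbers as rough illustrations only:

```python
# Illustrative energy comparison based on the ratios quoted above.
WEB_SEARCH_WH = 0.3          # ASSUMPTION: ~0.3 Wh per conventional web search
CHATGPT_RATIO = 10           # ChatGPT query ~10x a web search (article figure)
IMAGE_RATIO = 530            # image generation up to ~530x (article figure)

chatgpt_query_wh = WEB_SEARCH_WH * CHATGPT_RATIO
image_gen_wh = WEB_SEARCH_WH * IMAGE_RATIO

queries_per_day = 10_000_000            # hypothetical daily query volume
daily_kwh = queries_per_day * chatgpt_query_wh / 1000

print(f"One ChatGPT query:   ~{chatgpt_query_wh:.0f} Wh")
print(f"One generated image: ~{image_gen_wh:.0f} Wh")
print(f"{queries_per_day:,} queries/day: ~{daily_kwh:,.0f} kWh per day")
```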


Did you know you can reduce your footprint by 90% by adding “-ai” to a Google search? Just three simple characters, and your footprint drops by 90%!


Scientists have estimated that the power requirements of data centers in North America increased from 2,688 megawatts at the end of 2022 to 5,341 megawatts at the end of 2023, partly driven by the demands of generative AI. Globally, the electricity consumption of data centers rose to 460 terawatt-hours in 2022. This would have made data centers the 11th largest electricity consumer in the world, between the nations of Saudi Arabia (371 terawatt-hours) and France (463 terawatt-hours), according to the Organization for Economic Co-operation and Development.


The demand for new data centers cannot be met in a sustainable way. The pace at which companies are building new data centers means the bulk of the electricity to power them must come from fossil fuel-based power plants. AI already requires almost twice the power consumed by the Netherlands, and the power required for AI is doubling every 100 days. By 2030, data centers are predicted to emit triple the amount of CO2 annually compared with what they would have emitted without the boom in AI development. The predicted greenhouse gas emissions, 2.5 billion tonnes, equate to roughly 40% of the United States' current annual emissions.
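A doubling every 100 days is an extremely steep exponential curve; this minimal sketch shows the compounding factor that growth rate implies, using nothing but the doubling period stated above:

```python
# Compounding implied by "the power required for AI is doubling every 100 days".
DOUBLING_PERIOD_DAYS = 100

def growth_factor(days: float) -> float:
    """How many times larger demand becomes after the given number of days."""
    return 2 ** (days / DOUBLING_PERIOD_DAYS)

print(f"After one year:  x{growth_factor(365):.1f}")   # ~12.6x
print(f"After two years: x{growth_factor(730):.0f}")   # ~158x
```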


Water usage

Data centers use water during construction and, once operational, to cool electrical components. Training GPT-3 evaporated roughly 700,000 liters of clean fresh water, and every query made to ChatGPT uses roughly a quarter cup of water. AI-related infrastructure consumes six times more water than Denmark, a country of 6 million people. The water use of generative AI is expected to grow exponentially through 2030, while a quarter of humanity already lacks access to clean water and sanitation. On a global scale, all data centers combined use approximately 1.7 billion liters of fresh drinkable water every single day!
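To show how the per-query figure scales, here is a small sketch. A quarter of a US cup is roughly 60 milliliters; the daily query volume is a hypothetical placeholder, not a figure from the article:

```python
# Scaling the "roughly a quarter cup of water per ChatGPT query" figure.
CUP_ML = 240                                 # a US cup is about 240 ml
water_per_query_l = 0.25 * CUP_ML / 1000     # ~0.06 liters per query

queries_per_day = 100_000_000                # hypothetical daily query volume
daily_water_l = queries_per_day * water_per_query_l

print(f"Water per query: ~{water_per_query_l * 1000:.0f} ml")
print(f"{queries_per_day:,} queries/day: ~{daily_water_l:,.0f} liters per day")

# For comparison, the article puts GPT-3's training at ~700,000 liters and all
# data centers combined at ~1.7 billion liters of fresh water per day.
print(f"That is ~{daily_water_l / 700_000:.0f}x the water used to train GPT-3")
```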


Short lifespan

Generative AI models have an especially short shelf-life, driven by rising demand for new AI applications. Companies release new models every few weeks, so the energy used to train prior versions goes to waste. New models often consume more energy for training, since they usually have more parameters than their predecessors.


End-of-life impacts

Lastly, generative AI can also harm the environment through e-waste: one study found that the e-waste generated by generative AI will grow at a rapid pace, reaching 16 million tons of cumulative waste by 2030. E-waste is one of the fastest-growing waste streams in the world, and it contains toxic heavy metals, such as mercury and lead, that contaminate our environment.


A picture of electronic waste.

Positive impacts of AI

AI, including generative AI, isn't all bad. AI has helped advance medical and even climate research, and it can help find solutions to problems. Generative AI is already being used in a number of scientific fields where it has enabled advances, and it is being used to chart emissions of methane, a potent greenhouse gas. Generative AI could be beneficial if used in a sensible way and on a small scale.


More information can be found in these sources




This article was created using Real-Intelligence ;)