Generative AI is the technology behind chatbots and image generators. But how hot is it making the planet?

As an AI researcher, I often worry about the energy costs of building artificial intelligence models. The more powerful the AI, the more energy it takes. What does the emergence of ever more powerful generative AI models mean for society’s future carbon footprint?

“Generative” refers to the ability of an AI algorithm to produce complex data. The alternative is “discriminative” AI, which chooses between a fixed number of options and produces just a single number. An example of a discriminative output is deciding whether to approve a loan application.

Generative AI can create much more complex outputs, such as a sentence, a paragraph, an image or even a short video. It has long been used in applications like smart speakers to generate audio responses, or in autocomplete to suggest a search query. However, it only recently gained the ability to generate humanlike language and realistic images.

Using more power than ever

The exact energy cost of a single AI model is difficult to estimate, and includes the energy used to manufacture the computing equipment, create the model and use the model in production. In 2019, researchers found that creating a generative AI model called BERT with 110 million parameters consumed the energy of a round-trip transcontinental flight for one person. The number of parameters refers to the size of the model, with larger models generally being more capable. Researchers estimated that creating the much larger GPT-3, which has 175 billion parameters, consumed 1,287 megawatt hours of electricity and generated 552 tons of carbon dioxide equivalent, the equivalent of 123 gasoline-powered passenger vehicles driven for one year. And that’s just for getting the model ready to launch, before any consumers start using it.

Size is not the only predictor of carbon emissions. The open-access BLOOM model, developed by the BigScience project in France, is similar in size to GPT-3 but has a much lower carbon footprint, consuming 433 MWh of electricity and generating 30 tons of CO2eq. A study by Google found that for the same size, using a more efficient model architecture and processor and a greener data center can reduce the carbon footprint by 100 to 1,000 times.
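The gap between GPT-3 and BLOOM is easy to sanity-check from the figures above: dividing each model’s reported emissions by its energy use gives the implied carbon intensity of the electricity it ran on. A rough back-of-envelope calculation, using only the numbers quoted in this article:

```python
# Back-of-envelope check: implied grid carbon intensity for each training run,
# using the figures quoted above (energy in MWh, emissions in tons of CO2eq).
def carbon_intensity_g_per_kwh(energy_mwh: float, emissions_tco2eq: float) -> float:
    """Return implied emissions per kilowatt-hour, in grams of CO2-equivalent."""
    grams = emissions_tco2eq * 1_000_000  # tons -> grams
    kwh = energy_mwh * 1_000              # MWh -> kWh
    return grams / kwh

gpt3 = carbon_intensity_g_per_kwh(1287, 552)  # roughly 429 g CO2eq/kWh
bloom = carbon_intensity_g_per_kwh(433, 30)   # roughly 69 g CO2eq/kWh
print(f"GPT-3: {gpt3:.0f} g/kWh, BLOOM: {bloom:.0f} g/kWh, ratio: {gpt3 / bloom:.1f}x")
```

The roughly sixfold difference in emissions per kilowatt-hour reflects where and how each model was trained, not just how much energy it used.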

Larger models do use more energy during their deployment. There is limited data on the carbon footprint of a single generative AI query, but some industry figures estimate it to be four to five times higher than that of a search engine query. As chatbots and image generators become more popular, and as Google and Microsoft incorporate AI language models into their search engines, the number of queries they receive each day could grow exponentially.

AI chatbots, search engines and image generators are rapidly going mainstream, adding to AI’s carbon footprint.
AP Photo/Steve Helber

AI bots for search

A few years ago, few people outside of research labs were using models like BERT or GPT. That changed on Nov. 30, 2022, when OpenAI released ChatGPT. According to the latest available data, ChatGPT had over 1.5 billion visits in March 2023. Microsoft incorporated ChatGPT into its search engine, Bing, and made it available to everyone on May 4, 2023. If chatbots become as popular as search engines, the energy costs of deploying the AIs could really add up. But AI assistants have many more uses than just search, such as writing documents, solving math problems and creating marketing campaigns.

Another problem is that AI models need to be continually updated. For example, ChatGPT was only trained on data through 2021, so it does not know about anything that happened since then. The carbon footprint of creating ChatGPT isn’t public information, but it is likely much higher than that of GPT-3. If it had to be recreated regularly to update its knowledge, the energy costs would grow even larger.

One upside is that asking a chatbot can be a more direct way to get information than using a search engine. Instead of getting a page full of links, you get a direct answer as you would from a human, assuming issues of accuracy are mitigated. Getting to the information more quickly could potentially offset the increased energy use compared with a search engine.

Ways forward

The future is hard to predict, but large generative AI models are here to stay, and people will probably increasingly turn to them for information. For example, if a student needs help solving a math problem now, they ask a tutor or a friend, or consult a textbook. In the future, they will probably ask a chatbot. The same goes for other expert knowledge such as legal advice or medical expertise.

While a single large AI model is not going to ruin the environment, if a thousand companies develop slightly different AI bots for different purposes, each used by millions of customers, the energy use could become a problem. More research is needed to make generative AI more efficient. The good news is that AI can run on renewable energy. By bringing the computation to where green energy is more abundant, or scheduling computation for times of day when renewable energy is more available, emissions can be reduced by a factor of 30 to 40, compared with using a grid dominated by fossil fuels.
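The scheduling idea above can be sketched in a few lines: given a forecast of grid carbon intensity by hour, pick the cleanest hour to run a batch job such as a training run. The hourly figures below are hypothetical, chosen only to illustrate how midday solar can pull intensity down; a real system would pull forecasts from a grid operator.

```python
# Minimal sketch of carbon-aware scheduling: run a deferrable batch job
# in the hour with the lowest forecast grid carbon intensity.
# The forecast values are hypothetical, for illustration only.
hourly_intensity = {
    0: 420,   # g CO2eq/kWh at midnight (fossil-heavy baseload)
    6: 380,   # early morning
    12: 120,  # midday, when solar output peaks
    18: 350,  # evening demand peak
}

def greenest_hour(forecast: dict[int, float]) -> int:
    """Return the hour whose forecast carbon intensity is lowest."""
    return min(forecast, key=forecast.get)

best = greenest_hour(hourly_intensity)
print(f"Schedule the job at hour {best}")  # hour 12 in this toy forecast
```

This min-over-forecast rule is only the simplest version of the idea; the same logic extends to choosing among data centers in different regions rather than hours of the day.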

Finally, societal pressure may be helpful in encouraging companies and research labs to publish the carbon footprints of their AI models, as some already do. In the future, perhaps consumers could even use this information to choose a “greener” chatbot.

This article was originally published at theconversation.com