Ever since its introduction in November last year, the intense buzz around OpenAI’s ChatGPT has skyrocketed interest in generative AI tools. This has spurred tech titans like Google and Microsoft to announce plans to integrate generative AI into their respective search engines, and Baidu, the Chinese search giant, has no plans of giving up ground either.
Image Credits: Wikimedia Commons
But, the excitement for these innovative technologies might be masking a dark secret about the emissions that result from these tools.
How are ChatGPT-like tools different from a Google search?
ChatGPT-like tools are based on large language models (LLMs), which are trained to comprehend questions and respond in a human-like manner. They are fed enormous amounts of data, which enables them to generate interactive text much as a human would.
A traditional search engine, on the other hand, provides results by indexing relevant data and matching patterns of keywords. But in doing so, it inevitably returns a lot of material that the user has to skim through to find exactly what they are looking for. By drawing on its large training dataset, an AI tool instead gives the user a much more personalised and direct answer.
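The contrast can be sketched with a toy example. This is an illustration only, not how a real search engine or LLM works: the tiny document set and the `keyword_search` helper are invented for this sketch.

```python
# Toy illustration of keyword-index retrieval: the user gets back a
# list of candidate documents to skim, whereas a generative model
# would instead synthesize a single direct answer from its training data.

documents = {
    "doc1": "Solar panels convert sunlight into electricity",
    "doc2": "Wind turbines generate power from moving air",
    "doc3": "Electricity grids balance supply and demand",
}

def keyword_search(query):
    """Return every document that shares at least one word with the query."""
    words = set(query.lower().split())
    return [doc_id for doc_id, text in documents.items()
            if words & set(text.lower().split())]

# Several hits come back, and the user must read through them:
print(keyword_search("how does electricity get generated"))
# -> ['doc1', 'doc3']
```

The keyword matcher cannot tell which hit actually answers the question; that triage is exactly the work a generative model takes on, at a much higher computational cost.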
Image Credits: Pixabay
Impact of Integrating AI on climate
According to Alan Woodward, a professor of cybersecurity at the University of Surrey in the U.K., running search engines already involves huge amounts of resources, but integrating AI into them will require a different kind of firepower for processing, storage and efficient searching.
While OpenAI has not revealed any numbers thus far, third-party estimates suggest that training GPT-3, on which ChatGPT is based, consumed 1,287 MWh of electricity and resulted in emissions of more than 550 tonnes of carbon dioxide.
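A quick back-of-the-envelope check shows these two figures are consistent. The grid carbon intensity used below is an assumption chosen to match the reported numbers; actual intensity varies considerably by region and energy mix.

```python
# Sanity-check the third-party training estimate cited above:
# 1,287 MWh of electricity and ~550 tonnes of CO2.
# The ~0.429 kg CO2 per kWh intensity is an assumed average grid mix.

training_energy_mwh = 1287
carbon_intensity_kg_per_kwh = 0.429  # assumption, varies by region

energy_kwh = training_energy_mwh * 1000
emissions_tonnes = energy_kwh * carbon_intensity_kg_per_kwh / 1000

print(f"~{emissions_tonnes:.0f} tonnes CO2")  # ~552 tonnes
```

At that assumed intensity the 1,287 MWh figure lands at roughly 552 tonnes, in line with the "more than 550 tonnes" estimate.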
Simply training large language models consumes a huge amount of computational power. Running one and serving it to billions of users is estimated to take at least four to five times the computing power per search. It is hardly a surprise that only tech giants with sizable resources are in a position to do this.
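To get a feel for what a four-to-five-fold per-search increase means at scale, here is a rough sketch. Both input figures are assumptions for illustration: the 0.3 Wh baseline per traditional search is a figure Google published back in 2009, and the daily query volume is a round estimate.

```python
# Rough sketch of the "four to five times" per-search claim at scale.
# Inputs are illustrative assumptions, not measured values.

baseline_wh_per_search = 0.3       # assumed cost of a traditional search
llm_multiplier = 4.5               # mid-point of the 4-5x estimate
searches_per_day = 8_500_000_000   # assumed daily query volume

# Extra energy on top of what search already uses today:
extra_wh = baseline_wh_per_search * (llm_multiplier - 1) * searches_per_day
print(f"~{extra_wh / 1e6:.0f} extra MWh per day")  # ~8925 MWh/day
```

Under these assumptions, AI-augmented search would add thousands of megawatt-hours of demand every day, which is why the per-query multiplier matters so much more than the one-off training cost.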
The infrastructure and data centres we currently have will not be able to keep up with the generative AI race, so significant investment in hardware will be required as well. According to the International Energy Agency, data centres already account for around one percent of the world’s greenhouse gas emissions. The demand for cloud computing is anticipated to increase that share, although search engine providers have pledged to lower their net impact on global warming.
Image Credits: Pixabay
The competition to build high-performance, AI-powered search engines will call for a sharp increase in computing power, and with it a significant spike in the energy tech businesses need and the carbon they produce.
Moving data centres to cleaner energy sources and redesigning neural networks to be more efficient could reduce the environmental impact and energy cost of integrating AI into search, by as much as a thousand times. Leaner networks would also decrease the so-called “inference time”: the amount of computing power an algorithm needs to work on new data.
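A simple sketch shows why a smaller, redesigned network cuts inference cost. A common rule of thumb puts a transformer's forward-pass compute at roughly two FLOPs per parameter per generated token; the smaller parameter count below is an assumption for illustration, not a real model size.

```python
# Illustrative sketch: inference compute scales with parameter count,
# so a leaner network directly reduces per-query cost.
# Rule of thumb: ~2 FLOPs per parameter per generated token.

def flops_per_token(n_params):
    """Approximate forward-pass FLOPs to generate one token."""
    return 2 * n_params

large_model = 175_000_000_000    # GPT-3-scale parameter count
smaller_model = 6_000_000_000    # assumed redesigned, leaner network

ratio = flops_per_token(large_model) / flops_per_token(smaller_model)
print(f"~{ratio:.0f}x less compute per token")  # ~29x
```

Even before any hardware or energy-sourcing changes, shrinking the network that answers each query is the most direct lever on per-search emissions.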
Whether the extra computing power is worth trading for search accuracy is open to debate. The model may work for the end consumer, since large language models were previously not accessible to everybody, but the carbon footprint left behind by every search, and its overall impact on the climate, is something to think about.