The hype surrounding artificial intelligence continues. However, many people are unaware of one downside: an average conversation with ChatGPT consumes around half a liter of water. Microsoft, among others, is now looking for more sustainable ways to conserve resources.
• AI requires an extremely large amount of water
• Microsoft’s water consumption increased sharply
• Search for more sustainable solutions
Artificial intelligence automates and optimizes work processes and aims to simplify people’s everyday lives. However, many people are not aware of the downsides of this trendy technology. Even if it is not obvious at first glance, AI consumes an enormous amount of water. The reason: the computing processes running in the background use not only a lot of electricity but also water, which is needed to cool the hardware.
Microsoft’s water consumption increased by over 30 percent
Microsoft’s latest environmental report also reveals a striking development: compared with the previous year, the company’s drinking water consumption rose by 34 percent in 2022. The US company used an impressive 6.4 billion liters of water last year. As the Associated Press puts it, the water Microsoft used in 2022 could fill 2,500 Olympic-sized swimming pools.
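A rough back-of-envelope check of that comparison is sketched below in Python; the assumed volume of about 2.5 million liters for a standard Olympic pool is our own assumption, not a figure from the report or the AP.

# Back-of-envelope check of the AP comparison (pool volume is an assumption, not from the report)
microsoft_water_2022_liters = 6.4e9   # 6.4 billion liters, per Microsoft's environmental report
olympic_pool_liters = 2.5e6           # assumed: 50 m x 25 m x 2 m = 2,500 m^3 = 2.5 million liters

pools = microsoft_water_2022_liters / olympic_pool_liters
print(f"about {pools:,.0f} Olympic-sized pools")  # prints: about 2,560 Olympic-sized pools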
At Microsoft, too, this enormous water consumption is likely driven in particular by AI, as the company is increasingly focusing on the technology. “It’s fair to say that most of the growth is due to AI,” says Shaolei Ren, a computer science professor at the University of California. A paper entitled “Making AI Less ‘Thirsty’: Uncovering and Addressing the Secret Water Footprint of AI Models” by Ren and colleagues shows that ChatGPT – the product of Microsoft’s partner OpenAI – consumes around 500 milliliters of water for every 20 to 50 prompts it processes; an average conversation with ChatGPT therefore uses about half a liter of water. The quality of this water has to be particularly high in order to prevent corrosion and bacterial growth in the cooling circuit. According to the preprint of the paper, around 700,000 liters of water are said to have been used for training GPT-3 alone.
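To make the paper’s figures more tangible, the following minimal Python sketch restates the arithmetic behind the per-prompt estimate; it is only an illustration of the numbers cited above, not code from the study.

# Per-prompt water estimate derived from the figures cited above (illustrative arithmetic only)
water_per_conversation_ml = 500       # ~500 ml for a conversation of 20 to 50 prompts (Ren et al., preprint)
prompts_low, prompts_high = 20, 50

per_prompt_max_ml = water_per_conversation_ml / prompts_low    # 25 ml per prompt
per_prompt_min_ml = water_per_conversation_ml / prompts_high   # 10 ml per prompt
print(f"roughly {per_prompt_min_ml:.0f} to {per_prompt_max_ml:.0f} ml of cooling water per prompt")

gpt3_training_water_liters = 700_000  # ~700,000 liters reportedly used to train GPT-3
print(f"GPT-3 training: around {gpt3_training_water_liters:,} liters of water")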
“Most people are not aware of the resource usage underlying ChatGPT. If we are not aware of the resource usage, then there is no way we can help conserve the resources,” Ren emphasizes.
Microsoft & Co. looking for solutions
Unlike most consumers, the companies behind AI tools such as ChatGPT are well aware of their enormous water consumption. When contacted by the Associated Press, Microsoft said it was working on ways to “make large systems more efficient, both in training and in use.” The company is also working on methods to measure AI’s energy and carbon footprint more precisely. “We will continue to monitor our emissions and accelerate progress while increasing the use of clean energy to power data centers, purchasing renewable energy and other efforts to achieve our sustainability goals of being carbon negative, water positive and zero waste by 2030,” Microsoft’s statement continues.
OpenAI made a similar statement: the company is aware that training large models “can be energy and water intensive,” and it intends to make these processes more efficient in the future. It is giving “considerable thought” to the optimal use of computing power.
Editorial team finanzen.net