Tech Firms are Concealing Energy and Water Demands of AI Computing

| Updated on August 30, 2024

Reports say that every time you use ChatGPT, the environment pays a price.

A query on an AI chatbot requires roughly 10 times more electricity than a standard Google search. However, someone using an AI bot has no way of knowing this, because the queries are processed in companies’ data centers.

Alex de Vries, the founder of Digiconomist, says the rising energy demand of AI technologies will force the world to burn more climate-warming oil, gas, and coal.

He said, “Even if we manage to feed AI with renewables, we have to realize those are limited in supply, so we’ll be using more fossil fuels elsewhere.”

Shaolei Ren, associate professor of electrical and computer engineering at UC Riverside, and his colleagues calculate that ChatGPT consumes roughly a 16-ounce bottle of water for as few as 10 queries.
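Taken at face value, Ren's figure implies a per-query water cost on the order of a small sip. A back-of-the-envelope sketch (assuming exactly one 16-US-fluid-ounce bottle per 10 queries, the upper end of his estimate):

```python
# Rough per-query water estimate from Ren's figure:
# ~16 US fluid ounces of water for every 10 ChatGPT queries.
ML_PER_FL_OZ = 29.5735            # millilitres per US fluid ounce
bottle_ml = 16 * ML_PER_FL_OZ     # one 16-oz bottle, in millilitres

per_query_ml = bottle_ml / 10     # water attributed to a single query
print(round(per_query_ml, 1))     # roughly 47.3 mL per query
```

That is only an illustrative average; actual consumption varies with data-center location, cooling design, and workload.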

To curb the consequences of such consumption, experts are calling on companies to disclose how much water and energy their queries consume.

Ren said users should be told how much energy and water their queries require, so they can make informed decisions.

Furthermore, the specialized computer chips required for generative AI draw more electricity and give off more heat, which in turn demands additional power and water to keep them cool.

Asked about these claims, Google did not respond, while OpenAI replied with a prepared statement that did not answer the main question.

Although Google vowed three years ago to reach net-zero carbon emissions by 2030, it is not on track. The company said it expects emissions to rise before eventually peaking, though that turning point still looks far off.

De Vries said he was disappointed that Google withheld information it was perfectly capable of providing.

Jemima Hunter

Tech Journalist