Around the world there are dozens of vast buildings that hum relentlessly — day and night. From the outside they look pretty unremarkable, just nondescript warehouses surrounded by security fences and security personnel. But inside buildings like these, there are thousands of servers tirelessly processing data, spinning the artificial intelligence web that has become intertwined with our daily lives.
These buildings are data centres, the beating heart of powerful AI models like OpenAI’s ChatGPT and Google’s Gemini. While this technology is quickly overtaking the internet as the most important advancement of the modern age, offering unprecedented convenience and capabilities, it comes with a hefty, often invisible, environmental price tag.
Generating just 100 words using advanced AI models like GPT-4 can consume the equivalent of three bottles of water
Typing a simple query into an AI chatbot might seem inconsequential, yet as an article by Tom’s Hardware points out, generating just 100 words using advanced AI models like GPT-4 can consume the equivalent of three bottles of water. (For those scratching their heads over the connection: water is used to cool the servers in data centres, preventing them from overheating as they handle massive computational loads.)
This concerning figure only grows worse when you factor in development: training a single large AI model such as GPT-3 uses over 700,000 litres of water, enough to fill more than 4,600 household bathtubs. It becomes even more alarming when you consider that, with hundreds of tech companies and research institutions worldwide racing to advance AI capabilities, hundreds of large models are being trained continually. The total then amounts to tens of millions of litres of water each year solely for AI training, a volume comparable to the annual water consumption of thousands of households, and that is before counting the water needed to refine those models or to serve everyday queries. This cumulative consumption intensifies the competition for water resources, especially in regions already facing scarcity; local communities are finding themselves competing with these digital giants for a resource that’s fundamental to life, underscoring the urgent need for more sustainable AI practices.
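To see how these headline numbers relate to one another, here is a rough back-of-the-envelope sketch in Python. The bathtub volume and the number of large training runs per year are illustrative assumptions, not figures from the sources above.

```python
# Rough, illustrative arithmetic behind the water figures above.
# Assumptions (not from the article): a household bathtub holds ~150 litres,
# and ~50 GPT-3-scale models are trained worldwide each year.

LITRES_PER_TRAINING_RUN = 700_000   # reported water use for training GPT-3
BATHTUB_LITRES = 150                # assumed bathtub volume
LARGE_RUNS_PER_YEAR = 50            # hypothetical number of large training runs

bathtubs = LITRES_PER_TRAINING_RUN / BATHTUB_LITRES
annual_litres = LITRES_PER_TRAINING_RUN * LARGE_RUNS_PER_YEAR

print(f"One training run ≈ {bathtubs:,.0f} bathtubs of water")
print(f"{LARGE_RUNS_PER_YEAR} runs/year ≈ {annual_litres / 1_000_000:.0f} million litres")
```

Under those assumptions, one run works out to roughly 4,700 bathtubs, and the yearly total lands in the tens of millions of litres, which is where the figures above come from.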
In 2023 alone, the reported energy consumed by data centres worldwide was roughly 200 terawatt-hours — comparable to the annual electricity consumption of some mid-sized countries.
It’s not just water that is under threat, either. A recent Washington Post article revealed that generating that same single 100-word AI response can also consume as much as 0.14 kilowatt-hours of electricity, equal to running 14 LED light bulbs for an hour. The data centres themselves, which run 24/7, also require enormous amounts of power to operate their high-performance servers, on top of the water needed to cool them. Indeed, in 2023 alone, the reported energy consumed by data centres worldwide was roughly 200 terawatt-hours, comparable to the annual electricity consumption of some mid-sized countries. The issue becomes even more pronounced when considering projections: a report by Goldman Sachs predicts that data centre power demand will grow by 160% by 2030, driven in large part by AI.
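The bulb comparison is easy to verify: 0.14 kWh is 140 watt-hours, which matches fourteen 10-watt LED bulbs running for an hour. The short sketch below checks that arithmetic and scales it up; the bulb wattage and the query volume are assumptions made for illustration only.

```python
# Sanity-check of the per-response electricity comparison above.
# Assumption (implied, not stated): a single LED bulb draws about 10 watts.

KWH_PER_RESPONSE = 0.14      # reported energy for one 100-word GPT-4 response
LED_BULB_WATTS = 10          # assumed LED bulb power draw

bulb_hours = KWH_PER_RESPONSE * 1000 / LED_BULB_WATTS
print(f"0.14 kWh ≈ {bulb_hours:.0f} LED bulbs running for one hour")

# Scaling up: a hypothetical one million such responses per day
daily_mwh = KWH_PER_RESPONSE * 1_000_000 / 1000
print(f"1,000,000 responses/day ≈ {daily_mwh:,.0f} MWh of electricity")
```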
This rapid expansion, and the ever-increasing power demand that comes with it, not only poses a real threat to local communities but has also raised concerns about the sustainability of AI growth. Experts warn that the energy grid, already under stress from factors including increased electric vehicle usage and climate change, might not be able to keep up.
The good news is that there are several possible countermeasures, the first of which is to make AI itself part of the solution. As highlighted in a recent Tech Brew article, AI has the potential to optimise grid operations, manage congestion, and even predict the output of renewable energy sources like wind and solar. For example, it can help balance the grid by rerouting power during peak demand or by shifting non-urgent computational tasks to times when renewable energy is more plentiful, effectively giving back some of the energy AI consumes.
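As a concrete illustration of that load-shifting idea, here is a minimal Python sketch that picks the greenest hours of the day for non-urgent batch jobs. The hourly forecast numbers are invented for the example; in practice they would come from a grid operator or a carbon-intensity API.

```python
# A minimal sketch of shifting non-urgent computational tasks to the hours when
# renewable generation is forecast to be highest. The forecast values are
# hypothetical; a real scheduler would pull them from a grid or weather API.

renewable_share_forecast = {   # hour of day -> forecast share of renewable power
    0: 0.22, 3: 0.25, 6: 0.31, 9: 0.48,
    12: 0.63, 15: 0.58, 18: 0.37, 21: 0.28,
}

def pick_greenest_hours(forecast, hours_needed=3):
    """Return the hours with the highest forecast renewable share."""
    ranked = sorted(forecast, key=forecast.get, reverse=True)
    return sorted(ranked[:hours_needed])

batch_hours = pick_greenest_hours(renewable_share_forecast)
print(f"Schedule non-urgent batch jobs at hours: {batch_hours}")
# -> [9, 12, 15], when solar output is forecast to peak
```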
Many companies (including some of the heavy hitters) are also taking steps to mitigate the environmental impact of AI technology. Google has been using its DeepMind AI to improve data centre efficiency, cutting the energy used for cooling by up to 40% and showcasing how AI can lower energy consumption at scale. Meanwhile, independent companies are doing their part: Xylem in the UK, for example, is employing AI-powered solutions to optimise water infrastructure, reducing water loss and improving distribution efficiency.
Research institutions are doing their part too. MIT’s Lincoln Laboratory has developed techniques like power-capping and early stopping during training, which can cut energy use by up to 80%; the early-stopping idea is sketched below. They have also created a tool to match AI models with the most suitable hardware, reducing energy consumption during inference by 10–20%. Additionally, research at Harvard is exploring methods like “mixture of experts,” which activates only the necessary parts of large models, further reducing the resources needed for training.
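For readers curious what early stopping looks like in practice, the sketch below shows the general idea in Python. It is a generic illustration of the technique rather than Lincoln Laboratory’s own tooling, and the `train_one_epoch` and `evaluate` callables are placeholders for a real training loop.

```python
# A minimal sketch of the early-stopping idea: stop training once validation
# loss stops improving, instead of spending energy on epochs that add little.
# This is a generic illustration, not Lincoln Laboratory's actual tooling.
# (Power-capping is separate and done at the hardware level, e.g. lowering a
# GPU's power limit with `nvidia-smi -pl <watts>`.)

def train_with_early_stopping(train_one_epoch, evaluate, max_epochs=100, patience=3):
    best_loss = float("inf")
    epochs_without_improvement = 0
    for epoch in range(max_epochs):
        train_one_epoch()          # run one epoch of training
        val_loss = evaluate()      # measure validation loss
        if val_loss < best_loss:
            best_loss = val_loss
            epochs_without_improvement = 0
        else:
            epochs_without_improvement += 1
        if epochs_without_improvement >= patience:
            print(f"Stopping at epoch {epoch}: no improvement in {patience} epochs")
            break
    return best_loss
```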
The bad news is that whilst these efforts are commendable and a great start, they are unlikely to be enough on their own to offset the mounting issues; a more fundamental change is required to make a true difference.
Thankfully, there is an option that has the power to drastically reduce AI’s environmental footprint: the adoption of decentralised AI platforms. Traditional AI relies on the massive, centralised data centres described above to process and store vast amounts of data; decentralised AI, by contrast, distributes these tasks across a network of smaller, geographically dispersed nodes. This model not only reduces the need for large, power-hungry facilities but also makes better use of existing resources.
NetMind.AI is one of the pioneers in this emerging space, offering a decentralised AI platform that leverages idle GPU resources from around the world. By tapping into underutilised computing power, NetMind.AI’s platform reduces the need for new data centres and, consequently, the energy and water required to cool them. Additionally, because decentralised AI can operate closer to the data source, it cuts down on the energy costs associated with data transfer and storage. As an added bonus, those supplying the computational power that companies use to train AI can earn NMT tokens, which can be exchanged for fiat money, creating a lucrative earning opportunity.
As decentralised platforms become more widespread, they could also be integrated with renewable energy sources. For example, by distributing AI workloads to GPUs located in areas with abundant solar or wind power, they can significantly reduce the carbon footprint of AI operations in general. This flexible, location-independent model optimises energy use and creates a sustainable path forward as AI continues to grow in both usage and demand.
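To make that idea concrete, here is a minimal sketch of how a decentralised scheduler might place a job on the node with the cleanest regional power. The node list, carbon-intensity figures and GPU counts are invented for illustration and do not describe NetMind.AI’s actual platform.

```python
# A minimal sketch of routing a job to the node whose regional grid is cleanest.
# The regions, carbon intensities (gCO2 per kWh) and idle-GPU counts below are
# hypothetical placeholders, not real NetMind.AI data.

nodes = [
    {"region": "Norway",  "carbon_intensity": 30,  "idle_gpus": 12},
    {"region": "Texas",   "carbon_intensity": 410, "idle_gpus": 48},
    {"region": "Iceland", "carbon_intensity": 28,  "idle_gpus": 6},
]

def choose_node(nodes, gpus_needed):
    """Pick the lowest-carbon node that still has enough idle GPUs."""
    candidates = [n for n in nodes if n["idle_gpus"] >= gpus_needed]
    return min(candidates, key=lambda n: n["carbon_intensity"]) if candidates else None

target = choose_node(nodes, gpus_needed=8)
print(f"Dispatching job to: {target['region']}")   # -> Norway
```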