The AI Appetite and Environment

You know how every time something new comes along, the older generation groans about its existence? Every innovation is branded as a degradation of what came before, even when that is not true. Can the same logic be applied to artificial intelligence and its impact on the planet? We are hurtling towards an era where the planet is the price of convenience, and everybody seems to be a silent spectator because it doesn’t affect them… yet.
The race to build more intelligent systems and larger AI models has come at an ecological cost that is often underestimated. While the world marvels at the brilliance of machine intelligence, few ask: What is the ecological cost of keeping AI alive?
The Hidden Thirst of AI
The allure of AI models is unquestionable. Artificial intelligence is driving innovation and accelerating research as never before. However, behind this convenience lies an environmental cost that we tend to ignore.
It takes enormous power to train an AI model. Each training run requires billions of computational cycles, which consume large amounts of electricity. It is estimated that training a large-scale model such as OpenAI’s GPT-3 required approximately 1,300 megawatt-hours (MWh) of energy. In comparison, the annual per capita electricity consumption in India is about 1,395 kWh, so a single training run used roughly as much electricity as 900-odd people consume in a year. And this is the baseline for an industry that keeps pushing its systems to be quicker, smarter, and bigger.
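The comparison above can be checked with simple back-of-the-envelope arithmetic; the figures are the article’s estimates, not measurements:

```python
# Rough sanity check of the energy comparison, using the article's estimates.
TRAINING_ENERGY_MWH = 1_300      # estimated energy to train GPT-3
INDIA_PER_CAPITA_KWH = 1_395     # approx. annual per-capita electricity use in India

training_kwh = TRAINING_ENERGY_MWH * 1_000          # convert MWh to kWh
people_equivalent = training_kwh / INDIA_PER_CAPITA_KWH
print(f"One training run ≈ the annual electricity use of {people_equivalent:,.0f} people")
# → "One training run ≈ the annual electricity use of 932 people"
```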
The problem is not limited to electricity, though. A less noticeable yet equally urgent concern is the fresh water AI models consume. Every interaction with a model such as GPT-3, whether a text generated, an image produced, or a response delivered, indirectly consumes water. Research estimates that such models use approximately half a litre of water for every 10 to 50 responses. Multiplied across billions of users around the globe, this adds up to an astronomical water footprint.
The Cooling Method Problem
The high-density computing hardware in AI data centres produces significant heat which, if left unmanaged, can damage the hardware, so the centres must be cooled. The challenge lies in the cooling method used. The vast majority of large data centres use evaporative cooling systems that keep temperatures down with the help of fresh water. These systems work by letting water evaporate, carrying heat away from the servers and preventing overheating. While this method is efficient, the evaporated water does not return to local ecosystems. Diverting freshwater to digital operations in drought-prone areas can exacerbate water scarcity for communities that already struggle to meet their basic needs.
A 2023 study found that training the GPT-3 language model alone may have consumed approximately 700,000 litres of clean freshwater. To put this in perspective, if an adult consumes 3 litres of water daily, that amount would equal the daily water intake of about 233,333 people. This parallel serves as a stark reminder that the virtual conveniences we rely on so heavily are supported by very real physical resources.
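The water comparison follows from the same kind of arithmetic; again, the inputs are the article’s estimates and an assumed 3-litre daily intake:

```python
# Sanity check of the water comparison, using the article's estimates.
TRAINING_WATER_LITRES = 700_000   # estimated freshwater used to train GPT-3
DAILY_INTAKE_LITRES = 3           # assumed adult daily water intake

people_equivalent = TRAINING_WATER_LITRES / DAILY_INTAKE_LITRES
print(f"Equivalent to one day's drinking water for {people_equivalent:,.0f} people")
# → "Equivalent to one day's drinking water for 233,333 people"
```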
Digital Expansion and Material Exploitation
According to Linda Oniwe, Vice President for Technology, Media and Telecommunication at Standard Bank, strong demand is increasingly favouring Africa as a destination for data centres. She notes that growing interest from major global cloud service providers such as AWS, Google Cloud and Microsoft, as well as from content delivery network providers, has continued to drive demand for data centre infrastructure over the last few years. This reflects the continent’s growing importance in the global digital economy.
What is often forgotten in these conversations about the continent’s future is that several African countries face water shortages, unreliable electricity, and vulnerable environmental conditions. Setting up energy-intensive and water-dependent data centres in these areas could worsen these issues, especially if sustainable practices are not prioritised.
Beyond the data centres, the physical side of AI presents an even more troubling picture. The production of advanced semiconductors and lithium-ion batteries relies heavily on cobalt, a mineral mined mostly in the Democratic Republic of Congo. The DRC continues to bear the pressure of global technological demand as many miners, including children as young as six, work in dangerous conditions to extract these materials. It is estimated that over 40,000 children are involved in cobalt mining, working shifts of up to 12 hours and risking their lives daily to support a digital revolution they cannot access. Each advanced AI-powered device or application is linked to a harsh reality of exploitative labour and human cost.
The Future but at What Cost?
There is no denying that AI represents progress. It simplifies lives, transforms industries, and holds great promise for human advancement. But now we must ask not how far AI can go, but how much it takes to get there. The world praises AI for its intelligence, yet pays little attention to its demands. The same systems that promise to improve sustainability and efficiency are quietly straining the very resources they claim to protect.
In recent years, several tech giants have started investing in underground bunkers. When I first noticed this trend, I wondered why the visionaries of progress were preparing for disaster. Now it makes sense. These individuals often talk about building for the future, but what kind of future requires a bunker? They claim to be advancing humanity, yet their actions reveal a deeper truth: they are actively damaging the present while preparing for doomsday.
And perhaps that is the cruel irony of artificial intelligence. Those who created it to outsmart humanity are now outsmarting themselves. They are building walls underground to escape the consequences above.
