Sam Altman Challenges AI Resource Narrative: Human Training Costs vs. “Fake” Water Concerns
The debate over artificial intelligence and the environment has reached a new flashpoint. In a recent interview, Sam Altman, the CEO of OpenAI, strongly rejected the notion that AI systems consume massive amounts of water and energy.
His statements have reignited debate about the real cost of AI and how society should measure it.
In an interview with The Indian Express at an AI summit in India, Altman pushed back against viral reports that each chatbot query consumes massive amounts of water. He dismissed estimates such as “17 gallons per query” as unrealistic and out of step with how modern data centres actually operate.
He said that water consumption was a problem in older data centres that relied on evaporative cooling, whereas newer facilities use alternative cooling designs that sharply reduce water use.
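To put the disputed figures in perspective, a rough calculation is possible. The numbers below are illustrative assumptions, not measured values: a water usage effectiveness (WUE) of about 1.8 litres evaporated per kWh of IT energy for an evaporatively cooled facility, and a per-query energy cost on the order of 0.3 Wh.

```python
# Back-of-envelope estimate of cooling water per chatbot query.
# Both inputs are ASSUMED ballpark figures for illustration only:
WUE_L_PER_KWH = 1.8        # assumed: litres evaporated per kWh (evaporative cooling)
QUERY_KWH = 0.3 / 1000     # assumed: ~0.3 Wh of energy per query

litres_per_query = WUE_L_PER_KWH * QUERY_KWH
gallons_per_query = litres_per_query / 3.785  # litres -> US gallons

print(f"~{litres_per_query * 1000:.2f} mL of water per query")
print(f"~{gallons_per_query:.6f} US gallons per query")
```

Under these assumptions the result is a fraction of a millilitre per query, orders of magnitude below the “17 gallons” figure Altman disputes; the gap illustrates why headline estimates depend so heavily on which cooling design and per-query energy figure one assumes.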
His statements come at a time when the focus on data centres continues to increase. Data centres are the backbone of cloud computing and AI systems, and they require massive infrastructure in terms of servers, storage, networking equipment, and cooling systems that prevent overheating.
Cooling is one of the biggest operational challenges because the chips used to train and execute AI models produce massive amounts of heat.
Sam Altman on Water, Watts, and the Human Cost of Knowledge
Some firms have started to redesign their infrastructure to minimise environmental impact. In 2024, Microsoft unveiled a data centre design that consumes no water for cooling, relying instead on closed-loop systems and sophisticated airflow design. Such designs indicate that the industry is trying to accommodate growth within resource constraints.
However, utilities and water companies predict that demand for water could still surge as AI develops: experts forecast rising water withdrawals for data centre cooling over the next few decades as computing capacity expands rapidly.
Sam Altman was more forthcoming about energy concerns than water issues. He stated that energy consumption is a valid concern and that the answer to this problem is to increase the development of clean energy sources like nuclear, wind, and solar power. AI firms, according to Altman, will require plenty of low-carbon electricity if the technology is to continue growing.
The interview also revisited an analogy Bill Gates once drew: that a single question asked of a chatbot can use energy comparable to charging a smartphone several times. Altman dismissed the comparison, saying that the actual energy used per question is much lower once the model has already been trained.
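The smartphone analogy can be checked with simple arithmetic. The figures below are assumptions for illustration: a per-query energy cost around 0.3 Wh (a commonly cited ballpark for a chatbot reply) and a smartphone battery of roughly 15 Wh (about a 4,000 mAh cell at 3.85 V).

```python
# Sanity check of the "charging a smartphone" analogy.
# Both figures are ASSUMED ballparks, not measurements:
QUERY_WH = 0.3          # assumed energy per chatbot query, in watt-hours
PHONE_BATTERY_WH = 15.0 # assumed smartphone battery capacity, in watt-hours

queries_per_full_charge = PHONE_BATTERY_WH / QUERY_WH
charge_fraction_per_query = QUERY_WH / PHONE_BATTERY_WH

print(f"One full phone charge ~ {queries_per_full_charge:.0f} queries")
print(f"One query ~ {charge_fraction_per_query:.1%} of a charge")
```

Under these assumptions, one query uses about 2 per cent of a single charge rather than several full charges, which is the direction of Altman's objection; the true figure varies with model size and hardware.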
The most contentious part of his argument was when he compared the training of an AI system to the development of a human.
Altman said that critics of AI systems tend to focus on the energy used to train a model while ignoring the energy spent educating and sustaining a human over several decades.
Navigating the High Power Demand of Artificial Intelligence
Food production, education, and the accumulation of knowledge over centuries all consume energy, he said. A fair comparison, in his view, would weigh the energy an AI system uses to answer a question after training against the energy a human uses to produce a similar answer.
This comparison made many people uncomfortable. Critics point out that human life cannot be boiled down to an energy calculation and that AI companies still have a responsibility for managing the environmental impact.
This backlash is also a symptom of a broader concern about automation and employment, particularly when leaders in the AI industry state that entire categories of jobs are likely to shift or disappear.
Energy data explains why this concern is so pressing. The International Energy Agency estimates that data centres currently account for 415 terawatt-hours (TWh) of electricity consumption per year, roughly 1.5 per cent of global electricity demand.
This is projected to rise to 945 TWh by 2030, with AI driving much of the growth. In the United States, the Department of Energy estimates that data centres could account for as much as 12 per cent of national electricity use within a few years.
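The quoted figures can be cross-checked against each other with simple arithmetic, taking the IEA numbers as reported above:

```python
# Sanity-check the reported IEA figures against one another.
DATA_CENTRE_TWH_NOW = 415    # reported: current annual data-centre electricity use
SHARE_OF_WORLD = 0.015       # reported: ~1.5 per cent of global electricity demand
DATA_CENTRE_TWH_2030 = 945   # reported: projected use by 2030

implied_world_twh = DATA_CENTRE_TWH_NOW / SHARE_OF_WORLD
growth_factor = DATA_CENTRE_TWH_2030 / DATA_CENTRE_TWH_NOW

print(f"Implied global electricity demand: ~{implied_world_twh:,.0f} TWh/yr")
print(f"Projected growth by 2030: {growth_factor:.1f}x")
```

The figures are internally consistent: 415 TWh at a 1.5 per cent share implies global electricity demand on the order of 27,700 TWh per year, and the 2030 projection represents roughly a 2.3-fold increase in under a decade.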
The debate about AI’s footprint is unlikely to subside. AI enthusiasts argue that “efficiency gains and clean energy are just the beginning,” pointing to them as evidence that the technology can scale sustainably. Critics counter that “efficiency gains may not grow fast enough to meet surging demand.” Altman’s comments raise an underlying question: how will society balance the benefits of AI against the resources it uses as the technology becomes increasingly integral to our world?