
When people learn I'm a journalist covering AI, they often ask about the high power consumption of AI data centers. Do these centers consume all our drinking water? How does this technology affect the environment? Will artificial intelligence kill us all? The questions range from the curious to the downright grim.
Sam Altman, CEO of OpenAI, recently faced criticism after calling some of these concerns, especially those related to water, "completely false." It all stems from a question-and-answer session hosted by The Indian Express newspaper. About 26 minutes into the interview, Altman was asked to respond to some of the criticisms of AI, including the amount of natural resources needed to run large language models like ChatGPT.
"(Criticism of AI's overuse of) water is completely false," Altman responded, saying that while excessive water use "was once true," OpenAI no longer uses evaporative cooling. Estimates that 17 gallons of water are used per chatbot query are no longer accurate, he said.
He added: "This is completely untrue and completely crazy, and has nothing to do with reality." He then went on to address AI power consumption, calling the concerns "fair" but arguing that they should be evaluated in aggregate rather than per query, since some queries, like videos, are more computationally demanding than text conversations. (Disclosure: Ziff Davis, the parent company of CNET, in 2025 filed a lawsuit against OpenAI, alleging that it infringed Ziff Davis's copyrights in training and operating its AI systems.)
However, Altman said, "We need to move toward nuclear or wind and solar very quickly."
The questions regarding data centers and water are complex.
Altman’s comments come amid ongoing, timely discussions about data centers and their energy uses.
CNET's Corinne Cesarek delved into the issue of AI energy use last year and found the cost of training and running ChatGPT, Gemini, Claude and other generative AI tools to be "staggering." The United States accounted for the largest share (45%) of global data center electricity consumption in 2024, according to the International Energy Agency.
As for water: Google's two data centers in Council Bluffs, Iowa, alone used 1.4 billion gallons of water in 2024, enough to fill about 28 million standard bathtubs. Google has 29 data centers around the world. Meta's data centers consumed approximately 1.39 billion gallons of water in 2023.
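The bathtub comparison is easy to sanity-check. A minimal sketch in Python, assuming a standard bathtub holds roughly 50 gallons (a figure not stated in the reporting):

```python
# Back-of-the-envelope check of the bathtub comparison.
GALLONS_USED = 1_400_000_000   # Google's Council Bluffs data centers, 2024
GALLONS_PER_BATHTUB = 50       # assumed typical bathtub capacity (not from the article)

bathtubs = GALLONS_USED / GALLONS_PER_BATHTUB
print(f"{bathtubs:,.0f} bathtubs")  # → 28,000,000 bathtubs
```

At ~50 gallons per tub, the arithmetic lines up with the "about 28 million bathtubs" figure.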
Although we don’t currently have statistics from OpenAI, Meta, or Google on their consumption of natural resources in 2025, it’s safe to bet that energy and water use in data centers will rise as more people use generative AI.
Considering that ChatGPT now has close to 1 billion weekly users, and OpenAI estimates it handles approximately 2.5 billion prompts every day, that is a huge amount of data to manage. Because of this demand, the powerful computers that train AI models and process their prompts run extremely hot. Think about how your phone and laptop heat up when running demanding tasks. If servers overheat, they can slow down or fail. This is where the water comes in.
Traditionally, water is used in AI data centers in two ways: evaporative cooling (water consumption) and closed-loop systems (water recycling).
Evaporative cooling is a technique that uses the natural process of evaporation to convert liquid water into water vapor, which absorbs heat in the process. Closed-loop cooling is more resource-efficient: water is recirculated to dissipate heat without being evaporated or consumed.
OpenAI said in a January announcement that it "prioritizes closed-loop or low-water cooling systems" to reduce water use. This lends credence to Altman's recent claims that OpenAI's water use isn't as high as the 17 gallons per query estimate, but we don't yet have exact numbers for OpenAI's water use for 2025.
OpenAI says it is moving away from the more water-intensive evaporative cooling systems. However, 56% of data centers still use this method in some form alongside closed-loop systems, according to a January 2026 report from the international water technology company Wood and the market research firm Global Water Intelligence. The research predicts that AI water consumption will rise by approximately 130% by 2050.
Running artificial intelligence in massive data centers is power-hungry.
AI chatbots use more power than traditional search engines like Google or Bing. By one estimate, a single chatbot query requires 10 times more electricity than a Google search. On average, a single text query takes about 0.24 to 3 watt-hours, but AI-generated videos and images require more electricity.
Google's August 2025 report details Gemini's energy use. "The average Gemini Apps text message uses 0.24 watt-hours of energy, emits 0.03 grams of carbon dioxide equivalent (gCO2e) and consumes 0.26 milliliters (or about five drops) of water," the report notes. Google equates this energy consumption to watching TV for less than nine seconds.
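The nine-second comparison can be checked with simple unit conversion: 0.24 watt-hours spread over an appliance's power draw gives seconds of runtime. A quick sketch, where the ~100 W appliance figure is my assumption, not a number from Google's report:

```python
# Convert a per-prompt energy figure into seconds of appliance runtime.
def seconds_of_runtime(energy_wh: float, appliance_watts: float) -> float:
    """How many seconds the given energy (Wh) would run an appliance."""
    return energy_wh * 3600 / appliance_watts  # Wh → joules, then ÷ watts

# Assumed draw of ~100 W (roughly a television):
print(seconds_of_runtime(0.24, 100))  # → ~8.64 seconds
```

At roughly 100 watts, 0.24 Wh works out to just under nine seconds, consistent with the comparison in the report.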
Although AI models require 24/7 power, solar power is a viable and scalable option for powering AI data centers.
OpenAI announced a multibillion-dollar project in October 2025 to explore new energy generation using solar power and battery storage. Meta, Microsoft, Google and Amazon all expanded their use of solar energy across the United States in 2025.
While renewable solutions could be the way forward, solar (and wind) is still just one part of the power generation mix used by data centers. They generally rely on the grid itself, which is still largely powered by burning fossil fuels such as natural gas.
The conversation around artificial intelligence and water use is shifting from unsubstantiated claims to thoughtful scrutiny. Communities and policymakers are now pushing for transparency and sustainable practices, with the goal of ensuring that the rapid growth of AI does not come at the expense of local water resources or the local electricity grid. As artificial intelligence continues to grow, the discussion about how best to balance technological innovation and environmental responsibility will grow as well.