- Altman rejects claims that ChatGPT uses gallons of water per query, but flags total AI energy demand.
- IMF reports 2023 data center power use matched that of Germany or France, shortly after ChatGPT's launch.
- Human versus AI energy comparison sparks criticism from Zoho’s Sridhar Vembu.
OpenAI Chief Executive Officer Sam Altman has rejected claims that artificial intelligence systems consume excessive amounts of water per user query, calling them inaccurate while acknowledging that the broader energy footprint of AI infrastructure remains a legitimate issue.
Speaking on the sidelines of the India AI Impact summit in an interview with The Indian Express, Altman addressed criticism surrounding the environmental impact of large-scale AI systems. He said online claims that ChatGPT uses gallons of water for each prompt are “completely untrue” and have “no connection to reality.”
Concerns over AI-related water consumption stem largely from how data centers are cooled. Traditional facilities often rely on water to regulate temperatures and prevent overheating of electrical components. However, Altman noted that some newer data centers no longer depend on water-based cooling systems.
Despite technological improvements, projections indicate continued pressure on water resources. A report released last month by water technology company Xylem and Global Water Intelligence estimated that water drawn for cooling could more than triple over the next 25 years as global computing demand increases.
Altman dismissed the per-query water narrative but said energy consumption should be evaluated at a system-wide level. “Not per query, but in total, because the world is using so much AI,” he said, adding that faster deployment of nuclear, wind, and solar power would be necessary to meet rising demand.
Energy Comparisons Spark Pushback
During the discussion, Altman responded to earlier remarks by Bill Gates, who has argued that the efficiency of the human brain suggests that AI systems can become more energy-efficient over time. Altman countered that comparisons often overlook the energy required to “train” a human over decades through food consumption and daily living.
He said a more appropriate comparison would examine how much energy an AI model uses during inference, the stage when a trained model generates responses, versus the energy required for a human to answer a question.
Altman’s comments drew criticism from Sridhar Vembu, co-founder and chief scientist of Zoho Corporation. In a post on X, Vembu said he does not want to see technology equated with human beings.
Data Center Expansion Under Scrutiny
The debate comes as governments and companies accelerate investment in AI infrastructure. A May report from the International Monetary Fund found that in 2023, shortly after the launch of ChatGPT, global data center electricity consumption reached levels comparable to the annual usage of Germany or France.
Meanwhile, local resistance to new projects has emerged. Last week, the San Marcos City Council in Texas voted against a proposed $1.5 billion data center following months of public opposition tied to grid strain and electricity costs.
Disclaimer: The information presented in this article is for informational and educational purposes only. The article does not constitute financial advice or advice of any kind. Coin Edition is not responsible for any losses incurred as a result of the utilization of content, products, or services mentioned. Readers are advised to exercise caution before taking any action related to the company.