Sam Altman’s ‘Fake’ Defense: Inside the AI Resource Wars

At the India AI Impact Summit last week, Sam Altman faced a question that has become increasingly uncomfortable for AI leaders: what about the resources? The OpenAI CEO had spent days discussing the transformative potential of artificial intelligence. Now he was being asked to account for its costs—not in dollars, but in water and watts.

His response was characteristically direct. Water concerns? “Completely untrue, totally insane.” Energy consumption? A fair point, but one he quickly reframed. The real comparison, Altman argued, isn’t between AI systems and machines—it’s between AI and humans.

“It takes like 20 years of life, and all the food you eat before that time, before you get smart. The fair comparison is if you ask ChatGPT a question, how much energy does it take once a model is trained to answer that question, versus a human.” — Sam Altman, OpenAI CEO

The $650 Billion Question

Altman’s comments came at a pivotal moment. Just days earlier, Bridgewater Associates had released a report estimating that U.S. tech giants—Alphabet, Amazon, Meta, and Microsoft—will collectively invest approximately $650 billion in AI infrastructure during 2026 alone. The figure is staggering, even by the standards of an industry accustomed to spending at scale.

To put that number in context: it exceeds the GDP of all but roughly 20 countries, and it is more than triple the Apollo program’s inflation-adjusted cost of roughly $200 billion. The infrastructure it buys is equally striking in energy terms: in 2023, the world’s data centers already consumed as much electricity annually as Germany or France, according to International Monetary Fund estimates—a level the coming buildout will push substantially higher.

The spending isn’t abstract. It manifests as sprawling data center complexes across the American Southwest, massive chip fabrication facilities in Taiwan and Arizona, and an unprecedented demand for electricity that has utility companies scrambling to add capacity. Microsoft has committed $80 billion to AI infrastructure for 2025. Meta plans to spend up to $65 billion. The numbers keep climbing.

The Water Controversy

Altman’s dismissal of water concerns as “fake” touched a nerve. The claim that ChatGPT uses gallons of water per query may indeed be exaggerated—OpenAI has consistently denied this specific figure—but the broader issue of data center water consumption is well-documented.

Cooling requirements for AI infrastructure are immense. Traditional data centers use water to dissipate heat from electrical components, and while newer facilities increasingly employ air-cooling or closed-loop systems, the aggregate demand continues to grow. A January report from Xylem and Global Water Intelligence projected that water consumption for data center cooling will more than triple over the next 25 years as computing demand accelerates.
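To get a feel for the scale, here is a back-of-envelope sketch of evaporative-cooling water demand. The numbers are illustrative assumptions, not figures from the Xylem report: a water usage effectiveness (WUE) of about 1.8 liters per kWh is a commonly cited industry ballpark, and the 100 MW facility size is hypothetical.

```python
# Back-of-envelope estimate of data center cooling water use.
# All figures are illustrative assumptions, not measured values.

facility_power_mw = 100   # assumed IT load of a large AI data center
wue_l_per_kwh = 1.8       # assumed water usage effectiveness (L per kWh)

kwh_per_day = facility_power_mw * 1000 * 24        # MW -> kWh per day
liters_per_day = kwh_per_day * wue_l_per_kwh       # daily water draw

print(f"Energy use: {kwh_per_day:,.0f} kWh/day")
print(f"Water use:  {liters_per_day / 1e6:.1f} million liters/day")
```

Under these assumptions, a single large facility draws on the order of 4 million liters per day—small next to agriculture, but significant for a desert aquifer that also serves residents.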

Local impacts are already visible. In Arizona’s desert communities, where several major data centers have been proposed or built, residents have raised concerns about aquifer depletion. Similar tensions have emerged in the Netherlands, Singapore, and other regions where water scarcity intersects with digital infrastructure expansion.

“I do not want to see a world where we equate a piece of technology to a human being.” — Sridhar Vembu, Co-founder of Zoho Corporation

The Human Comparison

Altman’s analogy between training AI models and raising humans drew immediate criticism. Sridhar Vembu, the billionaire co-founder of Indian software company Zoho Corporation, was present at the summit and pushed back on X. The equivalence, he suggested, represents a fundamental category error.

The debate highlights a deeper tension in how we evaluate AI’s costs and benefits. Altman’s framing—comparing inference energy to human cognitive effort—has some technical merit. Running a trained model to generate a response requires far less energy than the initial training process. On a per-query basis, modern AI systems can indeed be surprisingly efficient.
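Altman’s framing can be made concrete with a toy calculation. Every number below is an assumption for illustration: roughly 0.3 Wh per query is a figure Altman himself has cited publicly, 20 W is a common estimate of human brain power draw, and 60 seconds is an arbitrary time for a person to answer a question.

```python
# Rough per-query comparison of inference energy vs. human cognition.
# All inputs are illustrative assumptions, not measurements.

query_energy_wh = 0.34   # assumed energy per ChatGPT query (Altman's figure)
brain_power_w = 20       # assumed human brain power draw in watts
thinking_time_s = 60     # assumed time for a human to answer one question

# Convert watt-seconds to watt-hours: W * s / 3600
human_energy_wh = brain_power_w * thinking_time_s / 3600

print(f"Model query:    {query_energy_wh:.2f} Wh")
print(f"Human thinking: {human_energy_wh:.2f} Wh")
```

On these toy numbers the two come out nearly identical—about a third of a watt-hour each—which is precisely why critics say the comparison hinges on what gets amortized: the human figure excludes twenty years of upbringing, and the model figure excludes training and the data centers themselves.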

But critics argue this framing obscures more than it reveals. Human beings are multi-purpose systems. A person trained for one task brings capabilities that transfer across domains. They participate in communities, raise families, create art, and maintain the social fabric that makes civilization possible. AI systems, however impressive, do none of these things.

Moreover, the scale is different in kind, not just degree. When Altman notes that “the world is using so much AI,” he’s acknowledging a transformation in how humanity processes information. Billions of queries per day, each requiring computation at data centers powered by electricity generated somewhere, add up to an energy demand that strains existing infrastructure.

The Infrastructure Crunch

The resource debate isn’t theoretical for communities living near proposed data centers. Last week, the San Marcos City Council in Texas voted down a $1.5 billion data center project after months of public opposition. Residents cited concerns about strain on the electrical grid, potential increases in utility costs, and the opportunity cost of dedicating limited water resources to cooling servers rather than serving human needs.

Similar conflicts are playing out across the United States and internationally. In Northern Virginia, the world’s largest concentration of data centers, local officials have begun pushing back against further expansion. In Ireland, planning restrictions have been imposed to limit data center construction amid concerns about grid stability.

Tech companies have responded with commitments to renewable energy and, increasingly, nuclear power. Altman himself has invested in nuclear fusion startups and has been vocal about the need for dramatically expanded clean energy production. But building power plants—whether wind, solar, or nuclear—takes years, even decades. The AI infrastructure buildout is happening now.

The Efficiency Paradox

There’s a paradox at the heart of AI resource consumption: as systems become more efficient, we use them more, often negating the gains. This pattern—known in economics as the Jevons paradox—has recurred throughout technological history. Steam engines became more efficient and were deployed more widely; LEDs reduced lighting costs, and we illuminated more spaces.
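The mechanism can be sketched with a toy demand model. Under a constant-elasticity demand curve (an assumption made here for illustration), whether an efficiency gain reduces or increases total resource use depends on how strongly usage responds to falling per-task cost.

```python
# Toy illustration of the Jevons paradox. The constant-elasticity
# demand curve and parameter values are illustrative assumptions.

def total_energy(efficiency, elasticity, base_demand=1.0):
    """Total energy when usage scales as cost**(-elasticity).

    Cost per task falls as 1/efficiency, so usage grows as
    efficiency**elasticity, while energy per task falls as 1/efficiency.
    """
    usage = base_demand * efficiency ** elasticity
    return usage / efficiency   # total energy = usage * energy per task

# Double efficiency under weakly vs. strongly responsive demand:
inelastic = total_energy(2.0, elasticity=0.5)  # usage grows slowly
elastic = total_energy(2.0, elasticity=1.5)    # usage grows fast

print(f"Inelastic demand: {inelastic:.2f}x baseline energy")  # ~0.71x
print(f"Elastic demand:   {elastic:.2f}x baseline energy")    # ~1.41x
```

With elasticity below 1, doubling efficiency cuts total energy use; above 1, the induced demand more than swallows the gain—the pattern AI usage appears to be following.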

AI appears to be following the same trajectory. More efficient models enable new applications, which drive more usage, which requires more infrastructure. The $650 billion that Bridgewater predicts for 2026 may be just the beginning. Some analysts project AI-related spending could reach $2.5 trillion annually within the decade.

Altman’s comparison to human development, whatever its merits, points to a real question: what are we building toward? If the goal is to create systems that augment human capabilities while consuming fewer resources than the equivalent human effort, the math might eventually work. But that requires sustained improvements in efficiency, a transition to clean energy, and careful attention to the externalities that don’t appear on balance sheets.

For now, the resource wars continue. Every data center proposal sparks local debate. Every drought season raises questions about water allocation. And every AI leader who dismisses these concerns as “fake” risks fueling the backlash that could slow the very transformation they’re trying to accelerate.


This article was reported by the ArtificialDaily editorial team. For more information, visit CNBC and Reuters.

By Arthur
