When OpenAI CEO Sam Altman took the stage at The Indian Express event this week, he knew the questions were coming. As artificial intelligence scales to billions of users worldwide, the environmental cost of all that computation has become impossible to ignore. Altman came prepared with a defense that few in the audience expected: a philosophical comparison between training an AI model and raising a human being.

“It also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart.” — Sam Altman, CEO of OpenAI

Dismissing the “17 Gallons Per Query” Myth

Altman didn’t mince words when addressing one of the most persistent claims about AI’s environmental impact. “You see these things on the internet where, ‘Don’t use ChatGPT, it’s 17 gallons of water for each query’ or whatever,” he said. “This is completely untrue, totally insane, no connection to reality.”

The OpenAI chief explained that such concerns stem from outdated practices. “It was a real issue when we used to do evaporative cooling in data centers,” Altman acknowledged. “Now that we don’t do that,” the water usage claims no longer hold water.

The comments come as tech companies face increasing scrutiny over their environmental footprints. Unlike many industries, there is no legal requirement for AI companies to disclose their energy and water consumption, leaving researchers to estimate the impact independently.

The Energy Equation

While dismissing water concerns as “totally fake,” Altman was more measured when discussing energy consumption. “It’s fair to be concerned about the energy consumption—not per query, but in total, because the world is now using so much AI,” he admitted.

The scale challenge is undeniable. Data centers powering AI systems have been linked to rising electricity prices in some regions, and the computational demands of training and running large language models continue to grow.
Altman’s proposed solution: “We need to move towards nuclear or wind and solar very quickly.”

The debate over comparisons took an unexpected turn when an interviewer cited a previous conversation with Bill Gates, asking whether a single ChatGPT query uses the equivalent of 1.5 iPhone battery charges. Altman’s response was direct: “There’s no way it’s anything close to that much.”

“If you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human? And probably, AI has already caught up on an energy efficiency basis, measured that way.” — Sam Altman

Beyond the Kilowatt-Hour

Altman’s most provocative argument extended beyond simple energy metrics to a broader philosophical question about the nature of intelligence and its costs. He complained that many discussions focus narrowly on “how much energy it takes to train an AI model, relative to how much it costs a human to do one inference query.”

But in Altman’s view, that comparison misses the bigger picture. “Not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you,” he argued.

The analogy—comparing the energy required to train a foundation model against the cumulative energy of human evolution and education—may strike some as convenient rhetorical maneuvering. But it underscores a genuine tension in how society measures the costs and benefits of artificial intelligence.

The Unanswered Questions

Altman’s comments come at a pivotal moment for the AI industry. As models grow larger and more capable, their computational requirements continue to expand. The environmental impact of AI isn’t just a public relations challenge—it’s becoming a genuine constraint on where and how data centers can be built. Some tech companies are already making massive bets on clean energy.
Microsoft has signed agreements to power its data centers with renewable energy, while Google has committed to operating on 24/7 carbon-free energy by 2030. OpenAI itself has invested in nuclear fusion startup Helion Energy, though commercial fusion remains years away.

The debate over AI’s environmental footprint isn’t going away. As billions more users come online and AI becomes embedded in everything from search engines to creative tools, the energy demands will only grow. Whether Altman’s philosophical defense resonates with policymakers and the public—or whether it will be seen as special pleading from an industry insider—remains to be seen.

This article was reported by the ArtificialDaily editorial team. For more information, visit TechCrunch.