OpenAI CEO Sam Altman recently compared the energy required to power artificial intelligence with the food a human consumes over the years it takes to train one. Altman stated that it takes "20 years of life and all of the food you eat during that time before you get smart." He argued that when comparing a trained AI model answering a question to a trained human performing a similar task, AI has likely already achieved greater energy efficiency. Altman made these remarks during an interview in India in February 2026, addressing growing concerns about AI's environmental impact.[businesstoday+8]
Debating AI's True Energy Cost
Altman challenged common methods of assessing AI's energy footprint. He said it is unfair to compare the energy needed to train an AI model with the energy a human uses to answer a single question. Instead, he proposed comparing a fully trained AI model to a human who has already undergone years of development. "The fair comparison is if you ask ChatGPT a question, how much energy does it take once its model is trained to answer that question versus a human?" Altman said. He believes that AI has "probably already caught up on an energy efficiency basis, measured that way."[businesstoday+4]
He extended this analogy beyond individual humans, noting that human intelligence also relies on the "very widespread evolution of the hundred billion people that have ever lived" who learned to survive and advance knowledge. This long history contributed to the environment that produces intelligent humans. Altman's argument suggests a broader perspective is needed when discussing the energy investment in intelligence, whether artificial or biological.[businesstoday+2]
Dispelling Water Consumption Myths
Altman also directly addressed and dismissed widely circulated claims about ChatGPT's water consumption. He called estimates suggesting each ChatGPT query uses 17 gallons of water "totally fake," "completely untrue," and "insane." He explained that such figures are based on outdated data center cooling methods. Modern facilities no longer rely on water-intensive evaporative cooling, using advanced technologies to significantly reduce water usage.[businesstoday+7]
For example, current data centers use closed-loop liquid cooling and direct-to-chip cooling, and increasingly draw on non-potable and reclaimed water. These improvements mean today's data centers deliver much more computation for less water and energy than older ones.[findarticles+1]
AI's Growing Total Power Demand
While Altman disputed exaggerated per-query figures, he acknowledged that the overall energy consumption of AI systems is a valid and serious concern. He said the sheer volume of AI use worldwide is real and demands immediate action. Altman emphasized the need to "move towards nuclear or wind and solar very quickly" to meet the surging electricity demand from AI.[businesstoday+4]
He highlighted that the true challenge is the aggregate energy footprint across global AI systems, not individual query metrics.

OpenAI's future plans underscore this massive demand. The company aims to build new AI data centers that will consume electricity on a scale comparable to major American cities. Some projects are designed to use up to 10 gigawatts of power, with another 17 gigawatts already in progress. This level of consumption is similar to the summer electricity needs of New York City and San Diego combined. Andrew Chien, a computer science professor at the University of Chicago, called these plans a "seminal moment," noting that computing could account for 10% or 12% of the world's power by 2030.[mexc+4]
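The gigawatt figures above are power (instantaneous draw), not energy; converting them to annual energy makes the scale concrete. The capacity factor below is an illustrative assumption, not a figure from the article:

```python
# Convert the quoted power figures (GW) to annual energy (TWh/year).
# Real consumption depends on utilization; the capacity factor here
# is an assumption for illustration, not OpenAI data.

HOURS_PER_YEAR = 8760
CAPACITY_FACTOR = 1.0   # assume continuous full-power operation

def annual_twh(power_gw: float) -> float:
    """TWh per year for a load of `power_gw` gigawatts."""
    return power_gw * HOURS_PER_YEAR * CAPACITY_FACTOR / 1000

print(f"10 GW project:        {annual_twh(10):.1f} TWh/year")   # 87.6
print(f"27 GW (all projects): {annual_twh(27):.1f} TWh/year")   # 236.5
```

At full utilization, a single 10-gigawatt site would draw roughly 88 terawatt-hours a year, which is why Chien frames these plans in terms of whole percentage points of world power.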
Current AI Energy Use and Future Projections
OpenAI has publicly stated that an average ChatGPT query uses about 0.34 watt-hours of electricity. To put this in perspective, this amount of electricity is what a high-efficiency lightbulb uses in a couple of minutes or an oven uses in a little over one second. For comparison, a typical Google search uses about 0.03 watt-hours. Google reports that a median text prompt to its Gemini AI model consumes 0.24 watt-hours.[towardsdatascience+9]
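These equivalences can be sanity-checked with back-of-the-envelope arithmetic. The appliance wattages below (a 10 W LED bulb, a 1,200 W oven element) are illustrative assumptions, not figures from the article:

```python
# Sanity-check the per-query energy comparisons.
# Appliance wattages are assumed for illustration.

QUERY_WH = 0.34        # OpenAI's stated energy per ChatGPT query (watt-hours)
LED_BULB_W = 10.0      # assumed high-efficiency LED bulb draw (watts)
OVEN_W = 1200.0        # assumed oven heating-element draw (watts)
GOOGLE_SEARCH_WH = 0.03

def runtime_seconds(energy_wh: float, power_w: float) -> float:
    """Seconds a device drawing `power_w` watts runs on `energy_wh` watt-hours."""
    return energy_wh / power_w * 3600

print(f"LED bulb: {runtime_seconds(QUERY_WH, LED_BULB_W) / 60:.1f} min")  # ~2 minutes
print(f"Oven:     {runtime_seconds(QUERY_WH, OVEN_W):.2f} s")             # ~1 second
print(f"vs. Google search: {QUERY_WH / GOOGLE_SEARCH_WH:.0f}x")           # ~11x
```

Under these assumptions the numbers line up with the article's "couple of minutes" and "little over one second" descriptions, and a ChatGPT query comes out roughly an order of magnitude above a conventional search.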
These individual figures, though seemingly small, add up quickly due to the sheer volume of AI use. ChatGPT alone serves billions of queries daily. Experts predict that AI's energy consumption will continue to surge. Data centers, which house AI systems, already consume a significant portion of electricity. In the United States, data center energy use more than doubled between 2017 and 2023. By 2028, data centers could account for 6.7% to 12% of all U.S. electricity, a two to three-fold increase in just five years. Globally, AI-related computation could make up 3% to 5% of all electricity by 2030.[towardsdatascience+4]
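The scaling effect is easy to see with a rough calculation. The daily query volume below is an assumed round number (the article says only "billions"), so the result is an order-of-magnitude sketch rather than a reported figure:

```python
# Rough aggregate energy from the per-query figure.
# QUERIES_PER_DAY is an assumption; the article says only "billions".

QUERY_WH = 0.34            # watt-hours per ChatGPT query (OpenAI's figure)
QUERIES_PER_DAY = 2.5e9    # assumed: 2.5 billion queries per day

daily_mwh = QUERY_WH * QUERIES_PER_DAY / 1e6    # Wh -> MWh per day
annual_gwh = daily_mwh * 365 / 1000             # MWh/day -> GWh per year

print(f"Daily:  {daily_mwh:,.0f} MWh")     # 850 MWh/day
print(f"Annual: {annual_gwh:,.0f} GWh")    # ~310 GWh/year
```

Even at a third of a watt-hour per query, billions of daily queries imply hundreds of megawatt-hours per day for inference alone, before counting training runs or other workloads.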
Altman has previously stated that the "cost of AI will converge to the cost of energy" and that "a significant fraction of the power on Earth should be spent running AI compute." This highlights his view that energy availability will be the primary limiting factor for AI's growth.[techpolicy+3]
The Broader Discussion on AI and Energy
The discussion around AI's energy consumption is complex. Some experts argue that focusing solely on direct energy use misses the bigger picture. They suggest that AI can also lead to significant energy savings by optimizing various processes. For example, AI can improve efficiency in logistics, transportation, and industrial operations, potentially offsetting its own energy demands.[medium+3]
However, the rapid expansion of AI infrastructure is undeniable. The International Energy Agency estimates data centers used around 460 terawatt-hours of electricity in 2022 and could reach 620–1,050 terawatt-hours by 2026, with AI being a major driver. This growth is already straining power grids in some areas, leading to increased electricity prices.[findarticles]
The industry faces pressure to increase transparency regarding AI's energy and water use. As AI becomes more integral to daily life, a collective effort is needed to ensure its development aligns with environmental responsibility. This includes continued efficiency improvements, accurate reporting, and a rapid transition to clean energy sources.[tomshardware+1]