So, Sam Altman, the CEO of OpenAI, hopped over to India for a big AI summit and ended up addressing something pretty important: the environmental impact of AI. You know, stuff like energy and water usage. Apparently, there's been a lot of chatter about how much water ChatGPT guzzles, and Altman's having none of it.

He straight-up called the water-usage concerns "totally fake." I mean, he did admit it was a valid point back when data centers used evaporative cooling, but according to him, those days are long gone. He's especially not happy with the internet claims that a single ChatGPT query uses something like 17 gallons of water. "This is completely untrue, totally insane, no connection to reality," he said. Strong words, right?

However, Altman did concede that worrying about energy consumption is "fair," especially now that the world is so hooked on AI. His solution? We need to speed up our transition to nuclear, wind, and solar energy. And I kind of agree with him. It's not just about AI; we need cleaner energy sources across the board.

Now, here's where it gets interesting. The interviewer brought up a conversation with Bill Gates, asking if it's true that a single ChatGPT query uses the energy equivalent of 1.5 iPhone battery charges. Altman's response? "There's no way it's anything close to that much." So, who are we supposed to believe?
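Just to get a feel for why he's so dismissive, here's a quick back-of-envelope sketch in Python. The numbers are my own rough assumptions, not anything from the interview or from OpenAI: I'm guessing about 13 Wh for a recent iPhone battery, and using the ~0.3 Wh-per-query figure that has been floating around publicly.

```python
# Back-of-envelope comparison. All numbers are rough assumptions, not official figures.

IPHONE_BATTERY_WH = 13.0      # assumed capacity of a recent iPhone battery, in watt-hours
CLAIMED_CHARGES = 1.5         # the "1.5 iPhone charges per query" figure from the interview
ASSUMED_WH_PER_QUERY = 0.3    # a commonly floated estimate for one ChatGPT query

claimed_wh = CLAIMED_CHARGES * IPHONE_BATTERY_WH      # energy implied by the 1.5-charges claim
ratio = claimed_wh / ASSUMED_WH_PER_QUERY             # how far apart the two claims are

print(f"1.5 iPhone charges ~= {claimed_wh:.1f} Wh per query")
print(f"That's roughly {ratio:.0f}x the ~{ASSUMED_WH_PER_QUERY} Wh estimate")
```

If those assumptions are even in the right ballpark, the two claims are more than an order of magnitude apart, which would explain why Altman says it's "nothing close to that much."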

He also thinks it's "unfair" when people stack the energy it takes to train an AI model against how much energy a human uses to answer a single question. That's pitting a one-time, up-front cost against a per-question cost, so it's like comparing apples and oranges, you know?

Because, in his words, "It also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart." He even threw in the evolutionary cost of billions of people learning not to get eaten by predators. Okay, that's a bit of a stretch, but I get his point.

Altman argues that the fair comparison is how much energy ChatGPT uses to answer a question once it's already trained versus how much energy a human uses to answer that same question. And he believes AI might already be more energy-efficient on that measure. I think he does have a point, but it's definitely something that needs more investigation.
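For what it's worth, here's the per-answer version of that comparison as a rough sketch, again with my own assumed numbers rather than anything Altman cited: a resting human runs at something like 100 watts, and I'll assume a couple of minutes to answer a question.

```python
# Rough sketch of the "per-answer" comparison Altman prefers. Assumed numbers, not his.

HUMAN_BODY_WATTS = 100.0      # rough resting metabolic power of a person
MINUTES_PER_ANSWER = 2.0      # assume a human spends a couple of minutes on an answer
ASSUMED_WH_PER_QUERY = 0.3    # same rough ChatGPT-query estimate as above

human_wh = HUMAN_BODY_WATTS * (MINUTES_PER_ANSWER / 60.0)   # watt-hours per human answer

print(f"Human:   ~{human_wh:.2f} Wh per answer")
print(f"ChatGPT: ~{ASSUMED_WH_PER_QUERY} Wh per answer (assumed)")
```

On those assumptions the model does come out ahead per answer, but the whole exercise hinges on numbers nobody outside the labs can actually verify, which is exactly why more investigation is needed.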