One of the lesser-discussed impacts of the AI push is the sheer amount of energy required to power the massive computing infrastructure behind these expansive systems.
According to reports, the training process for OpenAI’s GPT-4, which was powered by around 25,000 NVIDIA A100 GPUs, required up to 62,000 megawatt-hours. That’s equivalent to the energy needs of 1,000 U.S. households for over five years.
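As a rough sanity check on that comparison, here’s a minimal Python sketch. The household consumption figure is an assumption not given in the article (roughly 10,500 kWh per year, in line with typical U.S. averages):

```python
# Back-of-the-envelope check of the household-equivalence claim.
GPT4_TRAINING_MWH = 62_000        # reported GPT-4 training energy (from the article)
HOUSEHOLD_KWH_PER_YEAR = 10_500   # assumed average U.S. household usage (not from the article)
HOUSEHOLDS = 1_000

household_mwh_per_year = HOUSEHOLD_KWH_PER_YEAR / 1_000  # convert kWh to MWh
years_equivalent = GPT4_TRAINING_MWH / (HOUSEHOLDS * household_mwh_per_year)

print(f"~{years_equivalent:.1f} years of electricity for {HOUSEHOLDS:,} households")
# -> roughly 5.9 years, consistent with the "over five years" figure
```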
And that’s just one project. Meta’s new AI supercluster will include 350,000 NVIDIA H100 GPUs, while X and Google, among various others, are also building out massive hardware projects to power their own models.
It’s an enormous resource burden, one that will require significant investment to meet.
And it will also have an environmental impact.
To provide some perspective on this, the team at Visual Capitalist has put together an overview of Microsoft’s rising electricity needs as it continues to work with OpenAI on its projects.