Elon Musk says the next-generation Grok 3 model will require 100,000 Nvidia H100 GPUs to train

Nvidia GH200 SC23 Announcement

(Image credit: Nvidia)

Elon Musk, CEO of Tesla and founder of xAI, made some bold predictions about the development of artificial general intelligence (AGI) and discussed the challenges facing the AI industry. He predicts that AGI could surpass human intelligence as soon as next year or by 2026, but cautions that training it will require an enormous number of processors, which in turn will demand huge amounts of electricity, reports Reuters.

Musk’s venture, xAI, is currently training the second version of its Grok large language model and expects to complete the next training phase by May. Training Grok’s version 2 model required as many as 20,000 Nvidia H100 GPUs, and Musk anticipates that future iterations will demand even greater resources, with the Grok 3 model needing around 100,000 Nvidia H100 chips to train.

The advancement of AI technology, according to Musk, is currently hampered by two main factors: a supply shortage of advanced processors — like Nvidia’s H100, since it’s not easy to get 100,000 of them quickly — and the availability of electricity.

Nvidia’s H100 GPU consumes around 700W when fully utilized, and thus 100,000 GPUs for AI and HPC workloads could consume a whopping 70 megawatts of power. Since these GPUs need servers and cooling to operate, it’s safe to say that a datacenter with 100,000 Nvidia H100 processors will consume around 100 megawatts of power. That’s comparable to the power consumption of a small city.
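As a rough back-of-the-envelope sketch of where those figures come from: the 70 MW number is just 100,000 GPUs times 700 W, and the ~100 MW datacenter estimate follows if you assume roughly a 1.4× facility overhead for host servers, networking, and cooling (that overhead factor is an illustrative assumption, not a figure from the article).

```python
# Back-of-the-envelope power estimate for a 100,000-GPU H100 cluster.
# Assumptions: 700 W per H100 at full utilization (per the article) and an
# assumed ~1.4x facility overhead for servers, networking, and cooling.

GPU_COUNT = 100_000
WATTS_PER_H100 = 700        # peak board power when fully utilized
FACILITY_OVERHEAD = 1.4     # assumed multiplier for servers + cooling

gpu_power_mw = GPU_COUNT * WATTS_PER_H100 / 1e6
total_power_mw = gpu_power_mw * FACILITY_OVERHEAD

print(f"GPUs alone:    {gpu_power_mw:.0f} MW")   # -> 70 MW
print(f"With overhead: {total_power_mw:.0f} MW") # -> 98 MW, roughly 100 MW
```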

Musk stressed that while the compute GPU supply has been a significant obstacle so far, the supply of electricity will become increasingly critical in the next year or two. This dual constraint underscores the challenges of scaling AI technologies to meet growing computational demands.

Despite the challenges, advancements in compute and memory architectures will enable the training of increasingly massive large language models (LLMs) in the coming years. Nvidia revealed its Blackwell B200 at GTC 2024, a GPU architecture and platform designed to scale to LLMs with trillions of parameters, which will play a critical role in the development of AGI.

In fact, Musk believes that an artificial intelligence smarter than the smartest human will emerge in the next year or two. “If you define AGI as smarter than the smartest human, I think it is probably next year, within two years,” Musk said in an interview on X Spaces. That means it’s apparently time to go watch Terminator again, and hope that our future AGI overlords will be nicer than Skynet. ☺


Anton Shilov is a Freelance News Writer at Tom’s Hardware US. Over the past couple of decades, he has covered everything from CPUs and GPUs to supercomputers, and from modern process technologies and the latest fab tools to high-tech industry trends.


