Nvidia CEO says his AI chips are improving faster than Moore's Law


Nvidia CEO Jensen Huang said the performance of his company's AI chips is advancing faster than the historical rates set by Moore's Law, the benchmark that drove computing advances for decades.

“Our systems are evolving faster than Moore's Law,” Huang said in an interview with TechCrunch on Tuesday morning, after delivering a keynote to a crowd of 10,000 people at CES in Las Vegas.

Coined in 1965 by Intel co-founder Gordon Moore, Moore's Law predicted that the number of transistors on computer chips would roughly double every year, essentially doubling the performance of those chips. The prediction largely held true, driving rapid gains in capability and falling costs over several decades.

Moore's Law has slowed in recent years, but Huang says Nvidia's AI chips are moving at an accelerated pace of their own: the company says its latest data center superchip can run AI inference workloads more than 30 times faster than its predecessor.

“We can build architecture, chip, system, libraries and algorithms simultaneously,” Huang said. “If you do that, you can go faster than Moore's Law, because you can innovate the whole stack.”

Huang's bold claim comes at a moment when many are questioning whether AI's progress has stalled. Leading AI labs such as Google, OpenAI, and Anthropic train and run their AI models on Nvidia's AI chips, so improvements in those chips will likely translate into further gains in AI model capabilities.

This isn't the first time Huang has suggested that Nvidia is outpacing Moore's Law. On a podcast in November, Huang suggested the AI world is on pace for a “hyper Moore's Law.”

Huang rejects the notion that AI progress is slowing. He claims there are now three active AI scaling laws: pre-training, in which AI models learn patterns from large amounts of data; post-training, which fine-tunes an AI model's answers using methods such as human feedback; and test-time compute, which occurs during the inference phase and gives an AI model more time to “think” after each question.

“Moore's Law was so important in the history of computing because it drove down computing costs,” Huang told TechCrunch. “The same thing is going to happen with inference: as we drive up the performance, the cost of inference is going to be less.”

(Of course, Huang has an interest in saying this: Nvidia has become the most valuable company in the world by riding the AI boom.)

Nvidia CEO Jensen Huang holding the GB200 NVL72 like a shield (Image: Nvidia)

Nvidia's H100 has been the chip of choice for tech companies looking to train AI models, but as those companies shift their focus toward inference, some question whether Nvidia's pricey chips will stay on top.

AI models that use test-time compute are expensive to run today, and there is concern that OpenAI's o3 model, which uses a scaled-up version of test-time compute, will be too expensive for most people to use. For example, OpenAI spent nearly $20 per task running o3 to achieve human-level scores on a general intelligence test, whereas a ChatGPT Plus subscription costs $20 for an entire month of use.

Huang held Nvidia's latest data center superchip, the GB200 NVL72, on stage like a shield during Monday's keynote speech. This chip is 30 to 40 times faster at performing AI inference tasks than Nvidia's previous best-selling chip, the H100. This performance jump means AI reasoning models like OpenAI's o3, which use a lot of computation during the inference stage, will become cheaper over time, Huang said.

Huang said the larger focus is on creating more performant chips, and that more performant chips lead to lower prices in the long run.

“The direct and immediate solution for test-time compute, both in performance and cost affordability, is to increase our computing capability,” Huang told TechCrunch. In the long term, he noted, AI reasoning models could be used to create better data for the pre-training and post-training of AI models.

We've certainly seen the price of AI models plummet in the last year, driven in part by computing breakthroughs from hardware companies like Nvidia. Huang said he expects that trend to continue with AI reasoning models, even though the first versions we've seen from OpenAI have been rather expensive.

More broadly, Huang said his AI chips today are 1,000 times better than the ones Nvidia made 10 years ago. That's a much faster pace than the benchmark set by Moore's Law, and Huang says he sees no sign of it stopping anytime soon.
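For rough context on that figure (a back-of-the-envelope comparison of ours, not one Nvidia provided): Moore's original yearly-doubling prediction compounds to about 1,000x per decade, while the two-year doubling cadence the industry later settled on yields only about 32x over the same period.

\[ 2^{10} = 1024 \approx 1{,}000\times \quad \text{(doubling every year for 10 years)} \]
\[ 2^{10/2} = 2^{5} = 32\times \quad \text{(doubling every two years for 10 years)} \]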


