Nvidia shares surged in after-hours trading on Wednesday, pushing the company near a $1 trillion market cap, after it reported a shockingly strong outlook and CEO Jensen Huang said the company would have a “huge record year.”
Sales are rising due to growing demand for graphics processing units (GPUs) made by Nvidia that power AI applications such as those from Google, Microsoft and OpenAI.
Demand for AI chips in data centers pushed Nvidia to forecast $11 billion in revenue for the current quarter, blowing away analysts’ estimates of $7.15 billion.
“The flashpoint was generative AI,” Huang said in an interview with CNBC. “We know CPU scaling has slowed down, we know accelerated computing is the way forward, and then the killer app comes along.”
Nvidia believes this marks a significant shift in how computers are built, one that could drive even more growth — data center components could become a $1 trillion market, Huang said.
Historically, the central processing unit, or CPU, was the most important part of a computer or server. Intel dominated that market, with AMD as its main rival.
With the advent of AI applications that require massive amounts of computing power, the graphics processing unit (GPU) has taken center stage, and the most advanced systems use as many as eight GPUs per CPU. Nvidia currently dominates the market for AI GPUs.
“The data center in the past, which was largely CPU for loading files, will be generative data in the future,” Huang said. “Instead of loading data, you’re going to get some data, but you have to generate most of the data with AI.”
“So instead of millions of CPUs, you’ll have a lot less CPUs, but they’ll be connected to millions of GPUs,” Huang continued.
For example, Nvidia’s own DGX systems, which are essentially AI training computers in a box, use eight of Nvidia’s top-of-the-line H100 GPUs and only two CPUs.
Google’s A3 supercomputer pairs eight H100 GPUs with a single high-end Xeon processor from Intel.
That’s one reason why Nvidia’s data center business grew 14% during the first calendar quarter, versus flat growth in AMD’s data center unit and a 39% decline in Intel’s AI and data center business unit.
Additionally, Nvidia GPUs tend to be more expensive than many CPUs. The latest generation of Intel Xeon processors can cost up to $17,000 at list price. A single Nvidia H100 can sell for $40,000 on the secondary market.
Nvidia will face increased competition as the market for AI chips heats up. AMD has a competitive GPU business, especially in gaming, and Intel has its own line of GPUs as well. Startups are building new kinds of chips specifically for AI, and mobile-focused companies like Qualcomm and Apple keep pushing the technology so that one day it might run in your pocket instead of a giant server farm. Google and Amazon are designing their own AI chips.
But Nvidia’s high-end GPUs remain the chip of choice for companies building applications like ChatGPT, which are expensive to train on terabytes of data and costly to run later in a process called “inference,” which uses the model to generate text, images, or predictions.
Analysts say Nvidia remains in the lead for AI chips because of its proprietary software, which makes it easy to tap all of the GPU hardware’s features for AI applications.
Huang said Wednesday that it won’t be easy to replicate the company’s software.
“You have to design all the software and all the libraries and all the algorithms, integrate them into and optimize the frameworks, and optimize them for the architecture — not just one chip, but the architecture of an entire data center,” Huang said on a call with analysts.