Nvidia shares closed up 2.3% at a record high of $504.20 on Monday. The record comes ahead of the company’s fiscal third-quarter results on Tuesday, when analysts expect to see revenue growth of more than 170%.
As if that weren’t striking enough, the company’s fiscal fourth-quarter forecast, as estimated by LSEG, is likely to show an even higher number: nearly 200% growth.
Heading into the Thanksgiving holiday, Wall Street will take a closer look at the company that has been at the center of this year’s artificial intelligence boom.
Nvidia’s share price has soared 237% in 2023, far outpacing any other member of the S&P 500. Its market cap now sits at $1.2 trillion, well above Meta or Tesla. Any indication on the earnings call that enthusiasm for generative AI is cooling, that some big customers are moving to AMD processors, or that China’s restrictions are having a detrimental effect on the business could spell trouble for a stock that has been on such a run.
“Expectations are high ahead of NVDA’s earnings call on Nov. 21,” Bank of America analysts wrote in a report last week. They have a buy rating on the stock and said they “expect improvement or upside.”
However, they pointed to China’s restrictions and competitive concerns as two issues that will capture investors’ attention. In particular, AMD’s entry into the generative AI market presents a new dynamic for Nvidia, which has largely had the AI graphics processing unit (GPU) market to itself.
AMD CEO Lisa Su said late last month that the company expects GPU revenue of around $400 million during the fourth quarter and more than $2 billion in 2024. The company said in June that the MI300X, its most advanced GPU for AI, would begin shipping to some clients this year.
Nvidia is still by far the market leader in AI GPUs, but high prices are an issue.
“NVDA needs to aggressively counter the narrative that its products are too expensive for generative AI inference,” the Bank of America analysts wrote.
Last week, Nvidia unveiled the H200, a GPU designed to train and deploy the types of AI models that are driving the explosion of generative AI, allowing businesses to build smarter chatbots and turn simple text into creative graphic designs.
The new GPU is an upgrade to the H100, the chip that OpenAI used to train its most advanced large language model, GPT-4 Turbo. H100 chips cost between $25,000 and $40,000, according to an estimate by Raymond James, and it takes thousands of them working together to create the largest models in a process called “training.”
The H100 chips are part of Nvidia’s data center group, whose revenue in the fiscal second quarter increased 171% to $10.32 billion. That represented about three-quarters of Nvidia’s total revenue.
For the fiscal third quarter, analysts expect data center revenue to nearly quadruple to $13.02 billion from $3.83 billion a year earlier, according to FactSet. Total revenue is projected to rise 172% to $16.2 billion, according to analysts surveyed by LSEG, formerly Refinitiv.
Growth is expected to peak in the fiscal fourth quarter at around 195%, LSEG estimates show. Expansion will remain strong throughout 2024 but is expected to slow each quarter of the year.
Executives can expect to field questions on the earnings conference call about the upheaval at OpenAI, the maker of the ChatGPT chatbot, which was a major catalyst for Nvidia’s growth this year. On Friday, OpenAI’s board of directors announced the sudden dismissal of CEO Sam Altman amid disputes over the pace of the company’s product development and the focus of his efforts.
OpenAI is a big buyer of Nvidia GPUs, as is Microsoft, OpenAI’s main backer. After a chaotic weekend, OpenAI said Sunday night that former Twitch CEO Emmett Shear would lead the company on an interim basis, and shortly after, Microsoft CEO Satya Nadella said Altman and the ousted OpenAI president, Greg Brockman, would join Microsoft to lead a new advanced AI research team.
So far, Nvidia investors have shrugged off China-related concerns despite their potential importance to the company’s business. The H100 and A100 AI chips were the first to be affected by new U.S. restrictions last year that were aimed at curbing sales to China. Nvidia said in September 2022 that the U.S. government would still allow it to develop the H100 in China, which accounts for 20% to 25% of its data center business.
The company has reportedly found a way to continue selling in the world’s second-largest economy while still complying with U.S. regulations. The company is ready to deliver three new chips, based on the H100, to Chinese manufacturers, Chinese financial outlet Cailian Press reported last week, citing sources.
Nvidia has historically avoided providing annual guidance, preferring to look ahead only to the next quarter. But given how much money investors have poured into the company this year, and how little else there is to follow this week, they will be listening closely to CEO Jensen Huang’s tone on the conference call for any signs that enthusiasm for generative AI may be fading.