Nvidia CEO Jen-Hsun Huang at the Consumer Electronics Show (CES) in Las Vegas, USA, 04 January 2017. Huang announced that his company will work with German car giant Audi in the future.
Andrej Sokolow | Picture Alliance | Getty Images
Nvidia is on a tear, and it shows no signs of slowing down.
Nvidia makes the graphics processors, or GPUs, needed to build AI applications like ChatGPT. In particular, its most advanced AI chip, the H100, is in strong demand among tech companies.
Nvidia’s overall sales rose 171% year over year to $13.51 billion in the fiscal second quarter ended July 30, the company announced Wednesday. Not only is it selling a lot of AI chips, but they’re also highly profitable: the company’s gross margin expanded by more than 25 percentage points versus the same quarter last year, to 71.2% — incredible for a physical product.
In addition, Nvidia said it sees strong demand through next year and will ramp up supply, helping to increase the number of chips it can sell in the coming months.
The company’s stock jumped more than 6% in the hours after the news, adding to its gain of more than 200% so far this year.
It’s clear from Wednesday’s report that Nvidia stands to gain more from the AI boom than any other company.
Nvidia posted an incredible $6.7 billion in net income for the quarter, up 422% from the same period last year.
“I was above the street for this report and next year, but my numbers have to go higher,” Chaim Siegel, an analyst at Elazar Advisors, wrote in a note after the report. He raised his price target to $1,600, calling it “a 3x move from here,” and added, “I still think my numbers are very conservative.”
That price implies a multiple of 13 times estimated 2024 earnings per share, he said.
Nvidia’s massive cash flow contrasts with that of its top customers, which are spending heavily on AI hardware and building multimillion-dollar AI models but have yet to see returns from the technology.
About half of Nvidia’s data center revenue comes from cloud providers, followed by large internet companies. The growth in Nvidia’s data center business came from “compute,” or AI chips, which grew 195% in the quarter, outpacing the 171% growth of the overall business.
Microsoft, a big customer of Nvidia’s H100 GPUs, is ramping up its capital spending to build out its AI servers, both for its Azure cloud and its partnership with OpenAI, and doesn’t expect a positive “revenue signal” until next year.
On the consumer internet side, Meta said it expects to spend as much as $30 billion on data centers this year, and possibly more next year, as it works on AI. Nvidia said Wednesday that Meta was already seeing returns in the form of increased engagement.
Some startups have gone into debt to buy Nvidia GPUs, hoping to rent them out for a profit in the coming months.
On its earnings call with analysts, Nvidia officials offered some perspective on why its data center chips are so profitable.
Nvidia said its software contributes to its margin and that it is selling more complicated products than mere silicon. Nvidia’s AI software, called CUDA, is cited by analysts as the primary reason why customers can’t easily switch to competitors like AMD.
“Our data center products include a significant amount of software and complexity, which also helps gross margins,” Nvidia Chief Financial Officer Colette Kress said on a call with analysts.
Nvidia packages its technology into expensive and complicated systems like its HGX box, which combines eight H100 GPUs into a single computer. Nvidia boasted on Wednesday that building one of these boxes uses a supply chain of 35,000 parts. HGX boxes cost around $299,999, according to reports, while a recent Raymond James estimate puts individual H100s at $25,000 to $30,000 apiece.
While Nvidia does ship individual H100 GPUs to cloud service providers, those customers often opt for the complete system.
“We call it H100, as if it’s a chip that comes off of a fab, but H100s go out, really, as HGX to the world’s hyperscalers, and they’re really quite large system components,” Nvidia CEO Jensen Huang said on the call with analysts.