Nvidia sees no sign of AI slowdown after over 400% jump in data center business

Nvidia's historic rally is being driven by its data center business, which grew a whopping 427% in the latest quarter as companies keep snapping up its artificial intelligence processors.

Now Nvidia is signaling to investors that the customers spending billions of dollars on its chips will be able to make money off AI, too. It’s a concern that’s been swirling around the company because there’s only so much cash clients can burn on infrastructure before they need to see some profit.

If Nvidia’s chips can provide a strong and sustainable return on investment, that suggests the AI boom may have room to run as it moves past the early stages of development, and as companies plan for longer-term projects.

Nvidia’s most important clients for its graphics processing units (GPUs) are the big cloud providers — Amazon Web Services, Microsoft Azure, Google Cloud, and Oracle Cloud. They accounted for a “mid-40%” share of Nvidia’s $22.56 billion in data center sales in the April quarter, the company said.

There’s also a newer crop of specialized GPU data center startups that buy Nvidia’s GPUs, install them in server racks, load them up in data centers, connect them to the internet, and then rent them out to customers by the hour.

For example, CoreWeave, a GPU cloud, is currently quoting $4.25 per hour to rent an Nvidia H100. Large quantities of this kind of server time are essential for training a large language model (LLM) such as OpenAI’s GPT, and it’s how many AI developers end up accessing Nvidia hardware.

Following Nvidia’s better-than-expected earnings report on Wednesday, finance chief Colette Kress told investors that cloud providers were seeing an “immediate and strong return” on investment. She said that if a cloud provider spends $1 on Nvidia hardware, it can rent it out for $5 over the next four years.

Kress also said newer Nvidia hardware would have an even stronger return on investment, citing the company’s HGX H200 product, which combines eight GPUs and can be used to serve Meta’s Llama AI model, rather than selling raw access to a cloud computer.

“That means for every $1 spent on NVIDIA HGX H200 servers at current prices, an API provider serving Llama 3 tokens can generate $7 in revenue over four years,” Kress said.

Part of that calculation depends on utilization: are the chips running 24 hours a day, or rented out for a smaller share of the time?
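
As a rough illustration of the arithmetic behind these figures, the sketch below combines CoreWeave’s quoted $4.25-per-hour H100 rate with a hypothetical utilization and hardware cost; the utilization and cost numbers are illustrative assumptions, not figures from Nvidia or Kress.

```python
# Rough sketch of the GPU-rental return arithmetic described above.
# The hourly rate is CoreWeave's quoted H100 price; the utilization
# and hardware-cost figures are illustrative assumptions.

HOURLY_RATE = 4.25       # USD per GPU-hour (CoreWeave's quoted H100 rate)
UTILIZATION = 0.70       # assumed fraction of time the GPU is rented out
HARDWARE_COST = 30_000   # assumed all-in cost per H100, USD (hypothetical)
YEARS = 4

hours = YEARS * 365 * 24                          # total hours in four years
revenue = hours * HOURLY_RATE * UTILIZATION       # four-year rental revenue
roi_multiple = revenue / HARDWARE_COST            # revenue per $1 of hardware

print(f"Four-year rental revenue per GPU: ${revenue:,.0f}")
print(f"Revenue per $1 of hardware: ${roi_multiple:.2f}")
```

Under these assumptions the multiple comes out below Kress’s figure; at near-full utilization it lands close to $5 per $1 of hardware. The point is that the headline return depends heavily on the rental rate, utilization, and hardware cost assumed.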

Nvidia CEO Jensen Huang told analysts on the earnings call that OpenAI, Google, Anthropic, and as many as 20,000 generative AI startups are lining up for every GPU the cloud providers can put online.

“All of the work that’s being done at all the [cloud service providers] are consuming every GPU that’s out there,” Huang said.

“Customers are putting a lot of pressure on us to deliver the systems and stand it up as quickly as possible,” he continued.

Huang said Meta has declared its intention to spend billions of dollars on 350,000 Nvidia chips, even though the company isn’t a cloud provider. Meta will likely have to monetize its investment through its advertising business or by including a chatbot inside its current apps.

Meta’s cluster of servers is an example of “essential infrastructure for AI production,” Huang said, or, “what we refer to as AI factories.”

Nvidia also surprised analysts by giving an aggressive timeline for its next-generation GPU, called Blackwell, which will be available in data centers in the fiscal fourth quarter. Those comments allayed fears of a slowdown as companies wait for the latest technology.

The first customers for the new chips include Amazon, Google, Meta, Microsoft, OpenAI, Oracle, Tesla, and Elon Musk’s xAI, Huang said.

Nvidia shares jumped 6% in extended trading, surpassing $1,000 for the first time. In addition to announcing earnings, Nvidia announced a 10-for-1 stock split following a 25-fold surge in the company’s share price over the past five years.
