
Meta Expects NVIDIA Chips To Begin Shipping Later This Year

By AlishbaW


(CTN News) – Facebook’s owner, Meta Platforms, expects to receive the first shipments of NVIDIA’s upcoming flagship artificial intelligence chip in the second half of this year, according to a Meta spokesperson.

This week at Nvidia’s annual developer conference, the company announced the B200 “Blackwell” chip, the latest version of its popular GPU (graphics processing unit) chips, which power most cutting-edge artificial intelligence projects.

Nvidia claims the B200 chip is 30 times faster at tasks such as serving up answers from chatbots, but the company did not specify how well it performs when chewing through huge amounts of data to train those chatbots. That training work has driven most of Nvidia’s soaring sales.

Nvidia’s Chief Financial Officer, Colette Kress, told financial analysts on Tuesday, “We expect to be able to bring the new GPUs to market later this year,” but noted that shipment volume for the new GPUs would not ramp up until 2025.

Meta is one of Nvidia’s biggest customers, having bought hundreds of thousands of its previous-generation chips to support its push into faster recommendation systems and artificial intelligence products driven by machine learning.

In January of this year, Meta CEO Mark Zuckerberg disclosed that the company plans to have more than 350,000 of those earlier chips, called H100s, in its stockpile by the end of the year. Combined with other GPUs, he added, Meta would by then have the equivalent of about 600,000 H100s.

On Monday, Zuckerberg said in a statement that Meta would use Blackwell to train the company’s Llama models. Meta is currently training a third generation of the model on two GPU clusters it announced last week, each containing around 24,000 H100 GPUs.

Meta intends to keep using those clusters to train Llama 3 and to use Blackwell for future generations of the model, according to a spokesperson for the company.
