
Meta to Lead with Latest NVIDIA AI Chips


NVIDIA's latest flagship artificial intelligence chips are expected to begin shipping later this year, according to recent reports.

NVIDIA's chips power much of today's cutting-edge artificial intelligence work. At its recent annual developer conference, the company unveiled the B200 "Blackwell" chip, which it says is up to 30 times faster at tasks such as serving answers from chatbots.

Speaking to financial analysts, NVIDIA Chief Financial Officer Colette Kress confirmed the chip would launch later this year, though shipment volumes are not expected to ramp up until 2025.

Meta, the social media giant, is one of NVIDIA's largest customers and has previously purchased hundreds of thousands of the company's prior-generation chips. CEO Mark Zuckerberg said earlier this year that Meta planned to amass roughly 350,000 of those earlier chips, known as the H100, by year-end. A Meta spokesperson has since confirmed that the company expects to receive NVIDIA's latest artificial intelligence chips later this year as part of the first wave of shipments.

Zuckerberg also said Meta intends to use the Blackwell chip to train the company's Llama models. Meta is currently training its third-generation model on two GPU clusters, announced last week, each comprising roughly 24,000 H100 GPUs.

Meta plans to continue using those clusters to train Llama 3, and to incorporate Blackwell into training future generations of the model.
