
NVIDIA B200: Price Up to $40,000, 85% Profit Margin

2024-03-20


According to reports, NVIDIA plans to sell its new Blackwell GPU, the B200, designed for AI and HPC workloads, at a price of $30,000 to $40,000. However, this range is tentative: NVIDIA is more inclined to sell comprehensive data-center solutions than standalone chips or accelerator cards.

At yesterday's NVIDIA GTC 2024 conference, NVIDIA officially unveiled its new-generation AI accelerator chip, the B200, built on TSMC's N4P process technology. With 208 billion transistors, it has more than twice as many as the H100's 80 billion. It is also equipped with 192 GB of HBM3E memory, far surpassing the H100's 80 GB of HBM. Its AI compute performance reaches 20 petaflops in both FP8 and the new FP6 format, 2.5 times the 8 petaflops of the previous-generation H100, and in the new FP4 format it reaches up to 40 petaflops, a fivefold increase over the 8 petaflops of the previous-generation Hopper-architecture GPU, making it the fastest AI chip in the world.
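
For readers who want to sanity-check the quoted speedup ratios, here is a minimal Python sketch using only the petaflop figures cited above (these are the report's numbers, not official specifications):

```python
# Quick arithmetic check of the speedup figures quoted above.
# All values are the figures cited in this article, not official NVIDIA specs.
h100_fp8_pflops = 8    # previous-generation H100 (Hopper), as cited above
b200_fp8_pflops = 20   # B200 in FP8/FP6, as cited above
b200_fp4_pflops = 40   # B200 in the new FP4 format, as cited above

print(f"FP8/FP6 speedup vs. H100: {b200_fp8_pflops / h100_fp8_pflops:.1f}x")   # 2.5x
print(f"FP4 speedup vs. H100 FP8: {b200_fp4_pflops / h100_fp8_pflops:.1f}x")   # 5.0x
```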

However, the B200's performance gains come mainly from the larger transistor count (i.e., greater chip area); the architectural improvements themselves are relatively limited. Moreover, the expensive 192 GB of HBM3E memory significantly raises the hardware cost. If NVIDIA sells the B200 at $30,000 to $40,000, that would imply a gross profit margin of roughly 80%-85%.
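
As a back-of-the-envelope illustration of what those margins would mean per unit, the sketch below derives the implied hardware cost from the reported price range and the 80%-85% margin estimate (the resulting costs are derived figures, not numbers disclosed by NVIDIA):

```python
# Implied per-unit hardware cost: cost = price * (1 - gross margin).
# Price range and margin estimate are the figures cited above; the
# computed costs are illustrative only.
prices_usd = (30_000, 40_000)   # reported B200 price range
margins = (0.80, 0.85)          # gross-margin estimate

for price in prices_usd:
    for margin in margins:
        cost = price * (1 - margin)
        print(f"${price:,} at {margin:.0%} margin -> implied cost ~${cost:,.0f}")
```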

Of course, this calculation does not account for NVIDIA's R&D spending on the B200. Reports suggest NVIDIA has invested up to $10 billion in developing the B100, GB200, and related platforms, a record outlay for the company. Jensen Huang reportedly stated that NVIDIA's expenditure on the modern GPU architecture and design has exceeded $10 billion.

It's worth noting that NVIDIA is not inclined to sell B200 modules or accelerator cards on their own. Instead, it would rather sell DGX B200 servers with eight Blackwell GPUs, GB200 NVL72 servers, or even DGX B200 SuperPODs with 576 B200 GPUs, each priced in the millions of dollars.

In interviews, Jensen Huang emphasized that NVIDIA is more interested in selling supercomputers or DGX B200 SuperPODs, which bundle extensive hardware and software at high prices. Accordingly, NVIDIA has not listed B200 modules or accelerator cards on its website, showcasing only DGX B200 systems and DGX B200 SuperPODs. In other words, NVIDIA remains reserved about pricing for the B200 GPU. The company expects the new products to ship later in 2024, with volume availability not until 2025.
