
Intel Gaudi2 Shines in MLPerf Benchmark

2023-06-30 11:14:20 · Mr. Ming

The AI computing market is heating up as major players like Intel and AMD join NVIDIA in releasing their own AI computing chips for large language model (LLM) training. Intel's Habana Gaudi2 chip has completed MLPerf benchmark tests and offers better cost-effectiveness than the NVIDIA A100 (FP16) for LLM training. Intel plans to release FP8 software support and additional features for the chip in September.

According to a report from Wccftech, ChatGPT is one of the most groundbreaking applications of recent times, underscoring the growing importance of LLMs. MLCommons, an open engineering consortium, published the latest MLPerf Training 3.0 results, which show that the combination of Intel Xeon processors and the Intel Habana Gaudi2 deep learning accelerator offers a viable alternative to NVIDIA GPUs for meeting this market demand.

Key highlights of the Habana Gaudi2 test results include:

· Impressive training time on GPT-3: only 311 minutes with 384 accelerators.

· Near-linear scaling, achieving up to 95% scaling efficiency when going from 256 to 384 accelerators on the GPT-3 model (see the worked sketch after this list).

· Outstanding training results on the computer vision models ResNet-50 (8 accelerators) and Unet3D (8 accelerators), and on the natural language processing model BERT (8 and 64 accelerators).

· Improved performance on BERT and ResNet models, with a 10% and 4% increase, respectively, compared to previous results, indicating the software's maturity.

· Gaudi2 provides "plug-and-play" results, allowing customers to achieve similar performance without modifications whether it is deployed on premises or in the cloud.
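
To put the scaling figure above in context, here is a minimal sketch of how this kind of scaling efficiency is commonly computed: the achieved speedup between two cluster sizes divided by the ideal (linear) speedup. The 311-minute GPT-3 time and the 95% figure come from the results above; the 256-accelerator time in the example is a hypothetical placeholder, since the article does not report it.

# Illustrative sketch of a scaling-efficiency calculation (not Intel's code).
def scaling_efficiency(time_small: float, n_small: int,
                       time_large: float, n_large: int) -> float:
    """Achieved speedup divided by ideal linear speedup when scaling
    the same workload from n_small to n_large accelerators."""
    achieved_speedup = time_small / time_large
    ideal_speedup = n_large / n_small
    return achieved_speedup / ideal_speedup

# Hypothetical example: if a 256-accelerator run had taken ~443 minutes
# and the 384-accelerator run takes 311 minutes, efficiency is about 95%.
print(f"{scaling_efficiency(443.0, 256, 311.0, 384):.0%}")

A value of 100% would mean perfectly linear scaling, so the reported 95% indicates that adding accelerators continues to pay off with only a small overhead.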

These advancements in AI computing chips offer the electronic components industry new opportunities to meet the growing demand for LLM training. They deliver improved efficiency and scalability across various applications, enhancing the accessibility and performance of AI technologies.
