Nvidia Develops AI GPU Featuring 144GB HBM3E Memory

TrendForce reports that Nvidia is gearing up to release its next-generation B100 and B200 GPUs, built on the new Blackwell architecture, alongside a B200A variant featuring 144GB of HBM3E memory.

Because TSMC’s CoWoS-L packaging, used for the B200 series, is in limited supply, the B200A will use the simpler CoWoS-S packaging instead. The B200A will carry 144GB of HBM3E memory, down from 192GB, using four memory stacks instead of eight; however, each stack’s capacity will increase from 24GB to 36GB.
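The capacity figures above follow directly from the stack counts. A quick sketch of the arithmetic, using only the numbers reported in the article:

```python
# Sanity check of the reported HBM3E configurations (figures from the article).
B200_STACKS, B200_STACK_GB = 8, 24    # B200: eight 24GB stacks
B200A_STACKS, B200A_STACK_GB = 4, 36  # B200A: four 36GB stacks

b200_total = B200_STACKS * B200_STACK_GB      # 8 * 24 = 192 GB
b200a_total = B200A_STACKS * B200A_STACK_GB   # 4 * 36 = 144 GB

print(f"B200 total HBM3E:  {b200_total} GB")
print(f"B200A total HBM3E: {b200a_total} GB")
```

So even with half the stacks, the larger 36GB stacks keep the B200A at three-quarters of the B200’s total capacity.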


The B200A will use less power and will be cooled by air, not liquid, making it easier to set up. Nvidia plans to provide the B200A to manufacturers by the second quarter of next year.


In 2024, Nvidia will focus on high-end GPUs built on the Hopper platform, offering the H100 and H200 in North America and the H20 in China. The B200A, arriving in the second quarter of 2025, will not clash with the H200, which is due in the third quarter of 2024.


B200A variant

The B100 and B200, scheduled for market release in the second half of this year, are designed for Cloud Service Providers (CSPs). Additionally, Nvidia will introduce the streamlined B200A variant tailored for OEM enterprises focusing on edge AI applications.


ManilaShaker is a tech media outlet producing insightful and helpful content for our local and growing international audience. Our goal is to create a premier Philippine digital consumer electronics resource that provides the most objective reviews and comparisons globally.