Blackwell, the world’s most powerful GPU!


NVIDIA is using GTC to unveil its Blackwell GPUs, boasting features that are simply out of the ordinary. The B200 packs a total of 208 billion transistors across two dies. It is also NVIDIA's first MCM GPU, and performance is 5× that of the H100!

Blackwell B200: 5× the performance of the H100!

NVIDIA Blackwell B200 (source: Tom's Hardware)

NVIDIA's GPU uses an MCM design, with both dies manufactured by TSMC. As for the process node, it is a 4 nm-class one: TSMC's N4P. This makes it possible to integrate a total of 208 billion transistors, 104 billion per die.

These two chiplets are interconnected by a link with 10 TB/s of bandwidth. TechPowerUp reports that this is fast enough for each chiplet to address and access the other's memory without any problems. Finally, for GPU-to-GPU communication, there is an NVLink link offering 1.8 TB/s of bandwidth.

The memory configuration is just as impressive, and at last fully detailed. Each chiplet gets 96 GB of HBM3e memory on a 4096-bit bus. That gives us a total of 192 GB of VRAM with 8 TB/s of bandwidth.
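As a quick sanity check, the per-chiplet figures above can be tallied in a few lines. All numbers come from the article; the variable names are my own:

```python
# Back-of-the-envelope tally of the B200 memory figures quoted above.
# These are the article's per-chiplet numbers, not an official NVIDIA breakdown.

CHIPLETS = 2
HBM3E_PER_CHIPLET_GB = 96          # 96 GB of HBM3e per die
BUS_WIDTH_PER_CHIPLET_BITS = 4096  # 4096-bit bus per die
TOTAL_BANDWIDTH_TBPS = 8           # 8 TB/s aggregate

total_vram_gb = CHIPLETS * HBM3E_PER_CHIPLET_GB
total_bus_bits = CHIPLETS * BUS_WIDTH_PER_CHIPLET_BITS
bw_per_chiplet_tbps = TOTAL_BANDWIDTH_TBPS / CHIPLETS

print(f"Total VRAM: {total_vram_gb} GB")                  # 192 GB
print(f"Combined bus width: {total_bus_bits}-bit")        # 8192-bit
print(f"Bandwidth per chiplet: {bw_per_chiplet_tbps} TB/s")  # 4.0 TB/s
```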

On top of all this, we have sixth-generation Tensor Cores with FP4 and FP6 support.

GB200: two B200s, one ARM Grace CPU!

NVIDIA then announced something even crazier with the GB200: a module featuring two B200 GPUs coupled with an ARM-based Grace CPU with 72 Neoverse V2 cores, according to El Chapuzas Informatico.

NVIDIA Blackwell GB200 (source: Tom's Hardware)

Performance is therefore monstrous:

  • FP4: 20 PFLOPS (dense) – 40 PFLOPS (sparse)
  • FP6/FP8: 10 PFLOPS (dense) – 20 PFLOPS (sparse)
  • INT8: 10 POPS (dense) – 20 POPS (sparse)
  • FP64 Tensor (dense): 90 TFLOPS
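The table above follows two simple patterns worth spot-checking: sparsity doubles throughput at every precision, and halving precision from FP8 to FP4 doubles it again. A minimal check, using the article's figures (in PFLOPS):

```python
# Sanity check of the GB200 throughput table above.
# Dense figures are the article's, in PFLOPS (INT8 strictly in POPS).
dense = {"FP4": 20, "FP6/FP8": 10, "INT8": 10}

# Structured sparsity doubles throughput at each precision.
sparse = {fmt: 2 * rate for fmt, rate in dense.items()}

assert sparse["FP4"] == 40 and sparse["FP6/FP8"] == 20 and sparse["INT8"] == 20
# Halving precision from FP8 to FP4 also doubles throughput.
assert dense["FP4"] == 2 * dense["FP6/FP8"]
print(sparse)  # {'FP4': 40, 'FP6/FP8': 20, 'INT8': 20}
```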

The specs are also doubled, with the module offering 384 GB of memory, 16 TB/s of bandwidth and 2× 1.8 TB/s of NVLink bandwidth.

But power consumption is doubled too, with a total of 2700 W to dissipate. A B200 alone has a TDP of 1000 W, compared with 700 W for the B100… All in the name of AI.

As for availability, the first shipments should begin by the end of the year. Prices have not been announced, but it’s going to sting!