Story Highlights
  • Nvidia has just revealed the Blackwell architecture.
  • Nvidia says the GB200 Superchip, which pairs two Blackwell GPUs with a Grace CPU, delivers up to 30 times more performance for LLM workloads.
  • The company also touts major efficiency gains from the new architecture.

Rivals have been battling to catch up to Nvidia, which became a multitrillion-dollar company thanks to its indispensable H100 AI processor, with a valuation poised to soon surpass those of Alphabet and Amazon.

Nvidia currently leads the AI field with its data center GPUs. Many of the world's most powerful supercomputers are powered by its Hopper H100 GPUs and GH200 Grace Hopper Superchips, both of which remain in high demand.

Now, however, the company has revealed Hopper's successor. Several hours ago, CEO Jensen Huang unveiled the Blackwell B200 AI GPU and a next-generation lineup of data center chips that promise a significant generational leap in computing capability.

Nvidia Blackwell vs. Hopper
Image: Blackwell vs. Hopper comparison from CEO Jensen Huang's keynote

The B200 GPU and the Blackwell architecture itself replace the H100/H200. Alongside the new chips, Nvidia will also release the Grace Blackwell GB200 Superchip, which pairs Blackwell GPUs with a Grace CPU.

We expect Nvidia to eventually bring Blackwell GPUs to the consumer market, but those may not arrive until 2025 and will differ significantly from the chips used in data centers.

According to Nvidia, a GB200, which combines two Blackwell GPUs with a Grace CPU, can deliver up to a 30-times boost for LLM workloads while running far more efficiently.

Compared to an H100, it is said to reduce cost and energy consumption by as much as 25 times. According to CEO Jensen Huang, a workload that once required 8,000 Hopper GPUs and 15 megawatts of power can now be handled by just 2,000 Blackwell GPUs drawing 4 megawatts.
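To put those keynote figures in perspective, here is a minimal sketch in Python, using only the numbers Nvidia quoted on stage, that works out the implied reductions in GPU count and power draw.

```python
# Rough arithmetic based on the figures Nvidia quoted at GTC 2024:
# a workload said to need 8,000 Hopper GPUs at 15 MW versus
# 2,000 Blackwell GPUs at 4 MW. These are Nvidia's claims, not benchmarks.

hopper_gpus, hopper_megawatts = 8_000, 15
blackwell_gpus, blackwell_megawatts = 2_000, 4

gpu_reduction = hopper_gpus / blackwell_gpus              # 4.0x fewer GPUs
power_reduction = hopper_megawatts / blackwell_megawatts  # 3.75x less power

print(f"GPU count reduced by {gpu_reduction:.2f}x")
print(f"Power draw reduced by {power_reduction:.2f}x")
```

These raw ratios differ from the headline 25x and 30x figures, which Nvidia frames in terms of cost, energy consumption, and LLM performance rather than simple hardware counts.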

The announcement seems to have impressed the wider tech industry, with prominent figures praising the company as it leads the AI revolution.

There is nothing better than Nvidia hardware for AI.

– Elon Musk

Image via Nvidia GTC 2024 Keynote

According to Nvidia, Amazon, Google, Microsoft, and Oracle already plan to offer the GB200 NVL72 racks through their cloud services; however, the exact number of racks they are purchasing is unknown.

Pricing is also a mystery. Previous chips like the H100 sold for anywhere from $25,000 to $40,000, which had next to no impact on demand.

Since this news comes from GTC 2024, little was revealed about gaming products. However, a future desktop graphics card lineup, expected to be branded as the RTX 50 series, will probably be powered by the Blackwell GPU architecture.

Until then, we recommend exploring Nvidia’s GeForce RTX 40 Super lineup, which was released earlier this year.


Malik Usman