NVIDIA Blackwell Ultra B300 AI GPUs Rumored for GTC 2025

NVIDIA's next-generation Blackwell Ultra AI GPUs, codenamed 'B300,' are reportedly set to debut at GTC 2025, marking a new era for AI.

According to reports, the lineup will be unveiled at NVIDIA's GTC conference in March 2025, kicking off the next phase of the company's AI roadmap.

Blackwell Ultra Details

The "Blackwell Ultra" series is anticipated to bring significant performance enhancements compared to the current Hopper and Blackwell GPUs. Taiwan Economic Daily reports that the supply chain is already preparing for the new lineup.

The GB300 AI server is rumored to feature a massive 1400W TDP, a substantial increase from the 1000W TDP of the Blackwell GB200. With architectural improvements, it's expected to deliver approximately 1.4 times higher FP4 performance than previous generations. Additionally, memory capacity will be increased from 192 GB to 288 GB through 12-Hi HBM3E stacks.
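Taken at face value, the rumored figures imply sizable generational jumps. A quick back-of-the-envelope check of the deltas (all numbers are the rumored, unconfirmed specs quoted above):

```python
# Rumored GB200 vs. GB300 figures reported by Taiwan Economic Daily;
# none of these are confirmed by NVIDIA.
gb200_tdp_w, gb300_tdp_w = 1000, 1400
gb200_mem_gb, gb300_mem_gb = 192, 288

# Relative increases, generation over generation
tdp_increase = (gb300_tdp_w - gb200_tdp_w) / gb200_tdp_w    # 0.4
mem_increase = (gb300_mem_gb - gb200_mem_gb) / gb200_mem_gb  # 0.5

print(f"TDP increase: +{tdp_increase:.0%}")     # TDP increase: +40%
print(f"Memory increase: +{mem_increase:.0%}")  # Memory increase: +50%
```

In other words, if the rumors hold, power draw grows 40% while memory capacity grows 50%, alongside the roughly 1.4x FP4 uplift.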

Potential Release Timeline and Manufacturing

While NVIDIA hasn't officially announced a release date, the company's one-year product cadence points to a Q4 2025 launch for the B300 series, though this could shift depending on market demand.

Foxconn and Quanta are expected to be the main suppliers for NVIDIA's GB300 AI servers. Due to the high TDP, NVIDIA is said to be switching to liquid cooling, and Foxconn's expertise in this area makes it a prime supplier.

The Next Phase of AI

The "Blackwell Ultra" lineup will be NVIDIA's next leap forward in the AI market, and given the expected increase in computational power, demand for these products will likely be very high.

About the author

mgtid
Owner of Technetbook | 10+ Years of Expertise in Technology | Seasoned Writer, Designer, and Programmer | Specialist in In-Depth Tech Reviews and Industry Insights | Passionate about Driving Innovation and Educating the Tech Community
