Deep Dives

Nvidia's next-gen AI superchip

August 9, 2023
We summarized this source into key points; see the link above for the full article.

Nvidia has introduced a new AI chip platform built on the Grace Hopper Superchip, the first to feature HBM3e memory. The platform is tailored for demanding AI workloads such as large language models, delivering significant gains in memory capacity and bandwidth.

The Next Generation Features:
  • Nvidia's GH200 Grace Hopper platform targets demanding AI workloads such as the large language models behind ChatGPT.
  • The chip also handles other AI applications such as recommender systems and vector databases.
  • The dual configuration provides 3.5x more memory and 3x higher bandwidth than its predecessor.
  • A dual-configuration server includes 144 Arm Neoverse cores, eight petaflops of AI performance, and 282GB of HBM3e memory.

CEO's Insights and Connectivity:
  • CEO Jensen Huang emphasizes the enhancements in throughput, connectivity of GPUs, and a versatile server design.
  • The Grace Hopper Superchip can be linked with other Superchips via Nvidia NVLink, bolstering the computing power for larger AI models.
  • Such a connection grants the GPU full access to the CPU memory, resulting in a combined memory of 1.2TB in dual configuration.
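
The combined-memory figure can be sanity-checked with a quick back-of-the-envelope calculation. As an illustrative sketch, the per-chip capacities below are assumptions drawn from Nvidia's published GH200 specifications (roughly 480GB of LPDDR5X per Grace CPU and 141GB of HBM3e per Hopper GPU), not figures stated in this summary:

```python
# Back-of-the-envelope check of the dual-configuration memory pool.
# Per-chip capacities are assumptions based on Nvidia's GH200 spec sheet:
CPU_LPDDR5X_GB = 480   # Grace CPU memory per Superchip (assumed)
GPU_HBM3E_GB = 141     # Hopper GPU HBM3e per Superchip (assumed)

superchips = 2  # dual configuration linked via NVLink

# NVLink gives the GPU coherent access to CPU memory, so both pools count.
total_gb = superchips * (CPU_LPDDR5X_GB + GPU_HBM3E_GB)
print(f"Combined coherent memory: {total_gb} GB ≈ {total_gb / 1000:.1f} TB")
```

Under those assumptions the total comes out to about 1.2TB, matching the figure above.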

Performance and Compatibility:
  • The HBM3e memory boasts a speed that's 50% faster than HBM3.
  • It delivers a combined bandwidth of 10TB/sec, enabling it to support models that are 3.5x bigger.
  • The 3x greater bandwidth also boosts overall performance.
  • The Grace Hopper Superchip with HBM3e is in line with Nvidia's MGX server specifications.
  • Manufacturers can integrate Grace Hopper into a variety of server models quickly and economically.
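
The bandwidth claims above can also be roughly reconciled. In the sketch below, the baseline HBM3 figure is an assumption (approximately the 3.35TB/s of an H100-class GPU), not a number from this summary:

```python
# Rough reconciliation of the summary's bandwidth claims.
HBM3_TBS = 3.35  # per-GPU HBM3 bandwidth, H100-class (assumed baseline)

# "50% faster than HBM3" per GPU:
hbm3e_tbs = HBM3_TBS * 1.5

# Two Superchips in the dual configuration:
dual_config_tbs = 2 * hbm3e_tbs

print(f"Per GPU: ~{hbm3e_tbs:.1f} TB/s; dual config: ~{dual_config_tbs:.1f} TB/s")
```

Under that assumed baseline, the dual configuration lands at roughly 10TB/s, consistent with the combined-bandwidth figure above.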

Historical and Future Outlook:
  • Nvidia maintains its stronghold in the AI hardware sector; its A100 GPUs were instrumental in powering ChatGPT.
  • This was succeeded by the H100 model.
  • Nvidia anticipates that the initial systems based on Grace Hopper will emerge in the second quarter of 2024.
