NVIDIA's Grace CPU C1 Gains Traction in Edge Computing and Telco

Terrill Dicki | May 19, 2025, 01:53 UTC


NVIDIA's latest innovation, the Grace CPU C1, is garnering significant attention in the edge computing, telecommunications, and storage sectors. The chip was highlighted at the recent COMPUTEX trade show in Taipei, where it drew strong support from leading original design manufacturers, according to blogs.nvidia.com.

Expanding Role in AI Workloads

The NVIDIA Grace CPU lineup, including the Grace Hopper Superchip and the flagship Grace Blackwell platform, is proving crucial in delivering efficiency and performance gains for enterprises tackling demanding AI workloads. As AI technology continues to evolve rapidly, energy efficiency has become a critical factor in data center design, particularly for applications such as large language models and complex simulations.

The NVIDIA Grace architecture is addressing these challenges head-on. The Grace Blackwell NVL72, a rack-scale system integrating 36 Grace CPUs and 72 Blackwell GPUs, has been adopted by major cloud providers to enhance AI training and inference, including complex reasoning and physical AI tasks.

Configurations and Efficiency

The Grace architecture is available in two configurations: the dual-CPU Grace Superchip and the new single-CPU Grace CPU C1. The latter is gaining traction in edge, telco, storage, and cloud deployments, where maximizing performance per watt is essential. The Grace CPU C1 reportedly offers twice the energy efficiency of traditional CPUs, a significant advantage in distributed and power-constrained environments.

Key manufacturers such as Foxconn, Jabil, Lanner, MiTAC Computing, Supermicro, and Quanta Cloud Technology are developing systems that leverage the Grace CPU C1's capabilities. In telecommunications, the NVIDIA Compact Aerial RAN Computer, which incorporates the Grace CPU C1 with an NVIDIA L4 GPU and NVIDIA ConnectX-7 SmartNIC, is emerging as a platform for distributed AI-RAN, meeting the power, performance, and size requirements for deployment at cell sites.

Adoption in Storage Solutions

Beyond telecommunications, NVIDIA Grace is making inroads into storage solutions. Companies such as WEKA and Supermicro are deploying it for its high performance and memory bandwidth.

Real-World Impact

The benefits of NVIDIA Grace are evident in real-world applications. ExxonMobil is using the Grace Hopper Superchip for seismic imaging, processing large datasets to gain insights into subsurface features and geological formations. Meta is deploying Grace Hopper for ad serving and filtering, leveraging the high-bandwidth NVIDIA NVLink-C2C interconnect between the CPU and GPU to manage extensive recommendation tables. Additionally, high-performance computing centers, such as the Texas Advanced Computing Center and Taiwan's National Center for High-Performance Computing, are incorporating the Grace CPU into their systems to advance AI and simulation research.
