SoftBank has completed the world’s largest AI computing platform, built on the NVIDIA DGX SuperPOD. The platform incorporates more than 4,000 NVIDIA Blackwell GPUs in DGX B200 systems and, with more than 10,000 GPUs in total, delivers a peak computing performance of 13.7 exaflops. It is connected with NVIDIA Quantum-2 InfiniBand networking and supports NVIDIA AI Enterprise, NVIDIA’s software suite for enterprise AI development.
The platform will first serve SB Intuitions, SoftBank’s AI-focused subsidiary, to accelerate its Japanese-language large language model (LLM) work. That effort includes the development of “Sarashina mini,” a 70-billion-parameter LLM, as a step toward more advanced and scalable models. The added capacity is expected to substantially improve training efficiency and model performance, supporting Japan’s goals for developing sovereign AI.
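As a rough illustration of what a 13.7-exaflop peak means for training a 70-billion-parameter model, the sketch below applies the widely used ~6·N·D approximation for transformer training FLOPs. The token count and utilization figures are illustrative assumptions, not SoftBank or SB Intuitions numbers, and the 13.7-exaflop figure is a low-precision peak rating, so real training throughput would be lower.

```python
# Back-of-envelope estimate of training time for a 70B-parameter LLM
# on a cluster with 13.7 exaflops of peak (low-precision) compute.
# Assumptions: ~6*N*D training FLOPs, an illustrative 2T-token corpus,
# and 35% sustained utilization of peak -- none of these are reported figures.

PARAMS = 70e9          # 70-billion-parameter model (as reported)
TOKENS = 2e12          # assumed training corpus of ~2 trillion tokens
PEAK_FLOPS = 13.7e18   # platform peak: 13.7 exaflops
UTILIZATION = 0.35     # assumed sustained fraction of peak during training

train_flops = 6 * PARAMS * TOKENS        # ~6ND total training FLOPs
sustained_flops = PEAK_FLOPS * UTILIZATION
seconds = train_flops / sustained_flops
print(f"Estimated training time: {seconds / 86400:.1f} days")
```

Under these assumed numbers the run completes in a few days; the point is only that training time scales directly with sustained cluster throughput, which is why the added capacity bears on training efficiency.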
SoftBank plans to expand AI use across its group companies and to open access to domestic businesses and research institutions. The move reflects a broader trend toward national-scale AI infrastructure aimed at driving innovation and promoting technological independence.