Micron Technology, Inc. announced that its 12-layer HBM3E 36GB products have been selected by AMD for its next-generation AMD Instinct™ MI350 series solutions. The collaboration will play a key role in improving power efficiency and performance for large-scale AI model training, enabling complex HPC workloads including high-throughput inference, data processing and computational modeling. Additionally, this announcement represents a key milestone in Micron’s HBM industry leadership, embodying the value of Micron’s execution excellence and strong customer relationships.
Micron’s 12-layer HBM3E 36GB solution brings industry-leading memory technology to the AMD Instinct™ MI350 series GPU platform, delivering high bandwidth and low power consumption.*1 Based on AMD’s advanced CDNA 4 architecture, the AMD Instinct MI350 series GPU platform integrates 288GB of high-bandwidth HBM3E memory, delivering superior throughput of up to 8TB/s. This large memory capacity enables the Instinct MI350 series GPU to efficiently process AI models with up to 520 billion parameters on a single GPU. In addition, the full platform configuration features up to 2.3TB of HBM3E memory, achieving theoretical peak performance of up to 161 PFLOPS at FP4 precision, providing high power efficiency and scalability for high-density AI workloads. This highly integrated architecture, combined with Micron’s power-efficient HBM3E, delivers superior throughput for large-scale language model training, inference, and scientific and engineering simulations, allowing data centers to maximize compute performance per watt while providing elastic scalability. Micron and AMD’s collaboration accelerates time to market for AI solutions.
“Our close collaboration and joint engineering efforts with AMD ensure that Micron’s 12-layer HBM3E 36GB product is highly compatible with AMD’s Instinct MI350 series GPUs and platforms,” said Praveen Vaidyanathan, vice president and general manager, Cloud Memory Products, Micron. “Micron’s industry-leading technology innovation in HBM3E not only delivers high performance for demanding AI systems, but also provides the benefit of reduced TCO for end customers.”
“Micron’s 12-layer HBM3E 36GB products play a key role in maximizing the performance and power efficiency of our AMD Instinct™ MI350 series accelerators,” said Josh Friedrich, corporate vice president, AMD Instinct Product Engineering. “Our continued collaboration with Micron will further advance low-power, high-bandwidth memory, enabling customers to train larger AI models, speed inference and handle complex HPC workloads.”
Micron’s 12-layer HBM3E 36GB product has been certified on multiple major AI platforms.
Source: PR TIMES