smt007 Magazine
Micron First to Ship Critical Memory for AI Data Centers
May 1, 2024 | Micron | Estimated reading time: 3 minutes
Micron Technology, Inc. announced it is leading the industry by validating and shipping its high-capacity, monolithic 32Gb DRAM die-based 128GB DDR5 RDIMM memory at speeds of up to 5,600 MT/s on all leading server platforms. Powered by Micron’s industry-leading 1β (1-beta) technology, the 128GB DDR5 RDIMM memory delivers more than 45% improved bit density, up to 22% improved energy efficiency, and up to 16% lower latency over competitive 3DS through-silicon via (TSV) products.
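To put the 5,600 MT/s figure in context, a rough peak-bandwidth estimate per module can be sketched as below. This is a back-of-the-envelope illustration, not from the release: it assumes the standard DDR5 64-bit (8-byte) data bus per module, ignores ECC bits, and real-world throughput will be lower than this theoretical ceiling.

```python
def peak_bandwidth_gb_s(transfer_rate_mt_s: int, bus_bytes: int = 8) -> float:
    """Theoretical peak bandwidth in GB/s.

    transfers per second (MT/s * 1e6) times bytes moved per transfer,
    assuming the standard DDR5 64-bit (8-byte) data bus; ECC excluded.
    """
    return transfer_rate_mt_s * 1_000_000 * bus_bytes / 1e9

# A DDR5-5600 RDIMM tops out around 44.8 GB/s per module in theory.
print(peak_bandwidth_gb_s(5600))
```

Sustained bandwidth in practice depends on the CPU memory controller, channel population, and workload access patterns.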
Micron’s collaboration with industry leaders and customers has yielded broad adoption of these new high-performance, large-capacity modules across high-volume server CPUs. These high-speed memory modules were engineered to meet the performance needs of a wide range of mission-critical data center applications, including artificial intelligence (AI) and machine learning (ML), high-performance computing (HPC), in-memory databases (IMDBs), and efficient processing for multithreaded, high-core-count general compute workloads. Micron’s 128GB DDR5 RDIMM memory will be supported by a robust ecosystem including AMD, Hewlett Packard Enterprise (HPE), Intel, and Supermicro, along with many others.
“With this latest volume shipment milestone, Micron continues to lead the market in providing our customers high-capacity RDIMMs that have been qualified on all the major CPU platforms,” said Praveen Vaidyanathan, vice president and general manager of Micron’s Compute Products Group. “AI servers will now be configured with Micron’s 24GB 8-high HBM3E for GPU-attached memory and Micron’s 128GB RDIMMs for CPU-attached memory to deliver the capacity, bandwidth, and power-optimized infrastructure required for memory-intensive workloads.”
Industry Adoption
“A core tenet of our work with Micron is advancing the capabilities of data center infrastructure through highly-performant memory for compute intensive workloads,” said Dan McNamara, senior vice president and general manager, Server Business Unit, AMD. “Through this collaboration, our joint customers can now get immediate impact out of the high-capacity DDR5 memory offering from Micron in an AMD EPYC CPU powered server, delivering the performance and efficiency needed for the modern data center.”
“Adopting advanced memory capabilities, while ensuring high performance and efficiency, is critical to supporting growing AI workloads in training, tuning, and inferencing,” said Krista Satterthwaite, senior vice president and general manager, Compute at HPE. “We are committed to providing the highest-performing, most energy-efficient solutions, and through our collaboration with Micron, plan to deliver monolithic, high-density DRAM across our AI portfolio to help our enterprise customers gain optimal performance to tackle any workload.”
“Micron’s 128GB DDR5 RDIMM memory is the first 32Gb monolithic DRAM-based high-capacity DIMM that has completed Intel platform memory compatibility qualification on 4th and 5th Gen Intel® Xeon® processors,” said Dr. Dimitrios Ziakas, vice president of Memory and IO Technologies, Intel Corporation. “32Gb-density-based DDR5 DIMMs accelerate critical server and AI system configurations, bringing forward key performance, capacity, and, most importantly, power efficiency benefits to Intel® Xeon® processor-based systems. We are excited to continue our collaboration with Micron to drive broad adoption of innovative products in the market that solve memory capacity and power bottlenecks for AI and server customers.”
“Supermicro is leading the industry with the broadest accelerated server and solution portfolio based on NVIDIA, AMD, and Intel,” said Wally Liaw, senior vice president of Business Development and co-founder at Supermicro. “Savvy customers are looking for large memory footprints, performance, and efficiency improvements in their AI infrastructure. Customers can benefit significantly from Supermicro’s advanced GPU SuperServers with the new 32Gb monolithic DRAM-based 128GB memory, and we are excited to collaborate with Micron to enable this.”
Micron 128GB DDR5 RDIMM memory is available now directly from Micron and will be available through select global channel distributors and resellers in June 2024. As part of its comprehensive data center memory portfolio, Micron offers a wide array of options across DDR5 RDIMM, MCRDIMM, MRDIMM, CXL, and LPDDR5X form factors, allowing customers to integrate solutions optimized for AI and high-performance computing (HPC) applications that suit their needs for bandwidth, capacity, and power.