Micron Unveils 192GB Low-Power Memory Module for AI Data Centers

Last updated: October 26, 2025 10:41 pm
Published October 26, 2025
Micron Technology is expanding its foothold in the rapidly growing artificial intelligence hardware market with a new generation of low-power memory modules aimed at improving the efficiency and scalability of AI data centers. The company has begun customer sampling of its 192GB SOCAMM2 (Small Outline Compression Attached Memory Module).

It is a next-generation low-power DRAM solution designed to meet the increasing performance and energy demands of AI workloads.

As AI systems evolve to process larger datasets and more complex models, memory has become one of the most critical components of data center infrastructure. The SOCAMM2 module builds on Micron's first-generation LPDRAM SOCAMM architecture, offering 50 percent higher capacity in the same compact form factor. According to Micron, this advance significantly enhances the ability of AI servers to handle real-time inference tasks, with the potential to reduce time-to-first-token (TTFT) latency by more than 80 percent in some applications.

The new module also delivers more than 20 percent greater power efficiency, thanks to Micron's latest 1-gamma DRAM manufacturing process. This improvement could have significant implications for hyperscale AI deployments, where rack-level configurations can include tens of terabytes of CPU-attached memory. Lower power draw translates directly into reduced operational costs and smaller carbon footprints – key considerations for operators seeking to balance growth with sustainability.
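To put those rack-level figures in perspective, here is a back-of-the-envelope sketch. The 192GB module capacity and the ">20 percent" efficiency figure come from the announcement; the rack memory target and the per-module wattage are assumed values chosen purely for illustration, not Micron specifications.

```python
# Illustrative arithmetic: module count for a "tens of terabytes" rack
# built from 192GB SOCAMM2 modules, and what a 20% efficiency gain
# could mean for rack-level memory power draw.

MODULE_CAPACITY_GB = 192          # per the announcement
RACK_MEMORY_TB = 40               # assumed rack-level CPU-attached memory
ASSUMED_WATTS_PER_MODULE = 10.0   # hypothetical baseline module power
EFFICIENCY_GAIN = 0.20            # ">20 percent greater power efficiency"

modules = (RACK_MEMORY_TB * 1024) // MODULE_CAPACITY_GB
baseline_power_w = modules * ASSUMED_WATTS_PER_MODULE
saved_w = baseline_power_w * EFFICIENCY_GAIN

print(f"{modules} modules, baseline {baseline_power_w:.0f} W, "
      f"~{saved_w:.0f} W saved per rack")
# → 213 modules, baseline 2130 W, ~426 W saved per rack
```

Even with these rough assumptions, the savings compound across hundreds of racks, which is why the article frames power efficiency as an operational-cost and sustainability lever rather than a spec-sheet detail.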

Micron’s work with low-power DRAM (LPDRAM) technology builds on a five-year collaboration with NVIDIA, one of the leading forces in AI computing. The SOCAMM2 modules bring the high bandwidth and low power consumption traditionally associated with mobile LPDDR5X technology to the data center, adapting it for the rigorous demands of large-scale AI inference and training environments. The result is a high-throughput, energy-efficient memory subsystem tailored for next-generation AI servers, designed to meet the needs of models with massive contextual data requirements.

The company’s latest innovation is part of a broader industry trend toward optimizing data center hardware for AI workloads. With power-hungry generative AI systems now driving infrastructure expansion, the need for energy-efficient components has become a top priority. Memory performance directly impacts model responsiveness, and bottlenecks in data transfer or latency can significantly degrade throughput across large clusters. Micron’s SOCAMM2 addresses these challenges with its compact form factor – one-third the size of a standard RDIMM – while increasing total capacity, bandwidth, and thermal performance. The smaller footprint also allows for more flexible server designs, including liquid-cooled configurations aimed at managing the thermal load of dense AI compute environments.

Micron has emphasized that SOCAMM2 modules meet data center-class quality and reliability standards, benefiting from the company’s long-standing expertise in high-performance DDR memory. Specialized testing and design adaptations ensure that the modules maintain consistency and endurance under sustained, high-intensity workloads.

In addition to product development, Micron is playing a role in shaping industry standards. The company is actively participating in JEDEC’s ongoing work to define SOCAMM2 specifications and collaborating with partners across the ecosystem to accelerate the adoption of low-power DRAM technologies in AI data centers.

Micron is currently shipping SOCAMM2 samples to customers in capacities of up to 192GB per module, operating at speeds of up to 9.6Gbps. Full-scale production is expected to align with customer system launch timelines later this year.

The introduction of SOCAMM2 underscores how power-efficient memory is becoming central to the next phase of AI infrastructure design. As hyperscalers and enterprise operators seek to build faster, greener data centers, Micron’s latest innovation signals a shift toward hardware architectures optimized for both performance and sustainability.

By leveraging decades of semiconductor expertise and its deep involvement in the AI ecosystem, Micron is positioning SOCAMM2 as a foundational component in the industry’s move toward more efficient, high-capacity AI computing platforms.
