Data Center News
Global Market
Micron Unveils 192GB Low-Power Memory Module for AI Data Centers

Last updated: October 26, 2025 10:41 pm
Published October 26, 2025
Micron Technology is expanding its foothold in the rapidly growing artificial intelligence hardware market with a new generation of low-power memory modules aimed at improving the efficiency and scalability of AI data centers. The company has begun customer sampling of its 192GB SOCAMM2 (Small Outline Compression Attached Memory Module).

SOCAMM2 is a next-generation low-power DRAM solution designed to meet the increasing performance and energy demands of AI workloads.

As AI systems evolve to process larger datasets and more complex models, memory has become one of the most critical components of data center infrastructure. The SOCAMM2 module builds on Micron's first-generation LPDRAM SOCAMM architecture, offering 50 percent higher capacity in the same compact form factor. According to Micron, this advance significantly improves the ability of AI servers to handle real-time inference tasks, with the potential to reduce time-to-first-token (TTFT) latency by more than 80 percent in some applications.
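The 50 percent figure is easy to sanity-check. Assuming the first-generation SOCAMM module topped out at 128GB (an assumption; the article does not state the first-generation capacity), the arithmetic works out as follows:

```python
# Back-of-the-envelope check of the capacity claim.
# Assumption: the first-generation SOCAMM module was 128GB.
first_gen_gb = 128   # assumed first-generation capacity
socamm2_gb = 192     # capacity stated by Micron

increase = (socamm2_gb - first_gen_gb) / first_gen_gb
print(f"Capacity increase: {increase:.0%}")  # → Capacity increase: 50%
```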

The new module also delivers more than 20 percent greater power efficiency, thanks to Micron's latest 1-gamma DRAM manufacturing process. This improvement could have significant implications for hyperscale AI deployments, where rack-level configurations can include tens of terabytes of CPU-attached memory. Lower power draw translates directly into reduced operational costs and smaller carbon footprints – key considerations for operators seeking to balance growth with sustainability.

Micron's work with low-power DRAM (LPDRAM) technology builds on a five-year collaboration with NVIDIA, one of the leading forces in AI computing. The SOCAMM2 modules bring the high bandwidth and low power consumption traditionally associated with mobile LPDDR5X technology to the data center, adapting it for the rigorous demands of large-scale AI inference and training environments. The result is a high-throughput, energy-efficient memory system tailored for next-generation AI servers and designed to meet the needs of models with massive context requirements.

The company's latest innovation is part of a broader industry trend toward optimizing data center hardware for AI workloads. With power-hungry generative AI systems now driving infrastructure expansion, the need for energy-efficient components has become a top priority. Memory performance directly affects model responsiveness, and bottlenecks in data transfer or latency can significantly degrade throughput across large clusters. Micron's SOCAMM2 addresses these challenges with its compact form factor – one-third the size of a standard RDIMM – while increasing total capacity, bandwidth, and thermal performance. The smaller footprint also allows for more flexible server designs, including liquid-cooled configurations aimed at managing the thermal load of dense AI compute environments.

Micron has emphasized that SOCAMM2 modules meet data center-class quality and reliability standards, benefiting from the company's long-standing expertise in high-performance DDR memory. Specialized testing and design adaptations ensure that the modules maintain consistency and endurance under sustained, high-intensity workloads.

In addition to product development, Micron is playing a role in shaping industry standards. The company is actively participating in JEDEC's ongoing work to define SOCAMM2 specifications and is collaborating with partners across the ecosystem to accelerate the adoption of low-power DRAM technologies in AI data centers.

Micron is currently shipping SOCAMM2 samples to customers in capacities of up to 192GB per module, operating at speeds of up to 9.6Gbps. Full-scale production is expected to align with customer system launch timelines later this year.
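The 9.6Gbps figure is a per-pin data rate, so per-module bandwidth depends on the bus width. Assuming a 128-bit data bus per module (an assumption about the SOCAMM form factor, not a figure from the announcement), the estimate looks like this:

```python
# Rough per-module bandwidth estimate.
# Assumption: a 128-bit data bus per SOCAMM2 module; only the
# 9.6Gbps per-pin rate comes from the announcement.
gbps_per_pin = 9.6
bus_width_bits = 128  # assumed bus width

bandwidth_gb_s = gbps_per_pin * bus_width_bits / 8  # bits → bytes
print(f"~{bandwidth_gb_s:.1f} GB/s per module")  # → ~153.6 GB/s per module
```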

The introduction of SOCAMM2 underscores how power-efficient memory is becoming central to the next phase of AI infrastructure design. As hyperscalers and enterprise operators seek to build faster, greener data centers, Micron's latest innovation signals a shift toward hardware architectures optimized for both performance and sustainability.

By leveraging decades of semiconductor expertise and its deep involvement in the AI ecosystem, Micron is positioning SOCAMM2 as a foundational component in the industry's move toward more efficient, high-capacity AI computing platforms.
