University of Minnesota device slashes AI energy consumption

Last updated: July 29, 2024 4:47 pm
Published July 29, 2024

Researchers at the University of Minnesota Twin Cities have developed a cutting-edge hardware device that could reduce AI energy consumption by a factor of at least 1,000.

This breakthrough represents a significant leap forward in the quest for more energy-efficient AI applications.

Addressing the energy demands of AI

With AI applications increasingly prevalent, there is a pressing need to improve energy efficiency without compromising performance or escalating costs.

Traditional AI processes consume vast amounts of power by constantly shuttling data between logic (processing) and memory (storage).

The University of Minnesota's new model, called computational random-access memory (CRAM), addresses this challenge by keeping data within the memory for processing.

"This work is the first experimental demonstration of CRAM, where data can be processed entirely within the memory array without needing to leave the grid where a computer stores information," explained Yang Lv, a postdoctoral researcher in the Department of Electrical and Computer Engineering and lead author of the study.

CRAM: A game-changer in AI energy efficiency

The International Energy Agency (IEA) predicts that AI energy consumption will double from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026, comparable to Japan's total electricity consumption.

CRAM-based machine learning inference accelerators could achieve energy improvements of up to 1,000 times, with some applications seeing energy savings of 2,500 and 1,700 times compared with traditional methods.
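To put those factors in perspective, a quick back-of-envelope calculation shows how the quoted improvement factors would shrink a given energy budget. The 100 TWh workload share below is a hypothetical figure chosen for illustration, not a number from the report:

```python
# Back-of-envelope arithmetic for the figures quoted above (illustrative only;
# real savings depend heavily on workload and deployment).
iea_2026_projection_twh = 1000   # IEA projection for AI consumption in 2026
baseline_2022_twh = 460          # reported 2022 figure

# Claimed energy-improvement factors for CRAM-based inference accelerators
improvement_factors = [1000, 2500, 1700]

for factor in improvement_factors:
    # If a workload consuming 100 TWh (hypothetical share) ran on CRAM hardware
    # at this improvement factor, its consumption would shrink to:
    hypothetical_workload_twh = 100
    reduced = hypothetical_workload_twh / factor
    print(f"{factor}x improvement: 100 TWh -> {reduced:.3f} TWh")
```

Even against the projected 1,000 TWh total, a three-order-of-magnitude reduction on any sizeable slice of that demand would be significant.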

"Our initial concept to use memory cells directly for computing 20 years ago was considered crazy," said Jian-Ping Wang, senior author of the paper and a Distinguished McKnight Professor at the University of Minnesota.

The interdisciplinary team, comprising experts in physics, materials science, computer science, and engineering, has been developing this technology since 2003.

The research builds on patented work on Magnetic Tunnel Junctions (MTJs), nanostructured devices used in hard drives, sensors, and other microelectronic systems, including Magnetic Random Access Memory (MRAM).

CRAM leverages these advances to perform computations directly within memory cells, eliminating the slow, energy-intensive data transfers typical of traditional architectures.

Breaking the von Neumann bottleneck

The CRAM architecture overcomes the bottleneck of the traditional von Neumann architecture, in which computation and memory are separate entities.
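A toy model makes the bottleneck concrete. The per-operation energies below are illustrative assumptions, not measurements from the study; the point is only that paying a memory transfer for every operand and result dwarfs the cost of the arithmetic itself:

```python
# A toy model of why data movement dominates: in a von Neumann machine each
# operation pays to move operands between memory and the processor, while an
# in-memory architecture computes where the data already lives. All energy
# numbers below are illustrative assumptions, not figures from the study.
PJ_PER_DRAM_ACCESS = 100.0   # assumed energy to move one operand to/from memory (pJ)
PJ_PER_ALU_OP = 1.0          # assumed energy for the arithmetic itself (pJ)

def von_neumann_energy(num_ops, operands_per_op=2):
    """Each op fetches its operands from memory and writes one result back."""
    transfers = num_ops * (operands_per_op + 1)
    return transfers * PJ_PER_DRAM_ACCESS + num_ops * PJ_PER_ALU_OP

def in_memory_energy(num_ops):
    """Operands never leave the memory array; only the compute energy is paid."""
    return num_ops * PJ_PER_ALU_OP

ops = 1_000_000
ratio = von_neumann_energy(ops) / in_memory_energy(ops)
print(f"data movement inflates energy by ~{ratio:.0f}x in this toy model")
```

With these assumed costs, the movement-free architecture comes out roughly 300 times cheaper per operation; the real-world gap depends on the memory technology and access pattern.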

"CRAM is very flexible; computation can be performed in any location in the memory array," said Ulya Karpuzcu, an Associate Professor and expert in computing architecture.

This flexibility allows CRAM to match the performance needs of diverse AI algorithms more efficiently than traditional systems.

CRAM uses significantly less energy than current random access memory (RAM) devices, which rely on multiple transistors to store data.

By employing MTJs, a type of spintronic device that uses electron spin rather than electrical charge, CRAM provides a more efficient alternative to traditional transistor-based chips.
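For illustration, an MTJ can be sketched as a two-state resistor: parallel magnetic layers give low resistance, antiparallel layers give high resistance, and a read circuit distinguishes the two states by the current they pass. The resistance and voltage values below are assumed for the sketch, not taken from the paper:

```python
# A minimal sketch of how an MTJ stores a bit: the junction's resistance depends
# on whether its two magnetic layers are parallel (low R) or antiparallel
# (high R). The resistance values and read voltage are assumed for illustration.
R_PARALLEL = 1_000.0       # ohms, assumed low-resistance (parallel) state
R_ANTIPARALLEL = 2_000.0   # ohms, assumed high-resistance (antiparallel) state
READ_VOLTAGE = 0.1         # volts

def read_bit(resistance_ohms):
    """Distinguish the stored bit by the current a fixed read voltage drives."""
    current = READ_VOLTAGE / resistance_ohms
    threshold = READ_VOLTAGE / ((R_PARALLEL + R_ANTIPARALLEL) / 2)
    return 0 if current > threshold else 1  # high current -> low R -> parallel -> 0

# Tunnel magnetoresistance (TMR) ratio: the relative resistance swing between states
tmr = (R_ANTIPARALLEL - R_PARALLEL) / R_PARALLEL
print(f"TMR ratio: {tmr:.0%}")
print(read_bit(R_PARALLEL), read_bit(R_ANTIPARALLEL))
```

Because the bit is held magnetically rather than as charge on transistors, no power is needed to retain it, which is part of what makes MTJ-based memory attractive for low-energy computing.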

The University of Minnesota team is now collaborating with semiconductor industry leaders to scale up demonstrations and produce the hardware needed to reduce AI energy consumption at larger scale.

The development of CRAM technology represents a monumental step towards sustainable AI computing.

By dramatically reducing AI energy consumption while maintaining high performance, this innovation promises to meet the growing demands of AI applications and pave the way for a more efficient and environmentally friendly future.
