Data Center News
Innovations

University of Minnesota device slashes AI energy consumption

Last updated: July 29, 2024 4:47 pm
Published July 29, 2024

Researchers at the University of Minnesota Twin Cities have developed a cutting-edge hardware device that could reduce AI energy consumption by a factor of at least 1,000.

This breakthrough represents a significant leap forward in the quest for more energy-efficient AI applications.

Addressing the energy demands of AI

With AI applications increasingly prevalent, there is a pressing need to improve energy efficiency without compromising performance or escalating costs.

Traditional AI processes consume vast amounts of power by constantly transferring data between logic (processing) and memory (storage).

The University of Minnesota's new model, called computational random-access memory (CRAM), addresses this challenge by keeping data within the memory for processing.

"This work is the first experimental demonstration of CRAM, where data can be processed entirely within the memory array without needing to leave the grid where a computer stores information," explained Yang Lv, a postdoctoral researcher in the Department of Electrical and Computer Engineering and lead author of the study.

CRAM: A game-changer in AI energy efficiency

The International Energy Agency (IEA) predicts that AI energy consumption will double from 460 terawatt-hours (TWh) in 2022 to 1,000 TWh in 2026, comparable to Japan's total electricity consumption.

CRAM-based machine learning inference accelerators could achieve energy improvements of up to 1,000 times, with some applications seeing energy savings of 2,500 and 1,700 times compared to traditional methods.

"Our initial concept to use memory cells directly for computing 20 years ago was considered crazy," said Jian-Ping Wang, senior author of the paper and a Distinguished McKnight Professor at the University of Minnesota.

The interdisciplinary team, comprising experts in physics, materials science, computer science, and engineering, has been developing this technology since 2003.

The research builds on patented work into Magnetic Tunnel Junctions (MTJs), nanostructured devices used in hard drives, sensors, and other microelectronic systems, including Magnetic Random Access Memory (MRAM).

CRAM leverages these advances to perform computations directly within memory cells, eliminating the slow, energy-intensive data transfers typical of traditional architectures.

Breaking the von Neumann bottleneck

CRAM's architecture overcomes the bottleneck of the traditional von Neumann architecture, in which computation and memory are separate entities.

"CRAM is very flexible; computation can be performed in any location in the memory array," said Ulya Karpuzcu, an Associate Professor and expert in computing architecture.

This flexibility allows CRAM to match the performance needs of diverse AI algorithms more efficiently than traditional systems.

CRAM uses significantly less energy than current random access memory (RAM) devices, which rely on multiple transistors to store data.

By using MTJs, a type of spintronic device that exploits electron spin rather than electrical charge, CRAM provides a more efficient alternative to traditional transistor-based chips.
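To see why moving computation into memory matters so much, a toy back-of-envelope model helps. The energy figures below are purely hypothetical illustrations (not measurements from the CRAM study); the point is only that when the per-operand transfer cost dwarfs the compute cost, as it does in conventional von Neumann systems, removing the transfer term dominates the savings.

```python
# Toy model of the von Neumann bottleneck: every operation pays a compute
# cost plus a data-movement cost. In-memory computing drops the latter.
# All energy numbers are hypothetical, for illustration only.

def inference_energy_pj(n_ops: int, e_compute_pj: float, e_transfer_pj: float) -> float:
    """Total energy in picojoules for n_ops operations, each paying a
    fixed compute cost and a per-operand data-transfer cost."""
    return n_ops * (e_compute_pj + e_transfer_pj)

n_ops = 1_000_000  # operations in a small inference pass (illustrative)

# Conventional architecture: data shuttles between memory and logic.
von_neumann = inference_energy_pj(n_ops, e_compute_pj=0.1, e_transfer_pj=10.0)

# In-memory computing: the transfer term vanishes.
in_memory = inference_energy_pj(n_ops, e_compute_pj=0.1, e_transfer_pj=0.0)

print(f"energy ratio: {von_neumann / in_memory:.0f}x")  # prints "energy ratio: 101x"
```

With these made-up costs the savings are about 100x; the reported CRAM figures of 1,000x and beyond additionally reflect the efficiency of MTJ-based cells themselves.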

The University of Minnesota team is now collaborating with semiconductor industry leaders to scale up demonstrations and produce the hardware needed to reduce AI energy consumption at a larger scale.

The development of CRAM technology represents a major step towards sustainable AI computing.

By dramatically reducing AI energy consumption while maintaining high performance, this innovation promises to meet the growing demands of AI applications and pave the way for a more efficient, environmentally friendly future.
