Innovations

New TUM training model slashes AI energy consumption

Last updated: March 7, 2025 6:54 pm
Published March 7, 2025

The energy consumption of AI systems, particularly large language models (LLMs), has raised concerns about sustainability.

These systems rely on data centres, which require vast amounts of electricity for computing, storage, and data transmission. In Germany alone, data centres consumed roughly 16 billion kWh in 2020 – around 1% of the country's total energy usage.

By 2025, this figure is projected to rise to 22 billion kWh, reflecting the growing demand for AI-powered services.

To tackle this problem, researchers at the Technical University of Munich (TUM) have developed a novel training method that cuts AI energy consumption significantly.

What drives AI energy consumption?

It is increasingly evident that the energy consumption of AI poses a significant environmental challenge.

At the core of the issue is the immense computational power required to train and operate advanced AI models. These models must process vast datasets, leading to prolonged, intensive use of powerful hardware such as GPUs and TPUs, which consumes large amounts of electricity.

This high energy demand is further amplified by AI's reliance on data centres, which need substantial power for both computation and cooling.

According to research from sources such as Built In, generating a single image with an AI image generator can use as much energy as fully charging a smartphone – a tangible example of AI's power consumption.

Moreover, the International Energy Agency (IEA) has highlighted that interactions with AI systems such as ChatGPT can consume significantly more electricity than standard search engine queries.

The IEA also states that the rise in electricity consumption by data centres, cryptocurrencies, and AI between 2022 and 2026 could equal the entire electricity consumption of Sweden or Germany – underlining the scale of AI energy consumption.

Furthermore, reports project a substantial increase in data centre energy consumption in the coming years, driven largely by the proliferation of AI.

For example, McKinsey & Company projects that power demand from data centres in the United States will reach 606 terawatt-hours (TWh) by 2030, up from 147 TWh in 2023 – a stark illustration of AI's rapidly growing appetite for energy.

To address this challenge, TUM researchers have developed a training method that is 100 times faster while maintaining accuracy comparable to existing techniques.

This breakthrough has the potential to significantly reduce AI energy consumption, making large-scale AI adoption more sustainable.

Understanding neural networks

AI systems rely on artificial neural networks, which are inspired by the human brain. These networks consist of interconnected nodes – artificial neurons – that process input signals.

Each connection is weighted with specific parameters, and when the input exceeds a threshold, the signal is passed on to the next layer.
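
This weighted-sum-and-threshold behaviour of a single artificial neuron can be sketched in a few lines (the weights, bias, and threshold values below are arbitrary illustrative choices):

```python
import numpy as np

def neuron(inputs, weights, bias, threshold=0.0):
    """Weighted sum of the input signals; the neuron only passes
    the signal forward when the sum exceeds the threshold."""
    activation = float(np.dot(inputs, weights) + bias)
    return activation if activation > threshold else 0.0

# 0.4*1.0 + 0.6*0.5 - 0.2 = 0.5, which clears the threshold of 0.0
print(neuron(np.array([1.0, 0.5]), np.array([0.4, 0.6]), bias=-0.2))
```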

Training a neural network involves adjusting these parameters over many iterations to improve its predictions. This process is computationally expensive, however, and contributes to high electricity usage.
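
To see why this is costly, consider the usual iterative recipe for fitting even a single parameter: compute an error gradient over the data, nudge the parameter, and repeat. This toy sketch does that a thousand times; a real model repeats it for billions of parameters:

```python
import numpy as np

# Toy example: learn w so that w*x matches the target y = 2*x.
rng = np.random.default_rng(0)
x = rng.uniform(-1.0, 1.0, 100)
y = 2.0 * x

w, lr = 0.0, 0.1
for _ in range(1000):                      # many full passes over the data...
    grad = np.mean(2.0 * (w * x - y) * x)  # ...each computing a gradient of
    w -= lr * grad                         # the mean squared error
print(round(w, 3))  # converges to 2.0
```

Each iteration touches every data point, so the electricity bill scales with both dataset size and iteration count.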

A more efficient training method

Felix Dietrich, a professor specialising in physics-enhanced machine learning, and his research team have introduced an innovative approach to neural network training.

Instead of relying on traditional iterative methods, their technique uses probabilistic parameter selection.

The method focuses on identifying critical points in the training data – where rapid and significant changes occur – and assigns parameter values based on probability distributions.

By concentrating on these key regions of the dataset, the approach dramatically reduces the number of required iterations, leading to substantial energy savings.

Real-world applications

The new training technique holds considerable promise for a variety of applications. Energy-efficient AI models could be used in climate modelling, financial market analysis, and other dynamic systems that demand rapid data processing.

By shrinking the energy footprint of AI training, the method not only lowers operational costs but also aligns AI development with global sustainability goals.

A greener AI future

The rapid expansion of AI applications calls for a sustainable approach to energy consumption.

With data centre electricity usage expected to rise, adopting energy-efficient training methods is essential. The TUM team's breakthrough marks a significant step towards making AI more environmentally friendly without compromising performance.

As the technology evolves, innovations like this will play a pivotal role in shaping a more sustainable digital future.
