Tokenization takes the lead in the fight for data security

Last updated: December 15, 2025 5:26 pm
Published December 15, 2025

Contents
  • The tokenization differentiator
  • The business value of tokenization
  • Breaking down adoption barriers

Presented by Capital One Software


Tokenization is emerging as a cornerstone of modern data security, helping businesses separate the value of their data from its risk. In this VB in Conversation, Ravi Raghu, president of Capital One Software, talks about the ways tokenization can help reduce the value of breached data while preserving the underlying data's format and usefulness, along with Capital One's own experience leveraging tokenization at scale.

Tokenization, Raghu asserts, is a far superior technology. It converts sensitive data into a nonsensitive digital substitute, called a token, that maps back to the original, which is secured in a digital vault. The token placeholder preserves both the format and the utility of the sensitive data, and can be used across applications, including AI models. Because tokenization removes the need to manage encryption keys or dedicate compute to constant encrypting and decrypting, it offers one of the most scalable ways for companies to protect their most sensitive data, he added.

“The killer part, from a security standpoint, when you think about it relative to other methods, is that if a bad actor gets hold of the data, they get hold of tokens,” he explained. “The actual data is not sitting with the token, unlike other methods such as encryption, where the actual data sits there, just waiting for someone to get hold of a key or use brute force to get to the real data. From every angle, this is the best way to go about protecting sensitive data.”
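
To make the mechanics concrete, here is a minimal Python sketch of vault-based tokenization, with hypothetical names throughout (illustrative only, not Capital One's implementation): the token keeps the original value's format, and only the vault can map it back.

```python
import secrets

# Stand-in for a secured digital vault, mapping token -> original value.
vault: dict[str, str] = {}

def tokenize(value: str) -> str:
    # Swap each digit for a random digit and keep separators, so the
    # token preserves the original's format (an SSN keeps its
    # XXX-XX-XXXX shape). A production system would also handle the
    # rare token collision; this sketch does not.
    token = "".join(
        secrets.choice("0123456789") if ch.isdigit() else ch
        for ch in value
    )
    vault[token] = value  # only the vault holds the path back
    return token

def detokenize(token: str) -> str:
    # Authorized callers resolve the token through the vault; a stolen
    # token by itself reveals nothing about the real value.
    return vault[token]

ssn_token = tokenize("123-45-6789")
print(ssn_token)  # e.g. "804-31-5527": same shape, no intrinsic value
assert detokenize(ssn_token) == "123-45-6789"
```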

The tokenization differentiator

Most organizations are just scratching the surface of data protection, adding security at the very end, when data is read, to prevent an end user from accessing it. At minimum, organizations should focus on securing data on write, as it is being stored. But best-in-class organizations go even further, protecting data at birth, the moment it is created.
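
As a rough sketch of the "at birth" idea (hypothetical names, assuming some `tokenize` function is available), the sensitive field below is tokenized the moment the record is constructed, so plaintext never even reaches the write path:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class CustomerRecord:
    name: str
    ssn_token: str  # the stored record only ever carries the token

def create_customer(name: str, ssn: str,
                    tokenize: Callable[[str], str]) -> CustomerRecord:
    # "Protection at birth": the sensitive field is tokenized the
    # instant the record is created, rather than on write (as it is
    # stored) or on read (as an end user queries it).
    return CustomerRecord(name=name, ssn_token=tokenize(ssn))
```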

At one end of the security spectrum is a simple lock-and-key approach that restricts access but leaves the underlying data intact. More advanced methods, like masking or modifying data, permanently alter its meaning, which can compromise its usefulness. File-level encryption offers broader protection for large volumes of stored data, but when you get down to field-level encryption (for example, a Social Security number), it becomes a bigger challenge. It takes a lot of compute to encrypt a single field, and then to decrypt it at the point of use. And still it has a fatal flaw: the original data is still right there, needing only the key to be accessed.

Tokenization avoids these pitfalls by replacing the original data with a surrogate that has no intrinsic value. If the token is intercepted, whether by the wrong person or the wrong machine, the data itself remains secure.

The business value of tokenization

“Fundamentally you’re protecting data, and that’s valuable,” Raghu said. “Another thing that’s valuable: can you then use that data for modeling purposes? On the one hand it’s a security thing, and on the other hand it’s a business-enabling thing.”

Because tokenization preserves the structure and ordinality of the original data, it can still be used for modeling and analytics, turning security into a business enabler. Take private health data governed by HIPAA, for example: tokenization means that data can be used to build pricing models or for gene therapy research, while remaining compliant.
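
One way to see why tokenized data stays useful for analytics, as a hypothetical sketch: a deterministic token maps equal inputs to equal tokens, so counts, joins, and group-bys computed over tokens match those computed over the raw values. (Fully order-preserving schemes exist as well but are more involved and not shown here.)

```python
import hashlib
import hmac
from collections import Counter

KEY = b"demo-key"  # illustrative only; use a managed secret in practice

def det_token(value: str) -> str:
    # Deterministic keyed mapping: equal inputs yield equal tokens, so
    # joins, group-bys, and counts survive tokenization intact.
    return hmac.new(KEY, value.encode(), hashlib.sha256).hexdigest()[:12]

visits = ["patient-1", "patient-2", "patient-1", "patient-3", "patient-1"]
token_counts = Counter(det_token(p) for p in visits)
raw_counts = Counter(visits)

# The visit distribution is identical under tokenization.
assert sorted(token_counts.values()) == sorted(raw_counts.values())
```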

“If your data is already protected, you can then proliferate the usage of data across the entire enterprise and have everybody creating more and more value out of the data,” Raghu said. “Conversely, if you don’t have that, there’s a lot of reticence among enterprises today to have more people access it, or to have more and more AI agents access their data. Ironically, they’re limiting the blast radius of innovation. The tokenization impact is huge, and there are many metrics you can use to measure it: operational impact, revenue impact, and obviously the peace of mind from a security standpoint.”

Breaking down adoption barriers

Until now, the fundamental challenge with traditional tokenization has been performance. AI demands unprecedented scale and speed. That is one of the major challenges Capital One addresses with Databolt, its vaultless tokenization solution, which can produce up to four million tokens per second.

“Capital One has gone through tokenization for more than a decade. We started doing it because we’re serving our 100 million banking customers. We want to protect that sensitive data,” Raghu said. “We’ve eaten our own dog food with our internal tokenization capability, over 100 billion times a month. We’ve taken that technology and that capability, scale, and speed, and innovated so that the world can leverage it, so that it’s a commercial offering.”

Vaultless tokenization is an advanced form of tokenization that doesn’t require a central database (vault) to store token mappings. Instead, it uses mathematical algorithms, cryptographic techniques, and deterministic mapping to generate tokens dynamically. This approach is faster, more scalable, and eliminates the security risk associated with managing a vault.
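
As a sketch of the general vaultless technique (not necessarily how Databolt itself works), a token can be derived from the value with a keyed cryptographic function, so no mapping table exists to steal. The one-way variant below suits cases where detokenization is never needed; reversible vaultless systems typically use format-preserving encryption, such as NIST FF1, instead.

```python
import hashlib
import hmac

KEY = b"demo-key"  # illustrative only

def vaultless_token(ssn: str) -> str:
    # Deterministically derive a format-preserving, 9-digit token from
    # the input. The same SSN always maps to the same token, and there
    # is no central token->value database to manage or breach.
    digest = hmac.new(KEY, ssn.encode(), hashlib.sha256).digest()
    digits = "".join(str(b % 10) for b in digest[:9])
    return f"{digits[:3]}-{digits[3:5]}-{digits[5:]}"

print(vaultless_token("123-45-6789"))  # same shape, derived, vault-free
```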

“We realized that for the scale and speed demands we had, we needed to build out that capability ourselves,” Raghu said. “We’ve been iterating continuously on making sure it can scale up to hundreds of billions of operations a month. All of our innovation has been around building IP and capability to do this at a battle-tested scale within our enterprise, for the purpose of serving our customers.”

While conventional tokenization methods can involve some complexity and slow down operations, Databolt integrates seamlessly with encrypted data warehouses, allowing businesses to maintain strong security without slowing performance or operations. Tokenization occurs in the customer’s environment, removing the need to communicate with an external network to perform tokenization operations, which can also slow performance.

“We believe that, fundamentally, tokenization should be easy to adopt,” Raghu said. “You should be able to secure your data very quickly and operate at the speed, scale, and cost that organizations need. I think that’s been a critical barrier so far to the mass-scale adoption of tokenization. In an AI world, that’s going to become a huge enabler.”

Don’t miss the full conversation with Ravi Raghu, president of Capital One Software, here.


Sponsored articles are content produced by a company that is either paying for the post or has a business relationship with VentureBeat, and they’re always clearly marked. For more information, contact sales@venturebeat.com.
