Saturday, 13 Dec 2025
Subscribe
logo
  • Global
  • AI
  • Cloud Computing
  • Edge Computing
  • Security
  • Investment
  • Sustainability
  • More
    • Colocation
    • Quantum Computing
    • Regulation & Policy
    • Infrastructure
    • Power & Cooling
    • Design
    • Innovations
    • Blog
Font ResizerAa
Data Center NewsData Center News
Search
  • Global
  • AI
  • Cloud Computing
  • Edge Computing
  • Security
  • Investment
  • Sustainability
  • More
    • Colocation
    • Quantum Computing
    • Regulation & Policy
    • Infrastructure
    • Power & Cooling
    • Design
    • Innovations
    • Blog
Have an existing account? Sign In
Follow US
© 2022 Foxiz News Network. Ruby Design Company. All Rights Reserved.
Data Center News > Blog > Cloud Computing > Microsoft unveils safety and security tools for generative AI
Cloud Computing

Microsoft unveils safety and security tools for generative AI

Last updated: April 1, 2024 8:02 pm
Published April 1, 2024
Share
construction site barricades
SHARE

Microsoft is including security and safety instruments to Azure AI Studio, the corporate’s cloud-based toolkit for constructing generative AI functions. The brand new instruments embrace safety in opposition to immediate injection assaults, detection of hallucinations in mannequin output, system messages to steer fashions towards secure output, mannequin security evaluations, and threat and security monitoring.

Microsoft introduced the brand new options on March 28. Security evaluations at the moment are out there in preview in Azure AI Studio. The opposite options are coming quickly, Microsoft stated. Azure AI Studio, additionally in preview, could be accessed from ai.azure.com.

Immediate shields will detect and block injection assaults and embrace a brand new mannequin to establish oblique immediate assaults earlier than they affect the mannequin. This characteristic is at the moment out there in preview in Azure AI Content material Security. Groundness detection is designed to establish text-based hallucinations, together with minor inaccuracies, in mannequin outputs. This characteristic detects “ungrounded materials” in textual content to assist the standard of LLM outputs, Microsoft stated.

Security system messages, also referred to as metaprompts, steer a mannequin’s habits towards secure and accountable outputs. Security evaluations assess an utility’s capacity to jailbreak assaults and to producing content material dangers. Along with mannequin high quality metrics, they supply metrics associated to content material and safety dangers.

Lastly, threat and security monitoring helps customers perceive what mannequin inputs, outputs, and customers are triggering content material filters to tell mitigation. This characteristic is at the moment out there in preview in Azure OpenAI Service.

See also  Microsoft, BlackRock Launch $30B AI Data Center Investment Fund

Copyright © 2024 IDG Communications, .

Source link

TAGGED: generative, Microsoft, safety, security, Tools, unveils
Share This Article
Twitter Email Copy Link Print
Previous Article Apple researchers develop AI that can 'see' and understand screen context Apple researchers develop AI that can ‘see’ and understand screen context
Next Article From AI to HDDs: Data storage predictions for 2024 The changing world of backup
Leave a comment

Leave a Reply Cancel reply

Your email address will not be published. Required fields are marked *

Your Trusted Source for Accurate and Timely Updates!

Our commitment to accuracy, impartiality, and delivering breaking news as it happens has earned us the trust of a vast audience. Stay ahead with real-time updates on the latest events, trends.
FacebookLike
TwitterFollow
InstagramFollow
YoutubeSubscribe
LinkedInFollow
MediumFollow
- Advertisement -
Ad image

Popular Posts

Cloudflare firewall reacts badly to React exploit mitigation

Throughout the identical window, Downdetector noticed a spike in drawback reviews for enterprise providers together…

December 7, 2025

The Pros and Cons of Public Cloud Storage for Data Center Backups

Knowledge backups can shortly eat massive quantities of cupboard space, particularly when working workloads in…

September 23, 2024

CrowdStrike has a new guidance hub for dealing with the Windows outage

The web page consists of technical info on what prompted the outage, what methods are…

July 21, 2024

CareSuper, Macquarie Partner on Azure Migration

Australian superannuation fund CareSuper has chosen Macquarie Cloud Companies, a enterprise unit of Macquarie Know-how…

July 7, 2025

Google continues to invest in Iowa, with another $1 billion planned for its Council Bluffs campus

Google’s Council Bluffs information heart campus has already seen latest extra spending introduced, with final…

July 8, 2024

You Might Also Like

atNorth's Iceland data centre epitomises circular economy
Cloud Computing

atNorth’s Iceland data centre epitomises circular economy

By saad
How cloud infrastructure shapes the modern Diablo experience 
Cloud Computing

How cloud infrastructure shapes the modern Diablo experience 

By saad
Microsoft ‘Promptions’ fix AI prompts failing to deliver
AI

Microsoft ‘Promptions’ fix AI prompts failing to deliver

By saad
IBM moves to buy Confluent in an $11 billion cloud and AI deal
Cloud Computing

IBM moves to buy Confluent in an $11 billion cloud and AI deal

By saad
Data Center News
Facebook Twitter Youtube Instagram Linkedin

About US

Data Center News: Stay informed on the pulse of data centers. Latest updates, tech trends, and industry insights—all in one place. Elevate your data infrastructure knowledge.

Top Categories
  • Global Market
  • Infrastructure
  • Innovations
  • Investments
Usefull Links
  • Home
  • Contact
  • Privacy Policy
  • Terms & Conditions

© 2024 – datacenternews.tech – All rights reserved

Welcome Back!

Sign in to your account

Lost your password?
We use cookies to ensure that we give you the best experience on our website. If you continue to use this site we will assume that you are happy with it.
You can revoke your consent any time using the Revoke consent button.