Small models as paralegals: LexisNexis distills models to build AI assistant

Last updated: March 21, 2025 12:14 pm
Published March 21, 2025

When legal research company LexisNexis created its AI assistant Protégé, it had to determine the best way to leverage its expertise without deploying a large model.

Protégé aims to help lawyers, associates and paralegals write and proofread legal documents, and to ensure that anything they cite in complaints and briefs is accurate. However, LexisNexis didn't want a generic legal AI assistant; it wanted to build one that learns a firm's workflow and is more customizable.

LexisNexis saw an opportunity to harness the power of large language models (LLMs) from Anthropic and Mistral and find the models that answer user questions best, Jeff Riehl, CTO of LexisNexis Legal and Professional, told VentureBeat.

“We use the best model for the specific use case as part of our multi-model approach. We use the model that provides the best result with the fastest response time,” Riehl said. “For some use cases, that will be a small language model like Mistral, or we perform distillation to improve performance and reduce cost.”

While LLMs still provide value in building AI applications, some organizations are turning to small language models (SLMs), or distilling LLMs into smaller versions of the same model.

Distillation, in which an LLM “teaches” a smaller model, has become a popular method for many organizations.
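The article doesn't describe LexisNexis's distillation setup, but the core idea of "teaching" a smaller model is standard: train the student to match the teacher's softened output distribution. The sketch below shows that objective in plain Python, with hypothetical logits standing in for real model outputs.

```python
import math

def softmax(logits, temperature=1.0):
    """Convert raw logits to probabilities, softened by a temperature."""
    scaled = [z / temperature for z in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(z - m) for z in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """Cross-entropy between the teacher's softened distribution and the
    student's -- the quantity a distilled model is trained to minimize."""
    p_teacher = softmax(teacher_logits, temperature)
    p_student = softmax(student_logits, temperature)
    return -sum(p * math.log(q) for p, q in zip(p_teacher, p_student))

# Toy check: a student that roughly matches the teacher incurs a lower
# loss than one that disagrees (all logits here are made up).
teacher = [4.0, 1.0, 0.5]
good_student = [3.8, 1.1, 0.4]
bad_student = [0.5, 4.0, 1.0]
print(distillation_loss(teacher, good_student)
      < distillation_loss(teacher, bad_student))  # True
```

In a real training loop this loss would be backpropagated through the student; the temperature softens the teacher's distribution so the student also learns the relative weight the teacher gives to wrong answers.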

Small models often work best for applications like chatbots or simple code completion, which is what LexisNexis wanted for Protégé.


This isn't the first time LexisNexis has built AI applications; it was doing so even before launching its legal research hub LexisNexis + AI in July 2024.

“We have used a lot of AI in the past, which was more around natural language processing, some deep learning and machine learning,” Riehl said. “That really changed in November 2022 when ChatGPT was launched, because prior to that, a lot of the AI capabilities were kind of behind the scenes. But once ChatGPT came out, the generative capabilities, the conversational capabilities of it were very, very intriguing to us.”

Small, fine-tuned models and model routing

Riehl said LexisNexis uses different models from most of the major model providers when building its AI platforms. LexisNexis + AI used Claude models from Anthropic, OpenAI's GPT models and a model from Mistral.

This multi-model approach helped break down each task users wanted to perform on the platform. To do this, LexisNexis had to architect its platform to switch between models.

“We'd break down whatever task was being performed into individual components, and then we would identify the best large language model to support that component. One example of that is we will use Mistral to assess the query that the user entered,” Riehl said.

For Protégé, the company wanted faster response times and models more fine-tuned for legal use cases. So it turned to what Riehl calls “fine-tuned” versions of models, essentially smaller-weight versions of LLMs or distilled models.


“You don't need GPT-4o to do the assessment of a query, so we use it for more sophisticated work, and we switch models out,” he said.

When a user asks Protégé a question about a specific case, the first model it pings is a fine-tuned Mistral “for assessing the query, then determining what the purpose and intent of that query is” before switching to the model best suited to complete the task. Riehl said the next model could be an LLM that generates new queries for the search engine, or another model that summarizes results.

Right now, LexisNexis mostly relies on a fine-tuned Mistral model, though Riehl said it used a fine-tuned version of Claude “when it first came out; we're not using it in the product today, but in other ways.” LexisNexis is also interested in using other OpenAI models, especially since the company released new reinforcement fine-tuning capabilities last year. LexisNexis is in the process of evaluating OpenAI's reasoning models, including o3, for its platforms.

Riehl added that it may also look at using Gemini models from Google.

LexisNexis backs all of its AI platforms with its own knowledge graph to perform retrieval-augmented generation (RAG), especially as Protégé may help launch agentic processes later.
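Graph-backed RAG, as described above, means retrieving facts connected to the entities a query mentions and grounding the model's answer in them. Here is a minimal sketch of that retrieval step; the graph contents and prompt format are illustrative assumptions, not LexisNexis's data or API.

```python
# Toy knowledge graph: entity -> list of (relation, target) edges.
# The case names and relations below are invented for illustration.
KNOWLEDGE_GRAPH = {
    "Smith v. Jones": [
        ("decided_by", "9th Circuit"),
        ("cites", "Brown v. Board"),
    ],
    "Brown v. Board": [
        ("decided_by", "Supreme Court"),
    ],
}

def retrieve_facts(query: str):
    """Pull the edges for every graph entity the query mentions."""
    facts = []
    for entity, edges in KNOWLEDGE_GRAPH.items():
        if entity.lower() in query.lower():
            for relation, target in edges:
                facts.append(f"{entity} --{relation}--> {target}")
    return facts

def build_prompt(query: str) -> str:
    """Prepend retrieved facts so the generation step is grounded
    in the graph rather than the model's parametric memory."""
    context = "\n".join(retrieve_facts(query)) or "(no matching facts)"
    return f"Context:\n{context}\n\nQuestion: {query}"

print(build_prompt("What court decided Smith v. Jones?"))
```

A production system would resolve entities with a dedicated linker and rank retrieved facts by relevance, but the shape — retrieve, then generate from retrieved context — is what RAG refers to.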

The AI legal suite

Even before the advent of generative AI, LexisNexis tested the potential of putting chatbots to work in the legal industry. In 2017, the company tested an AI assistant that would compete with IBM's Watson-powered Ross. Protégé sits in the company's LexisNexis + AI platform, which brings together the AI services of LexisNexis.


Protégé helps law firms with tasks that paralegals or associates tend to do. It helps write legal briefs and complaints that are grounded in firms' documents and data, suggests legal workflow next steps, suggests new prompts to refine searches, drafts questions for depositions and discovery, links quotes in filings for accuracy, generates timelines and, of course, summarizes complex legal documents.

“We see Protégé as the initial step in personalization and agentic capabilities,” Riehl said. “Think about the different types of lawyers: M&A, litigators, real estate. It's going to continue to get more and more personalized based on the specific task you do. Our vision is that every legal professional will have a personal assistant to help them do their job based on what they do, not what other lawyers do.”

Protégé now competes against other legal research and technology platforms. Thomson Reuters customized OpenAI's o1-mini model for its CoCounsel legal assistant. Harvey, which raised $300 million from investors including LexisNexis, also has a legal AI assistant.

