Qualcomm has unveiled two new AI accelerator chips for the booming data center market, taking direct aim at GPU king Nvidia’s AI market dominance. The company also secured Saudi Arabia’s Humain as the first customer for the new chips.
The semiconductor firm, which has so far focused on chips for mobile and wireless devices, said its AI200 and AI250 chips will deliver rack-scale performance with a new memory architecture for enhanced AI inference at lower cost. The new chips will be commercially available in 2026 (the AI200) and 2027 (the AI250).
Booming AI demand has spurred a global race to outfit data centers with more AI processing power. According to research firm MarketsandMarkets, the global AI data center market is projected to grow from $236 billion in 2025 to more than $933 billion by 2030. Nvidia holds a 92% share of the current data center market, according to IoT Analytics.
A Challenge for the AI Chip Leader?
Most of Nvidia’s dominance has come from AI training, where its high-powered GPUs are the preferred hardware for handling those workloads. Nvidia is on track to generate more than $180 billion in revenue from data center operations this year.
But experts see an opportunity to challenge Nvidia when it comes to inference, as compute needs shift. Qualcomm’s new chips will combine Oryon CPUs, Hexagon NPU acceleration, and LPDDR memory with liquid cooling, scaling over PCIe and Ethernet.
“Qualcomm is serious about data center inference efficiency,” Patrick Moorhead, chief analyst and CEO of Moor Insights & Strategy, said in a LinkedIn post. “If it executes, it could evolve from being known for mobile and edge efficiency to becoming a leader in rack-scale AI performance-per-watt – a huge shift in how the market sees Qualcomm’s role in the broader AI ecosystem.”
Banking on Inference
Qualcomm said the new chips will be part of a multi-generational data center AI inference roadmap. The company’s AI software stack supports machine learning frameworks, inference engines, and generative AI frameworks, along with inference optimization techniques like disaggregated serving.
Matt Kimball, VP and principal analyst at Moor Insights & Strategy, told DCN that Qualcomm’s inference play is a smart strategy.
“This is a bold move from Qualcomm that validates the inference market opportunity,” he said in an email interview. “Qualcomm is a company that has been very smart about the markets it chooses to enter – and when it chooses to enter those markets. So, this tells me we’re about to see inference market growth accelerate… I think there’s a longer-term enterprise play here for the company as well… at some point, these rack-scale systems are going to find their way into the enterprise.”
Durga Malladi, Qualcomm’s senior vice president and general manager for technology planning, edge solutions and data center, said the solutions offer cost and flexibility advantages over competitors.
“We’re redefining what’s possible for rack-scale AI inference,” he said in a statement, adding that the company’s software stack and open ecosystem support “make it easier than ever for developers and enterprises to integrate, manage, and scale already trained AI models on our optimized AI inference solutions.”
Wall Street appeared to welcome the news. Qualcomm shares rose more than 20% in Monday trading, the stock’s biggest rally since 2019.
Saudi Arabia’s AI startup Humain plans to deploy 200 megawatts’ worth of the new chips starting in 2026, Qualcomm said.
Daniel Newman, analyst and CEO at Futurum Group, said Qualcomm’s new AI chips will “catapult” the company into the AI arms race. “We see this as a massive inflection with more than $10 billion in potential revenue for the company over the next few years and significant upside if it executes in key markets,” he wrote in a LinkedIn post.
