Pure Storage, a pioneer in delivering advanced data storage technology and services, unveiled a set of new AI infrastructure solutions and validated reference architectures developed in partnership with Nvidia at last week's GTC conference. The announcement aimed to provide enterprises with proven frameworks to handle the high-performance data and compute requirements critical for successful AI deployments.
Miroslav Klivansky, Global Practice Lead for AI and Analytics at Pure Storage, recently discussed with VentureBeat the company's strengthened partnership with Nvidia and its potential impact on enterprise adoption of artificial intelligence.
The partnership between Pure Storage and Nvidia comes at a critical juncture. Enterprises across industries are increasingly adopting AI to drive innovation, optimize operations, and gain a competitive advantage; however, the majority of AI deployments are currently scattered across fragmented data environments, which can hinder the full realization of AI's potential. Pure Storage and Nvidia's collaboration seeks to address this challenge head-on.
"Pure Storage has had a long-standing partnership with Nvidia," Klivansky told VentureBeat. "In 2018, in collaboration with Nvidia, we launched AIRI, the first AI-ready infrastructure reference architecture, purpose-built to enable organizations to achieve better utilization and uptime and ultimately scale their AI investments without complexity. Since then, Pure Storage was among the first enterprise data storage vendors to become Nvidia DGX BasePOD certified."
Simplifying the AI puzzle with validated reference architectures
Klivansky emphasized the importance of the collaboration with Nvidia, stating, "Today's announcement is a testament to this ongoing, collaborative relationship with Nvidia to support enterprise AI deployments. As a leader in AI, Pure Storage, in collaboration with Nvidia, is arming global customers with proven solutions to handle the high-performance data and compute requirements they need to drive successful AI deployments with faster model training and inference, greater operational efficiency, plus lower total cost and energy requirements."
One of the key solutions unveiled at GTC was a Retrieval-Augmented Generation (RAG) pipeline for AI inference, which combines Nvidia GPUs for compute with Pure Storage's all-flash enterprise storage. "Using Pure Storage's data storage infrastructure for retrieval-augmented generation, customers can augment standard large language models (LLMs) with proprietary corporate data to provide greater relevance, accuracy, and currency to chatbots, and avoid the massive investment of building their own custom large language models," explained Klivansky.
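For readers unfamiliar with the pattern, the sketch below shows the basic shape of a retrieval-augmented generation flow: documents are embedded into vectors, the most relevant passages are retrieved for each question, and that context is injected into the model's prompt. It is a simplified illustration only; the `embed_text` and `call_llm` placeholders stand in for whatever embedding model and LLM endpoint a given deployment uses, and this is not the specific pipeline Pure Storage and Nvidia validated.

```python
# Minimal illustration of a retrieval-augmented generation (RAG) flow:
# company documents are embedded once, and each question retrieves the most
# relevant passages, which are injected into the LLM prompt at query time.
import numpy as np

def embed_text(text: str) -> np.ndarray:
    """Placeholder: return a vector embedding for `text` (model-specific)."""
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    """Placeholder: send `prompt` to a large language model and return its reply."""
    raise NotImplementedError

def build_index(documents: list[str]) -> list[tuple[str, np.ndarray]]:
    # Embed every document chunk once; in practice this lives in a vector
    # database backed by fast shared storage.
    return [(doc, embed_text(doc)) for doc in documents]

def answer(question: str, index: list[tuple[str, np.ndarray]], top_k: int = 3) -> str:
    q_vec = embed_text(question)
    # Rank chunks by cosine similarity to the question.
    scored = sorted(
        index,
        key=lambda item: float(
            np.dot(q_vec, item[1])
            / (np.linalg.norm(q_vec) * np.linalg.norm(item[1]))
        ),
        reverse=True,
    )
    context = "\n\n".join(doc for doc, _ in scored[:top_k])
    # The retrieved corporate context rides along in the prompt instead of
    # retraining or fine-tuning the base model.
    return call_llm(f"Answer using only this context:\n{context}\n\nQuestion: {question}")
```

The key point is that the base model is never retrained; the organization's own data is pulled in at query time, which is why fast shared storage sits at the center of the architecture.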
Expanding choice with Nvidia OVX server storage certification
Another significant development announced at GTC was Pure Storage's achievement of Nvidia OVX Server Storage certification. The certification provides customers with flexible storage reference architectures validated against key benchmarks, ensuring a robust infrastructure foundation for cost- and performance-optimized AI hardware and software solutions.
"With the latest Nvidia OVX Server Storage validation, we can now provide enterprise customers and channel partners with flexible storage reference architectures, validated against key benchmarks to provide a strong infrastructure foundation for cost- and performance-optimized AI hardware and software solutions," said Klivansky. "OVX computing platforms are powered by Nvidia L40S GPUs, which means our customers now have their choice of validated reference architectures from Pure Storage with greater compute GPU platform options and availability."
Klivansky highlighted the distinctive advantages of Pure Storage's offering, stating, "What differentiates Pure's solution from competitive solutions is Pure's data storage platform, optimized for AI workloads. Pure's FlashBlade//S provides a combination of leading performance, operational efficiency, energy and cost savings, and a reliable, future-proof storage foundation that grows with changing AI data requirements."
Vertical-specific AI solutions, starting with finance
In addition to the broader AI infrastructure solutions, Pure Storage and Nvidia are also focusing on developing vertical-specific AI applications. The first of these is a financial RAG pipeline using FinGPT, designed specifically for the finance industry.
Klivansky highlighted the prevalence of finance industry-specific AI use cases in large enterprises, stating, "Generative AI using retrieval-augmented generation (RAG) and pre-trained LLMs can be used to create financial analyses that are more accurate, faster and closer to real time, and more cost-efficient than those created by humans, since they are able to crunch far more information at a much higher rate."
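As a rough illustration of what such a finance-focused RAG workflow might look like, the snippet below grounds an LLM prompt in figures retrieved from a document store and computes the growth rate in code, so the model explains numbers rather than having to recall them. The `retrieve_filings` and `call_llm` helpers are hypothetical placeholders; this is not FinGPT's actual API, nor the pipeline Pure Storage and Nvidia announced.

```python
# Illustrative sketch: ground a financial analysis in retrieved figures so the
# model reports numbers it was never trained on. All helpers are placeholders.

def retrieve_filings(ticker: str) -> dict:
    """Placeholder: fetch the latest reported figures for `ticker` from a document store."""
    raise NotImplementedError

def call_llm(prompt: str) -> str:
    """Placeholder: query a pre-trained (e.g. finance-tuned) LLM."""
    raise NotImplementedError

def quarterly_analysis(ticker: str) -> str:
    figures = retrieve_filings(ticker)  # e.g. {"q_revenue": [...], "q_eps": [...]}
    revenue = figures["q_revenue"]
    # Compute quarter-over-quarter growth in code so the prompt carries exact numbers.
    growth = (revenue[-1] - revenue[-2]) / revenue[-2] * 100
    prompt = (
        f"Latest quarterly revenue for {ticker}: {revenue[-1]:,}.\n"
        f"Quarter-over-quarter growth: {growth:.1f}%.\n"
        f"Reported EPS history: {figures['q_eps']}.\n"
        "Write a brief, neutral analysis of these results."
    )
    return call_llm(prompt)
```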
The development of vertical-specific AI solutions demonstrates Pure Storage and Nvidia's commitment to addressing the unique needs of different industries. By providing tailored AI applications, the companies aim to accelerate the adoption of AI in sectors such as finance, healthcare, and the public sector.
Expanding the AI partner ecosystem
Pure Storage's commitment to meeting the evolving data storage needs of AI is further evidenced by its expanding partner ecosystem. The company has forged partnerships with leading AI platform providers such as Run.AI and Weights & Biases, as well as channel partners such as WWT, ePlus, CDW, and Insight.
"We're excited to expand investment in our AI partner ecosystem with partners like Run.AI and Weights & Biases," said Klivansky. "Run.AI complements Pure's storage performance and energy-efficiency value for AI infrastructure by optimizing the use of GPU-powered compute resources to accelerate development. Weights & Biases provides an easy-to-use framework for model building and RAG pipelines, which augments Pure's data storage platform for AI, further enriching operational efficiency and accelerating end-to-end ML workflows."
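Weights & Biases is typically wired into a training or RAG workflow through a handful of API calls. The minimal sketch below logs illustrative metrics to a hypothetical project; the project name and metric values are made up, and the loop stands in for a real model-building or retrieval-evaluation run.

```python
# Minimal sketch of experiment tracking with Weights & Biases.
import random
import wandb

# Start a tracked run; the project name and config values here are illustrative.
run = wandb.init(project="rag-pipeline-demo", config={"learning_rate": 1e-4, "epochs": 3})

for epoch in range(run.config["epochs"]):
    # Stand-in for real training / evaluation of the pipeline.
    train_loss = random.random()
    retrieval_hit_rate = random.random()
    # Each call streams metrics to the W&B dashboard for comparison across runs.
    wandb.log({"epoch": epoch, "train_loss": train_loss, "retrieval_hit_rate": retrieval_hit_rate})

run.finish()
```

Runs logged this way can be compared side by side in the W&B dashboard, which is the operational-efficiency angle Klivansky points to.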
The collaboration with these partners enables Pure Storage to offer a comprehensive, integrated AI solution stack, simplifying the deployment and management of AI infrastructure for enterprises. By leveraging the expertise of these partners, Pure Storage can deliver a seamless, optimized AI experience, from data storage to model development and deployment.
Enabling sustainable AI with energy-efficient storage
As the adoption of AI continues to grow, the environmental impact of data centers has become a significant concern. Pure Storage recognizes this challenge and has designed its storage platform to help customers reduce their energy consumption and carbon footprint.
"Pure has designed and built its platform to allow customers to dramatically cut their energy and carbon footprints," explained Klivansky. "Pure Storage's flash-optimized systems typically use between 2x and 5x less power than competitive SSD-based systems, and between 5x and 10x less power than the hard disk systems we replace."
By providing energy-efficient storage solutions, Pure Storage enables enterprises to build more sustainable AI infrastructure. This not only helps organizations meet their environmental goals but also yields significant cost savings in power, cooling, and data center space.
Paving the way for the future of enterprise AI
As Klivansky put it, "Pure Storage anticipated the needs and potential of AI at the inception of this latest wave, delivering the high-performance, efficient, and container-ready storage platform the industry needed to capitalize on and truly derive value from the massive amounts of data used to fuel this technology."
With its proven track record in delivering advanced data storage solutions and its commitment to innovation, Pure Storage is well positioned to play a pivotal role in the ongoing AI revolution. The company's dedication to meeting the unique data storage requirements of AI workloads is evident in its continuous efforts to develop cutting-edge solutions and forge strategic partnerships.
"Over the past years, Pure Storage has been at the forefront of innovation to meet the escalating data storage needs of AI deployments," said Klivansky. "Recognizing the challenge posed by fragmented data environments, Pure Storage pioneered an enterprise data storage platform to meet the unique needs of AI. By offering a simple, reliable, and efficient storage platform, Pure Storage enables enterprises to fully harness the potential of AI while mitigating risks, reducing costs, and minimizing energy consumption."
The collaboration between Pure Storage and Nvidia is not only a testament to their shared vision for the future of AI but also a reflection of their commitment to enabling enterprises to unlock the full potential of this transformative technology. By providing a comprehensive, validated AI infrastructure solution, the companies are helping organizations across industries accelerate their AI adoption journeys and gain a competitive edge in an increasingly data-driven world.