The rapid growth of AI workloads is driving a significant transformation in data center network infrastructure, with global data center experts anticipating a major increase in interconnect bandwidth needs over the next five years, according to a study commissioned by Ciena.
The survey, conducted in partnership with Censuswide, polled more than 1,300 data center decision makers across 13 countries. More than half (53%) of respondents believe AI workloads will place the biggest demand on data center interconnect (DCI) infrastructure over the next two to three years, surpassing cloud computing (51%) and big data analytics (44%).
To meet surging AI demand, 43% of new data center facilities are expected to be dedicated to AI workloads. With AI model training and inference requiring unprecedented data movement, data center experts predict a massive leap in bandwidth needs. In addition, when asked about the required fiber optic capacity for DCI, 87% of participants believe they will need 800 Gb/s or higher per wavelength.
“AI workloads are reshaping the entire data center landscape, from infrastructure builds to bandwidth demand,” said Jürgen Hatheier, Chief Technology Officer, International, Ciena. “Historically, network traffic has grown at a rate of 20-30% per year. AI is set to accelerate this growth significantly, meaning operators are rethinking their architectures and planning for how they will meet this demand sustainably.”
Creating More Sustainable AI-Driven Networks
Survey respondents confirm there is a growing opportunity for pluggable optics to support bandwidth demands and address power and space challenges. According to the survey, 98% of data center experts believe pluggable optics are important for reducing power consumption and the physical footprint of their network infrastructure.
Distributed Computing
The survey found that, as requirements for AI compute continue to increase, the training of Large Language Models (LLMs) will become more distributed across different AI data centers. According to the survey, 81% of respondents believe LLM training will take place across some level of distributed data center facilities, which will require DCI solutions to connect them. When asked about the key factors shaping where AI inference will be deployed, respondents ranked the following priorities:
· AI resource utilization over time is the top priority (63%)
· Reducing latency by placing inference compute closer to users at the edge (56%)
· Data sovereignty requirements (54%)
· Offering strategic locations for key customers (54%)
Rather than deploying dark fiber, the majority (67%) of respondents expect to use Managed Optical Fiber Networks (MOFN), which utilize carrier-operated high-capacity networks for long-haul data center connectivity.
“The AI revolution isn’t just about compute; it’s about connectivity,” added Hatheier. “Without the right network foundation, AI’s full potential can’t be realized. Operators must ensure their DCI infrastructure is ready for a future where AI-driven traffic dominates.”
