FUTURIOM.COM | Cloud Market Trend Report
What's Next for Networking Infrastructure for AI
May 2025
Sponsored by:

Highlights of What's Next for Networking Infra for AI

AI infrastructure needs continue to expand. Enterprises are starting to adapt large language models (LLMs) to fit their specific business requirements. A mixture of infrastructure will be needed to deliver LLMs for training, as well as specialized models including small language models (SLMs).

Inferencing needs will expand infrastructure in a variety of ways. AI inferencing, which enables applications to take input and process output through AI models, will be distributed across cloud and enterprise infrastructure. As models evolve, more inferencing infrastructure will be needed to interpret and
serve up results on a variety of devices and infrastructure, from the data center to the edge.

Ethernet is gaining ground against InfiniBand. Efforts to shift AI networks away from reliance on NVIDIA's proprietary InfiniBand networking technology are showing dramatic results, with adoption of Ethernet solutions being heralded even by NVIDIA as key to inferencing.

The Ultra Ethernet Consortium remains relevant. Efforts by vendors, including NVIDIA, are coalescing around a standard that improves on Ethernet's drawbacks and RDMA's limitations.

Speeds are increasing. While most AI data-center switches support speeds of 400 Gb/s, 800-Gb/s rates are increasingly on the horizon, with even higher speeds in the works.

Optical networking is part of AI's future. As AI networks grow in scale, speed, and power requirements, optical components will furnish solutions that save power, space, and operational costs.

AI specialized processors su