AI data centers must support much higher power densities than conventional data centers: Nvidia’s GB200 NVL72 systems are estimated to consume as much as 120kW per rack, for example, while general-purpose computing infrastructure consumes perhaps one-tenth of that.
On top of that, the AI-enabled data center will need liquid cooling, advanced networking infrastructure, and advanced infrastructure management software.
And AWS isn’t the only cloud service provider that’s ramping up its investments in AI-enabled data centers. Rival cloud service providers are all investing in either upgrading or opening new data centers to capture a bigger chunk of business from developers and users of large language models (LLMs).
Earlier this year, Microsoft President Brad Smith said the company is on track to invest nearly $80 billion to build out AI-enabled data centers this fiscal year.
Nearly all of the $75 billion in capital expenditure Google will make this year will go toward technical infrastructure, including servers and data centers, Google CFO Anat Ashkenazi said on the company’s earnings call this week.
Based on the investment figures provided by the three major cloud service providers over the past week, AWS is leading the pack by around $20 billion to $25 billion, and is also ahead of analysts’ forecasts for cloud infrastructure spending.