Energy use by data centers is a growing concern. Between powering the servers themselves and cooling the facilities that house them, data centers are an ever-larger drain on the power grid, accounting for an estimated 1% to 3% of global electricity use.
Numerous efficiencies have been devised, from throttling the CPUs of idle servers to siting data centers in regions less likely to need artificial cooling to altering the architecture of the buildings that house the equipment. Still, these approaches are estimated to cut energy consumption by only around 10%.
Now, some companies are putting their data centers under the sea. Microsoft staged the first large-scale underwater data center experiment beginning in 2015 with its Project Natick initiative, and several smaller companies have followed suit, going so far as to market their services commercially.
Some terrestrial data centers, such as one operated by Interxion in Stockholm, have already experimented with using seawater to cool their operations. Immersion cooling is also gaining traction, though circulating its contained pools of coolant remains energy-intensive.
While actually putting data centers underwater may sound like a gimmick, this seemingly impractical approach has much to recommend it.