Apparently, AI data centers are capable of sucking less (power, that is). A recent UK trial demonstrated that they can adjust their energy demands dynamically without disrupting critical workloads. This contrasts with data centers' current approach of always-on power draw, which can strain grids and drive up prices for everyone.

Over five days in December 2025, more than 200 simulated "grid events" tested a London data center's ability to adjust its energy use on the fly. The trial used software from Emerald AI, which was involved in the study; other partners included NVIDIA, National Grid, Nebius and the nonprofit Electric Power Research Institute.

In each simulated grid event, the data center successfully adjusted its energy use to the requested level. It reduced power dra [...]
On a recent work trip, I had plenty of things to worry about — but being able to recharge my two smartphones, laptop and iPad was not among my concerns. In my carry-on luggage, I had two medium-cap [...]
At the start of the month, Elon Musk announced that two of his companies — SpaceX and xAI — were merging and would jointly launch a constellation of 1 million satellites to operate as orbital d [...]
OpenAI has struck a deal with Oracle to add an astounding 4.5 gigawatts of US data center capacity to power the massive workload required by its large language models. The companies haven't speci [...]