If your local datacentre is running a little hotter than usual this month, that could be because its operators are crunching the data they now need to report under the European Union’s (EU) energy efficiency directive.
On 15 September 2024, operators in the EU began handing over information on energy performance and “water footprint” for sites over 500kW. This will help Brussels establish what a “sustainable” datacentre should look like, as well as drive decarbonisation of the grid and better reuse of the heat generated by all that whirring kit.
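Reporting of this kind is typically built around indicators such as power usage effectiveness (PUE) – total facility energy divided by IT equipment energy – and water usage effectiveness (WUE), litres of water per kilowatt-hour of IT energy. As a rough sketch of the arithmetic (the figures and the 40% overhead assumption below are invented for illustration, not drawn from any real site):

```python
# Illustrative sketch only: invented figures, hypothetical site.
# PUE = total facility energy / IT equipment energy (ideal is 1.0).
# WUE = water consumed / IT equipment energy (litres per kWh).

def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power usage effectiveness: facility overhead ratio."""
    return total_facility_kwh / it_equipment_kwh

def wue(water_litres: float, it_equipment_kwh: float) -> float:
    """Water usage effectiveness in litres per kWh of IT energy."""
    return water_litres / it_equipment_kwh

# A hypothetical site at the 500kW threshold, running flat out for a year.
it_kwh = 500 * 24 * 365          # IT load: 4,380,000 kWh
facility_kwh = it_kwh * 1.4      # assume 40% cooling/power overhead
water = 0.8 * it_kwh             # assume 0.8 L/kWh of evaporative cooling water

print(f"PUE: {pue(facility_kwh, it_kwh):.2f}")   # 1.40
print(f"WUE: {wue(water, it_kwh):.2f} L/kWh")    # 0.80
```

A PUE approaching 1.0 means almost every kilowatt-hour goes into the IT kit itself, rather than into cooling and power-conversion overhead.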
Depending on your point of view, the regulations are either not a moment too soon, or too late to rein in a thirst for energy that is already out of control.
Datacentres – whether hyperscalers’ giant bit barns, colocation facilities, or regular enterprise operations – are a key consumer of electricity, but their share of overall consumption has remained fairly constant over the past few decades.
However, global electricity demand will grow at 4% this year, the highest annual growth since 2007, according to the International Energy Agency’s (IEA) most recent update.
And datacentres are undoubtedly a contributor to this spike in growth, says the agency. The EU itself estimates they account for almost 3% of European electricity demand, and that this is likely to increase “significantly” in both absolute and relative terms.
“The rise of artificial intelligence [AI] has put the electricity consumption of datacentres in focus, making better stocktaking more important than ever. In many regions, historical estimates of datacentres’ electricity consumption are hampered by a lack of reliable data,” the IEA said.
And, the agency warned: “Future projections include a very wide range of uncertainties related to the pace of deployment, the diverse and expanding applications of AI, and the potential for improvements in energy efficiency.”
John Booth, managing director of consultancy Carbon3IT, who contributed to the European Code of Conduct for Datacentres Energy Efficiency (EU DC CoC) and chairs the Data Centre Alliance’s energy efficiency group, says there is clearly a lot for the sector to do.
And the sector hasn’t given the issue its full attention. After all, says Booth, “the Code of Conduct has been out for 15 years, and the direction of travel has clearly been listed and completely ignored”.
He predicts the hyperscalers and colocation players will meet the EU’s requirements, at least in terms of handing over the necessary information.
As Dave Smith, director of sustainability and operational risk at Digital Realty, says: “Energy efficiency is already a core focus for datacentre operators, and regulation helps keep it front and centre. It also drives innovation to meet targets.”
No more firefighting
However, things might get trickier when it comes to companies operating their own on-premise datacentres. “These are the people that are constantly firefighting. And they don’t read the trade press. They are the ones I feel are probably in the most need of our assistance and education,” says Smith.
Even banks and large industrial firms could be blissfully unaware of the upcoming regulations, he says. “Unless they are seeking to get some of their capital plant replaced, in which case they will be speaking to the supply chain and the supply chain will be able to bring them up to speed.” This, of course, assumes their suppliers are up to speed themselves.
But Booth says there’s a wider sustainability challenge facing the datacentre world, beyond simple power consumption. “We’re using vast amounts of concrete, we’re using vast amounts of steel for the superstructure.” This means staggering amounts of embodied carbon, he says, above and beyond the energy used to power the kit inside.
Moreover, he suggests that much of this is simply redundant.
“We’re probably over-engineering and over-designing our datacentres based on a risk profile from America in the late 1970s, where the power grid wasn’t particularly stable.” Hence an obsession with uptime, bolstered by diesel generators and huge amounts of redundant infrastructure.
That approach to resilience was imported to the UK in the 1980s, along with American banks looking to exploit the Big Bang in the City, Booth explains.
IT managers have come to “expect datacentres to be cold and to have backup diesel generators”.
But that isn’t always necessary, argues Shawn Novak, chief revenue officer at TECfusions, a US company building out a network of sustainable datacentres.
Speaking at a sustainability session at the Datacloud Global Congress in June, Novak said his firm was doing a lot of “adaptive reuse” of existing facilities. Making existing facilities more efficient is an eco-win in itself. But it also has the benefit that those sites already have access to power – connecting to the grid is one of the big brakes on building out new capacity.
Many of the investors in a new wave of more sustainable datacentres have a background in crypto, Novak added. This will influence how datacentres are designed and built in the future, he predicted.
“They’re all crypto billionaires. They’ve made a lot of money and are starting their own AI companies now. They used to run their shit in a cardboard box with a big ass fan blowing air through. What tier was that?”
This is in stark contrast to the current over-built, over-engineered industry standard, he said. “What the crypto world has taught us is these things are pretty resilient. The grid is actually pretty stable. Your load that’s running on that GPU is not mission-critical. If this goes down, it goes to another datacentre and it’s still running.”
More immediately, says Booth, the whole notion of uptime needs to be examined when it comes to application stacks, as well as underlying infrastructure.
For most businesses operating their own infrastructure, the only really critical systems are electronic point of sale and logistics, he suggests. Yet most tech leaders have fallen into the trap of classing everything as Tier 3, he says, “when in fact, you probably only need to do three or four applications at Tier 4”.
As for the rest, “you can withstand a little bit of outage on them”.
“How many of your applications in use by your users are truly mission-critical? File and print services being down for eight hours? Okay, fine. Just use your laptop, store it, load it up to the storage when you’re able to,” says Booth.
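Booth’s tiering point is easy to put numbers on. The tiers he refers to are the Uptime Institute’s, which are defined by redundancy topology rather than a promised percentage; still, the availability figures commonly quoted alongside them make the trade-off concrete. A minimal sketch, using those commonly cited figures:

```python
# Availability figures commonly quoted alongside the Uptime Institute
# tiers (the institute itself defines tiers by redundancy topology,
# not a guaranteed percentage; these numbers are illustrative).
TIER_AVAILABILITY = {
    "Tier I":   99.671,
    "Tier II":  99.741,
    "Tier III": 99.982,
    "Tier IV":  99.995,
}

HOURS_PER_YEAR = 24 * 365

for tier, pct in TIER_AVAILABILITY.items():
    downtime_hours = HOURS_PER_YEAR * (1 - pct / 100)
    print(f"{tier}: ~{downtime_hours:.1f} hours of downtime a year")

# Tier I:   ~28.8 hours
# Tier II:  ~22.7 hours
# Tier III: ~1.6 hours
# Tier IV:  ~0.4 hours
```

By this arithmetic, a workload that can tolerate Booth’s eight-hour file-and-print outage sits comfortably within even Tier I-grade expectations, yet it is often housed in infrastructure built to deliver minutes of downtime a year.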
It burns
Added to that, companies will be running redundant or superseded applications or virtual machines because managers are waiting to see whether the new system falls down. And as teams are broken up or reassigned, these systems are never turned off.
“Everybody forgets the last two elements of PRINCE2: lessons learned and decommission,” he says. “I’ve been to one project meeting where I’ve had a lessons learned wrap-up meeting. I have never been to a decom meeting.”
All of these factors will come into play as the EU – and other regulators worldwide – begin looking more closely at the datacentre sector.
But should we expect the EU and other regulators to pull the plug?
Not immediately. Booth says the new reporting requirements will mean the EU has comprehensive, accurate data for the first time, and it will take its time to analyse it.
“I think it’s going to surprise them,” he predicts. “I think at a minimum it’ll be double what they think it is. And it could be up to four times. That’s gonna really put a panic into them.”
At which point the regulators will begin taking more drastic action, and datacentre operators could find themselves facing an array of carrots and sticks.
As Digital Realty’s Smith says, how regulations are implemented is really what matters. “If they’re too fragmented or overly complex, they can create a substantial burden on operators, forcing us to hire more people and upskill just to keep up with compliance.”
But it’s not just the regulators that datacentre operators will need to worry about. The EU is banking on the publication of efficiency data to drive user decisions, whether those users rely on colocation facilities or simply consume cloud services.
Ultimately, says Booth, users “don’t want to be associated with a datacentre that has been identified as being a poorly performing facility”. So, in the long run, those datacentre operators that don’t get their house in order will be feeling the heat from customers. And that could hurt.