Green data centres are perhaps best described as ‘work in progress’. But the drive towards greener data centres, vital for environmental sustainability, may just have taken a huge step forward this month.

Enter artificial intelligence

Last week, data science researchers at Lancaster University announced that they had developed artificial intelligence software for servers. This has the potential to change the way that data centres run, and to reduce their energy requirements. The new software can reorganise its own components to make itself more efficient, depending on what the server is doing, and it can learn from previous experience to improve its performance.
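To make the idea concrete, here is a toy sketch of that kind of behaviour: software that chooses between alternative implementations of one of its own components based on measured performance, and improves its choices from experience. This is not the Lancaster system; the component names, the runtimes and the simple epsilon-greedy strategy are all invented for illustration.

```python
import random

# Observed runtimes (ms) for two hypothetical implementations of the same
# internal component; names and figures are invented for illustration.
observed = {"hash-join": [], "sort-merge-join": []}

def pick_component(epsilon: float = 0.1) -> str:
    """Usually pick the historically fastest variant; occasionally explore."""
    untried = [name for name, runs in observed.items() if not runs]
    if untried:
        return untried[0]  # try every variant at least once
    if random.random() < epsilon:
        return random.choice(list(observed))  # explore an alternative
    # exploit: the variant with the lowest mean runtime so far
    return min(observed, key=lambda n: sum(observed[n]) / len(observed[n]))

def record(name: str, runtime_ms: float) -> None:
    observed[name].append(runtime_ms)

# Simulated feedback loop: record measured runtimes, then let the system
# favour whichever variant has performed best on this workload.
record("hash-join", 4.2)
record("sort-merge-join", 9.8)
print(pick_component(epsilon=0.0))  # with no exploration, picks "hash-join"
```

The point of the sketch is the loop, not the join algorithms: measurement feeds back into the next decision with no human in between.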

Perhaps the biggest advantage is that it removes people from the equation. Instead of having to report to a human, who then has to react by making a change, the system is self-governing. Given the huge fluctuations in demand that data centres see every day, this matters: it makes adjusting conditions both quicker and, in the long term, much more efficient.

In the shorter term, however, cooling remains one of the biggest issues for data centres that wish to become greener. While legacy systems used coolant gases, current trends suggest that indirect or direct fresh air cooling is significantly cheaper and greener. Some data centre providers have even moved to Iceland, but most are remaining in more temperate climes, and using water- or air-cooled options, often with heat exchange systems.

More significantly, perhaps, new servers now have a much greater tolerance of temperature variation. In particular, the myth that servers need an ambient temperature of no more than 70°F (around 21°C) has been exposed as just that: a myth. Higher temperatures can be tolerated on a routine basis. This, in itself, reduces the need for cooling, and therefore its cost.

Honesty is the best policy

There is, however, a ‘dark side’ to greening data centres. As consumers become savvier about the importance of power usage effectiveness (PUE) and demand more from data centre operators, so the pressure is on to report ever lower figures. And, of course, the easiest way to do that is to change the way that you measure or report your figures. While measurement standards exist, they still leave operators a wide range of choices about how, when and where to measure.

As an example, consider the timing of your PUE measurements. Take them consistently at times when temperatures and demand are low, early morning, for example, and your figures will look better. Measure only your newest premises, or make your assessment in winter, and again, your reported PUE will be lower.
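PUE itself is simply total facility power divided by the power drawn by the IT equipment, so the closer it gets to 1.0, the less energy is lost to cooling and power distribution. The readings below are invented, but they show how the same facility can report very different figures depending on when it measures.

```python
# PUE = total facility power / IT equipment power.
# The readings below are hypothetical, chosen to show how measurement
# timing alone changes the reported figure.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power over IT power."""
    return total_facility_kw / it_equipment_kw

# Same IT load; cooling overhead varies with outside temperature and demand.
morning = pue(total_facility_kw=1150, it_equipment_kw=1000)    # cool, quiet
afternoon = pue(total_facility_kw=1400, it_equipment_kw=1000)  # hot, peak

print(f"Early-morning PUE: {morning:.2f}")
print(f"Afternoon PUE:     {afternoon:.2f}")
```

Report only the early-morning number and the facility looks markedly greener than its round-the-clock average really is.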

But of course, if you concentrate on reducing your PUE by adjusting your measurements, you risk losing sight of the real goal: actual reductions over time. Operators that have kept that goal in view (Google springs to mind) have managed to achieve an average PUE across all sites of 1.12, down from around 1.2 when figures were first reported. The lesson? Measuring is the first step towards managing, but you also need to focus on what matters.
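A drop from 1.2 to 1.12 sounds modest, but the back-of-envelope arithmetic below shows why it is not: for a fixed IT load (the 1,000 kW figure here is an assumption for illustration), it cuts the non-IT overhead by roughly 40 per cent.

```python
# Back-of-envelope sketch: what a PUE drop from 1.2 to 1.12 means.
# Only the two PUE values come from the article; the IT load is assumed.

it_load_kw = 1000  # assumed constant IT load

# Everything above 1.0 in the PUE is overhead: cooling, power losses, etc.
overhead_before_kw = (1.20 - 1) * it_load_kw  # ~200 kW
overhead_after_kw = (1.12 - 1) * it_load_kw   # ~120 kW

saving = (overhead_before_kw - overhead_after_kw) / overhead_before_kw
print(f"Non-IT overhead cut: {saving:.0%}")
```

In other words, a 0.08 improvement in the headline figure removes nearly half of the energy that was doing no computing at all.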

 

Adding Big Data to the equation

Google’s ‘best practices’ for improving energy efficiency also include turning the thermostat up, good design to optimise containment, and optimising power distribution. All these can be made easier to manage by Big Data and analytics.

In particular, using analytics removes the need to guess at optimum temperatures, power distributions and so on. Using data lake technology means better visibility of what is really happening in the data centre, and therefore improved control. Real-time analysis allows tiny adjustments to be made continuously, including drawing in external air to provide cooling when the temperature allows.
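One of those continuous adjustments, switching to outside air when conditions allow, can be sketched as a simple threshold rule. The setpoint, margin and mode names below are assumptions for illustration, not a real DCIM or building-management API; a production system would weigh humidity, air quality and fan energy as well.

```python
# A minimal, hypothetical free-cooling decision: use outside air when it is
# comfortably cooler than the supply-air setpoint, otherwise fall back to
# mechanical cooling. All thresholds are illustrative assumptions.

SUPPLY_TARGET_C = 24.0       # assumed supply-air setpoint
FREE_COOLING_MARGIN_C = 3.0  # outside air must be this much cooler

def choose_cooling_mode(outside_temp_c: float) -> str:
    """Return 'free-air' when outside air can do the job, else 'mechanical'."""
    if outside_temp_c <= SUPPLY_TARGET_C - FREE_COOLING_MARGIN_C:
        return "free-air"
    return "mechanical"

# Re-evaluated continuously as sensor readings arrive:
for reading_c in [12.0, 20.0, 26.0]:
    print(f"{reading_c}°C -> {choose_cooling_mode(reading_c)}")
```

Run against a live temperature feed instead of a fixed list, a rule like this is exactly the kind of small, frequent adjustment that analytics makes practical.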

Moving rapidly forwards

Data centre analytics may still be at an early stage, but specialist providers have been around for a while. Back in 2014, Power Analytics shipped the first orders of its then-new data centre optimisation solution, designed to work with existing systems to improve efficiency and operations, and even integrate billing.

Power Analytics may have been ahead of the game, but there is no doubt that others are rapidly colonising the space. Lancaster University’s artificial intelligence solution may be the first genuine use of this type of technology in data centre optimisation, but we predict that it won’t be the last.

 
