Badly designed, poorly laid out and under pressure, the data centre is on the verge of becoming the environmental bad boy, writes ANDREW OLDFIELD, director of Emerging Markets for Panduit South Africa.
It is estimated that two percent of all electricity production globally is now consumed by IT, with the United States sitting nearer to three percent, and the demands on the data centre are set to rise substantially over the next few years. The IT sector is responsible for around two percent of global carbon emissions, and data centres now account for around 14% of that on their own. Today it is vital that the outdated data centre be replaced with lean, green machines that use less space and energy while delivering increased performance.
Facebook made the news regularly in 2013 for its open reporting of energy consumption and its shift towards a green solution. Its use of hydro power at its Swedish plant, its placement of centres in cooler climates and its focus on greener design have seen it work hard to limit its growing impact. And it is growing: recent figures released by the social giant showed its carbon footprint increasing by 52% in 2012 and its greenhouse gas emissions by around 34%.
So what can be done to address this? The answer lies in the parts that make up the intricate workings of the modern data centre, not in the whole. The average data centre consumes over 50 times more energy than an equivalent area of office space, and this consumption splits at roughly 40% on IT and 60% on cooling, power and lighting. Cooling on its own takes up an average of 35% of the total, with power infrastructure such as UPS and distribution taking up around 20%.
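To put those proportions in context, here is a back-of-envelope sketch showing how a 40/60 split translates into total facility draw and a power usage effectiveness (PUE) of around 2.5. The 1 MW IT load is an assumed figure for illustration, not a number from this article.

```python
# Back-of-envelope split for a hypothetical 1 MW IT load,
# using the 40% IT / 60% facility split quoted above.
it_share = 0.40          # IT equipment's share of total facility power
cooling_share = 0.35     # cooling's share
power_share = 0.20       # UPS and power distribution
lighting_other = 1.0 - it_share - cooling_share - power_share  # remaining ~5%

it_load_kw = 1000.0                   # assumed 1 MW IT load (illustrative)
total_kw = it_load_kw / it_share      # total facility draw implied by the split
pue = total_kw / it_load_kw           # Power Usage Effectiveness

print(f"Total facility load: {total_kw:.0f} kW")                  # 2500 kW
print(f"Cooling load:        {total_kw * cooling_share:.0f} kW")  # 875 kW
print(f"PUE:                 {pue:.2f}")                          # 2.50
```

In other words, under this split, for every kilowatt that reaches a server roughly another kilowatt and a half is spent on cooling, power distribution and lighting.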
The IT load is the power needed to operate the network, computing and storage devices, and cooling demand is driven directly by this load. The more heat the equipment generates, the more pressure is placed on the cooling system to keep the temperature down and the efficiency up. Fortunately this can be mitigated to a certain extent by design, creating a solution built to optimum specifications.
So much relies on the way that the data centre has been laid out. The use of cold and hot aisle containment can localise the cooling needed by the system to a smaller area, thereby controlling the amount of power used to keep it at the right temperature. In addition, rack space should be used more efficiently as that cuts down on the footprint, and the data centre should be split into halls that are only equipped when absolutely necessary.
If the centre cools only what is necessary within the contained area and the running temperature is lifted by a couple of degrees, each degree raised has the potential to save more than just the planet: it can reduce cooling energy costs by as much as four percent. Many data centre experts can work within the existing limitations of a facility to create solutions that have a significant impact on cooling requirements. For example, drawing in cooler air from outside the building at night or in colder seasons can cut cooling demand by up to 70%, depending on your location. A cold winter's night in South Africa will not be as chilly as one at Facebook's data centre near the Arctic Circle, but it is still far better than burning unnecessary power.
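As a rough sketch of that setpoint arithmetic, the calculation below assumes a notional annual cooling spend and a three-degree rise; neither figure comes from the article, and only the roughly four percent saving per degree is quoted above.

```python
# Rough sketch of the setpoint arithmetic. The ~4% per degree figure is
# quoted above; the baseline cooling cost and the 3-degree rise are
# assumed purely for illustration.
baseline_cooling_cost = 100_000.0   # assumed annual cooling spend (currency units)
saving_per_degree = 0.04            # ~4% saving per degree raised
degrees_raised = 3                  # assumed setpoint increase

cost = baseline_cooling_cost
for _ in range(degrees_raised):
    cost *= (1 - saving_per_degree)  # each degree saves a share of what remains

print(f"Cooling cost after raising setpoint: {cost:,.0f}")
print(f"Saving: {baseline_cooling_cost - cost:,.0f} "
      f"({(1 - cost / baseline_cooling_cost):.1%})")  # roughly 11.5%
```

Because each degree trims a percentage of what is left, three degrees yields a little under 12% rather than a flat 12%, but the cumulative effect is still substantial.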
Of course, it may seem obvious, but many data centres don't apply the basics. Seal the racks tightly with blanking panels and use efficient cable management so that airflow is directed where it is needed, hot air is removed quickly, and the hot and cold air streams do not blend.
It isn’t just the cooling and cables that place the data centre in the eye of the environmental storm. The equipment stacked up within these chilly rooms has to be as energy efficient as possible in order to minimise impact and increase efficiency. It’s in the organisation’s best interests to use energy-efficient network equipment, such as power supply units designed to cut their draw at low loads and to operate at higher inlet temperatures.
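To illustrate why low-load behaviour matters, the hypothetical comparison below assumes conversion efficiencies of 82% and 92% at light load; these are illustrative numbers, not vendor specifications, and real units should be checked against their published efficiency curves.

```python
# Hypothetical comparison of power supply losses at low load.
# The efficiency figures are assumed for illustration only.
it_draw_w = 2_000.0   # assumed IT load sitting behind the supplies

def input_power(load_w: float, efficiency: float) -> float:
    """Wall-side draw for a given output load and conversion efficiency."""
    return load_w / efficiency

older_psu_eff = 0.82      # assumed efficiency of an older unit at light load
efficient_psu_eff = 0.92  # assumed efficiency of a modern unit at light load

waste_older = input_power(it_draw_w, older_psu_eff) - it_draw_w
waste_efficient = input_power(it_draw_w, efficient_psu_eff) - it_draw_w

print(f"Conversion losses, older PSU:     {waste_older:.0f} W")      # ~439 W
print(f"Conversion losses, efficient PSU: {waste_efficient:.0f} W")  # ~174 W
```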
Servers should also be virtualised: a 60% reduction in physical servers represents a saving of around 40% in energy. Flash and virtualised storage, meanwhile, are superb alternatives for managing storage and energy demands.
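The consolidation sums can be sketched in the same back-of-envelope style. The server counts and per-server wattages below are assumptions chosen so that the result lands near the figures quoted above; the saving is smaller than the reduction in server count because the remaining hosts run at higher utilisation.

```python
# Illustrative consolidation sums: a 60% reduction in physical servers
# yielding roughly 40% energy savings. The wattages are assumptions,
# not measured values.
servers_before = 100
servers_after = round(servers_before * (1 - 0.60))  # 40 hosts remain

avg_watts_before = 300.0   # assumed draw of a lightly loaded server
avg_watts_after = 450.0    # assumed draw of a consolidated, busier host

energy_before = servers_before * avg_watts_before   # 30,000 W
energy_after = servers_after * avg_watts_after      # 18,000 W
saving = 1 - energy_after / energy_before

print(f"Energy saving from consolidation: {saving:.0%}")  # ~40%
```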
When you look at some of the extraordinary figures from the world’s largest data centres, the ones driven by Facebook, Amazon and Google, you can see why it is vital that the world step up with solutions that can control the impact of the data centre and its carbon footprint. IDC released a report in 2013 putting the average age of a data centre at around nine years, with Gartner adding that any data centre more than seven years old is obsolete. A revitalised data centre does more than just cost money: it improves operational efficiency, cuts long-term costs and delivers better capacity. It’s about time they were given a makeover…