The Art Of Data Centre Cooling

Data centre cooling is of the utmost importance to the modern business. HP states that its servers begin to fail once they reach and sustain a temperature of 30 degrees Celsius, a level easily reached in a data centre where there's a lot of power flowing around. If servers reach 60 degrees Celsius, the whole system will shut down, causing the loss of all unsaved data and potential service disruption for both customer-facing and internal business applications.
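Those two thresholds are the kind of thing an operations team would encode in a monitoring check. Here is a minimal sketch, assuming the 30-degree and 60-degree figures above; the function name and alert levels are illustrative, not part of any vendor's API.

```python
# Hypothetical temperature alert check based on the thresholds above:
# sustained operation above 30 C risks hardware failure, and 60 C
# triggers an emergency shutdown. Names and levels are illustrative.

WARN_C = 30.0      # sustained readings above this risk server failures
SHUTDOWN_C = 60.0  # at this point the whole system shuts down

def classify_temperature(celsius: float) -> str:
    """Return an alert level for a single server-inlet reading."""
    if celsius >= SHUTDOWN_C:
        return "critical"   # imminent shutdown, unsaved data at risk
    if celsius >= WARN_C:
        return "warning"    # investigate cooling before failures begin
    return "ok"

if __name__ == "__main__":
    for reading in (22.5, 34.0, 61.0):
        print(reading, classify_temperature(reading))
```

In practice a monitor would look at sustained readings rather than a single sample, but the threshold logic is the same.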
Orders can be lost during this time, not only because the system is down from the outset, but because orders in transit at the moment the data centre fails are lost too. When your data centre crashes completely, every data transaction in mid-flight vanishes, so you can lose thousands of pounds' worth of orders simply because your data centre cooling wasn't good enough.
Worse, you never know exactly what you have lost, because the vanished data leaves no record behind. Not only does the order go unfulfilled; you may end up refunding money you have already received, or disputing refunds for money the customer has spent but your system has no record of. Either way, the damage to your customer service reputation is likely to be severe, particularly when coupled with the word of mouth from all the people who thought they had ordered something from you but never received it.
Fortunately, modern data centre cooling methods can be installed in most data centres with minimal disruption and maximum speed. The art of data centre cooling is really very simple.
The trick is to separate hot air from cold air. This is called air containment, and it keeps the air around your server intakes at a predictable, even temperature. The internal aisle of the data centre is sealed to contain the cold air, and any side intakes are fed by cold air ducts.
In this cooling method, the cold air comes from under the floor. The data centre sits on a raised false floor, beneath which a constant current of very cold air is pumped. Hot air is vented out of the back of the racks and drawn back down into the cooler to be recycled as cold air, so the system runs as a continuous loop, requiring relatively little energy to keep itself going.
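The amount of air that loop must circulate follows from a standard sensible-heat relation: the heat carried away equals air density times specific heat times volumetric flow times the temperature rise across the servers. A back-of-envelope sketch, with typical constants for air and purely illustrative figures:

```python
# Back-of-envelope sizing for the underfloor cold-air loop described
# above: how much air must circulate to carry away the IT heat load?
# Uses the sensible-heat relation Q = rho * cp * V_dot * delta_T.
# Constants are typical values for air; all figures are illustrative.

RHO_AIR = 1.2    # kg/m^3, air density at roughly room conditions
CP_AIR = 1005.0  # J/(kg*K), specific heat capacity of air

def required_airflow_m3s(it_load_watts: float, delta_t_kelvin: float) -> float:
    """Cubic metres of air per second needed to absorb the given heat
    load with the given supply-to-return temperature rise."""
    return it_load_watts / (RHO_AIR * CP_AIR * delta_t_kelvin)

if __name__ == "__main__":
    # e.g. a 100 kW room with a 12 K rise across the servers
    flow = required_airflow_m3s(100_000, 12)
    print(f"{flow:.2f} m^3/s")
```

For a 100 kW room with a 12-degree rise, that works out to roughly seven cubic metres of air per second, which is why the underfloor plenum and the return path both need to be generously sized.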
Because of this, modern data centre cooling is as energy efficient as it is effective. The carbon footprint of the business falls as its server and communications hardware runs more efficiently. With less downtime, less strain on the servers and more productivity, the solution keeps the business running at maximum efficiency.
