Guest Blogger: Steph Johnson, Head of North America – Aspectus PR
What better timing than the first 90-degree day of summer to discuss the latest round of heat-related data center drama?
As new prop shops with exabyte-scale data-crunching requirements spring up in response to upcoming Volcker Rule legislation, and financial firms continue to coax and cajole increased computational performance from their data centers with GPUs and multi-core processors, the days of spraying down the windows of your data center with cold water are over.
High-frequency trading firms and quantitative hedge funds gain advantages by holding short-term positions in equities, options, futures, ETFs, currencies and other financial instruments that can be traded electronically. The Monte Carlo simulations and other high-repetition quantitative scenarios that help them make trading decisions require enormous processing power, and the number-one problem that can derail even the most advanced computing technology is heat.
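To get a feel for why these workloads are so compute-hungry, here is a toy Monte Carlo sketch in Python – pricing a single European call option under geometric Brownian motion. Every parameter below is illustrative, not any firm's actual model, and real desks run far richer simulations across thousands of instruments at once.

```python
# Toy Monte Carlo pricing of a European call option, to illustrate why
# these workloads eat compute: the error bar shrinks only with the
# square root of the number of simulated paths.
import math
import random

def mc_call_price(spot: float, strike: float, rate: float,
                  vol: float, years: float, n_paths: int) -> float:
    """Estimate a European call price by averaging discounted payoffs
    over simulated geometric-Brownian-motion terminal prices."""
    drift = (rate - 0.5 * vol ** 2) * years
    diffusion = vol * math.sqrt(years)
    total_payoff = 0.0
    for _ in range(n_paths):
        terminal = spot * math.exp(drift + diffusion * random.gauss(0.0, 1.0))
        total_payoff += max(terminal - strike, 0.0)
    return math.exp(-rate * years) * total_payoff / n_paths

# Illustrative parameters only. Halving the error means 4x the paths --
# which is why firms throw GPUs and multi-core boxes at the problem.
print(mc_call_price(spot=100, strike=105, rate=0.02, vol=0.3,
                    years=1.0, n_paths=200_000))
```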
Add to the mix analytics vendors making their own quantum leaps in processing and analytic speeds, and financial firms overclocking CPUs to squeeze out better performance, and you have turned your data center into one hot tamale. The financial industry is starting to see that green initiatives are more than just charitable – they are necessary for survival.
Drop It Like It’s Hot
Whether the headache belongs to the bank or to the third-party provider hosting its services in the cloud, the data center burden is simply being transferred from one party to another. The pain is real and the hangover will be worse. As financial firms become more comfortable outsourcing the management of proprietary data, companies like Amazon and Google are struggling to keep their shared pools of configurable computing resources available and secure without hitting performance walls.
Data center providers are exploring options like thermal energy storage, which reduces costs by letting companies run air conditioning systems at night, when power rates are cheaper and demands on the grid are lower. Google has released its own “best practices” for data center management:
- Measure PUE
You can’t manage what you don’t measure, so characterize your data center’s efficiency by measuring energy use. Google uses a ratio called PUE – Power Usage Effectiveness, the total facility energy divided by the energy delivered to IT equipment – to track and reduce the energy spent on non-computing functions such as cooling and power distribution (see the sketch after this list). It is important to measure often and to capture energy data over an entire year, because seasonal weather variations can have a notable effect.
- Manage airflow
Good airflow management is fundamental to efficient data center operation, and a little analysis can pay big dividends. Start by minimizing the mixing of hot and cold air with well-designed containment, and eliminate hot spots. Consider thermal modeling with computational fluid dynamics (CFD) to quickly characterize and optimize airflow.
- Adjust the thermostat
Raising the cold aisle temperature reduces facility energy use. Don’t try to run your cold aisle at 70°F; set the temperature at 80°F or higher. For facilities using economizers, running elevated cold aisle temperatures is critical, as it enables more days of free cooling and greater energy savings.
- Use free cooling
Free cooling means removing heat from your facility without running the chiller – by using low-temperature ambient air, evaporating water, or tapping a large thermal reservoir. Chillers are the dominant energy-consuming component of the cooling infrastructure, so minimizing their use is typically the largest opportunity for savings.
- Optimize power distribution
Minimize power distribution losses by eliminating as many power conversion steps as possible. For the steps that remain, specify efficient transformers and power distribution units (PDUs). One of the largest losses in data center power distribution comes from the uninterruptible power supply (UPS), so be sure to specify a high-efficiency model.
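To make the PUE and free-cooling ideas above concrete, here is a minimal Python sketch. The monthly readings, the 80°F setpoint and the approach-temperature margin are illustrative assumptions, not measurements from any real facility.

```python
# Minimal sketch: computing PUE and a naive free-cooling check.
# All readings and thresholds below are illustrative assumptions.

def compute_pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy.
    A PUE of 1.0 would mean every watt reaches the computing gear."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

def can_free_cool(ambient_f: float, cold_aisle_setpoint_f: float = 80.0,
                  approach_f: float = 5.0) -> bool:
    """Rough economizer check: free cooling is viable when outside air,
    plus an approach-temperature margin, stays below the cold-aisle setpoint."""
    return ambient_f + approach_f < cold_aisle_setpoint_f

# Hypothetical monthly readings (kWh) -- capture a full year in practice,
# so seasonal swings show up in the trend.
readings = {"Jan": (1_450_000, 1_000_000), "Jul": (1_750_000, 1_000_000)}
for month, (total_kwh, it_kwh) in readings.items():
    print(f"{month}: PUE = {compute_pue(total_kwh, it_kwh):.2f}")

print("Free cooling at 68F:", can_free_cool(68.0))   # True
print("Free cooling at 95F:", can_free_cool(95.0))   # False
```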
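And because conversion losses multiply through the power chain, a few lines of arithmetic show why cutting steps and specifying high-efficiency UPS gear pays off. The step efficiencies here are assumed for illustration, not vendor specs.

```python
# Sketch of cascaded power-conversion losses with assumed efficiencies.

def delivered_fraction(step_efficiencies: list[float]) -> float:
    """Overall efficiency of a power chain is the product of each
    conversion step's efficiency, so every extra step compounds the loss."""
    result = 1.0
    for eff in step_efficiencies:
        result *= eff
    return result

# Hypothetical legacy chain: transformer -> UPS -> PDU -> server PSU
legacy = [0.98, 0.88, 0.97, 0.85]
# Hypothetical tuned chain: high-efficiency UPS and PSU, one fewer hop
tuned = [0.98, 0.96, 0.92]

for name, chain in (("legacy", legacy), ("tuned", tuned)):
    frac = delivered_fraction(chain)
    print(f"{name}: {frac:.1%} of grid power reaches the chips; "
          f"{1 - frac:.1%} is lost along the way -- as heat.")
```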
What the Financial Tech Sector Can Teach the Government
Some innovative financial technology providers are using liquid submersion cooling to remove heat, developing non-conductive, non-volatile, biodegradable core coolants that can eliminate the need for fans, ducts and breathing room. The goal is to improve speed, performance and reliability while reducing latency and environmental impact.
In stark contrast to power-saving innovations in the financial space, the US government is currently building a 25,000-square-foot data center in the foothills of Utah’s Wasatch Range. Set to be crunching yottabytes of data by September 2013, it will have a massive air-conditioning system drawing power from its own nearby substation to support a 65-megawatt power demand.
The price tag for cooling: $40 million a year.
With all of the reputational hits the financial sector has taken over the past few years, perhaps this is our shot to make green the color of environmental responsibility instead of cold hard cash.