Businesses are under great pressure to reduce their impact on the environment — and in every industry, data center power consumption is cause for significant and growing concern.
That’s in part because most data centers are not designed for optimal power efficiency. An IDC survey found that the average data center wastes more than $1.4 million each year in electricity. A separate study found that a single server rack can cost as much as $30,000 a year to run, so even a slight increase in data center efficiency can have a significant impact on a business’s bottom line, as well as its carbon footprint.
Servers account for 43 percent of power use in data centers, and cooling accounts for another 43 percent. Any effort to reduce energy usage in these areas could yield meaningful results. Here are some steps to consider:
How Businesses Can Control Their Servers
The majority of servers run at or below 20 percent utilization most of the time, yet they still draw full power. Virtualizing servers — consolidating multiple independent servers into a single physical server — can reduce energy costs by 10 to 40 percent.
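For a rough sense of why consolidation pays off, here is a minimal back-of-the-envelope sketch in Python. The per-server power draw, target host utilization and electricity rate are illustrative assumptions, not figures from the studies cited above, and the model deliberately ignores the extra power the remaining hosts draw at higher utilization.

```python
import math

HOURS_PER_YEAR = 8760  # 24 hours x 365 days

def consolidation_savings(num_servers, avg_utilization,
                          watts_per_server=400.0,   # assumed average draw per physical server
                          target_utilization=0.6,   # assumed safe load per virtualization host
                          cost_per_kwh=0.12):       # assumed electricity rate in dollars per kWh
    """Estimate hosts remaining after consolidation and the annual energy cost savings."""
    # Total compute demand, measured in "fully loaded server" units.
    demand = num_servers * avg_utilization
    # Physical hosts needed if each one runs at the target utilization.
    hosts_after = max(1, math.ceil(demand / target_utilization))
    servers_removed = num_servers - hosts_after
    # Energy and cost no longer consumed by the retired servers.
    kwh_saved = servers_removed * watts_per_server / 1000.0 * HOURS_PER_YEAR
    return hosts_after, kwh_saved * cost_per_kwh

# Example: 100 servers averaging 20 percent utilization.
hosts, dollars = consolidation_savings(100, 0.20)
print(f"Hosts after consolidation: {hosts}, estimated annual savings: ${dollars:,.0f}")
```

Under these assumptions, 100 lightly loaded servers collapse to a few dozen hosts and the retired machines stop drawing power year-round, which is where the 10 to 40 percent savings comes from.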
Upgrading to servers with variable-speed fans or power management capabilities can offer a 30 percent increase in efficiency compared with standard servers. Look to the Federal Energy Management Program for acquisition guidance.
Place systems with similar heat load and temperature requirements together and consolidate data centers where possible, combining facilities located in separate areas of the same campus, for example.
Keeping Servers Calm, Cool and Collected
ASHRAE, a professional association for heating and cooling engineers, recommends keeping the temperature inside data centers between 64 and 80 degrees Fahrenheit. However, servers can throw off enough heat to create temperatures between 80 and 115 F.
Investigate free-air cooling, the use of outdoor air in place of traditional computer room air conditioners, when the outside air temperature is no higher than the temperature inside the data center. Keep in mind, however, that exterior air must be filtered and humidified, which can itself consume a lot of energy.
Hot- and cold-aisle containment is an established technique that arranges rows of racks so that the backs of the servers face each other. The aisles are then enclosed to keep hot and cold air from mixing: hot-aisle containment returns heated exhaust air to the cooling units, while cold-aisle containment delivers chilled air directly to the server intakes. Both approaches are more efficient than traditional room-level air conditioning.
One breakthrough in data center cooling that holds promise is liquid cooling. Chipmakers that needed to increase processing power but ran into limits from excessive heat found that chips cooled by liquid applied directly to them could operate at much higher temperatures than air-cooled chips.
Updating an existing data center with water-cooled servers is easy to do and offers immediate benefits. Sandia National Laboratories installed a liquid cooling system for its Attaway supercomputer that resulted in a significant improvement over earlier water-cooled systems. Even older servers can be retrofitted by bringing liquid to the back of a rack of servers.
One benefit of this method is that the captured heat can be reused. For example, the hot water removed from the data center through liquid cooling technology could be used to heat a commercial building.
Manage Your Power, and Power Your Savings
Any effort to improve a metric must start by measuring its current value, then find ways to improve it and validate those improvements through subsequent measurement.
To improve data center cooling, begin by understanding the actual energy performance of the data center. Guides like the one produced by Energy Star can help determine initial and subsequent energy efficiency levels.
Power usage effectiveness (PUE), the ratio of the total energy a facility consumes to the energy consumed by its IT equipment alone, is the common measurement for describing how efficiently a data center uses energy. The ideal PUE for a data center is 1; the industry average in 2020 was 1.58. Many colocation providers boast PUE values as low as 1.1, often as the result of free-air cooling and liquid cooling.
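Because PUE is simply a ratio of two metered quantities, it is easy to compute once total facility energy and IT equipment energy are measured. The snippet below is a minimal sketch of that calculation; the meter readings are invented values chosen to land near the 2020 average, not data from any specific facility.

```python
def pue(total_facility_kwh, it_equipment_kwh):
    """Power usage effectiveness: total facility energy divided by IT equipment energy."""
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative monthly meter readings (assumed values for demonstration only).
total_kwh = 790_000   # everything the facility drew: IT load, cooling, lighting, losses
it_kwh = 500_000      # servers, storage and network gear only
print(f"PUE = {pue(total_kwh, it_kwh):.2f}")  # prints 1.58, roughly the 2020 average
```

Tracking this ratio over time, rather than as a one-off number, is what makes it useful as a baseline for the improvement steps that follow.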
Next, identify opportunities for improvement. Once a baseline has been established, investigate areas that are ripe for improvement and can contribute to energy efficiency. A data center infrastructure management (DCIM) solution can help identify inefficient servers that should be eliminated, map optimal placement of equipment and more.
Implement improvements for quick wins. Make sure to avoid mixing hot and cold air. Look at virtualizing or consolidating servers and choose new servers wisely. Match server capacity to actual load; consolidate or virtualize those that have been deployed and configured for peak capacity.
Finally, use DCIM to measure progress, tracking critical metrics. By measuring progress against your baseline, you can work toward optimizing the entire performance of your data center. DCIM can give you visibility into — and control over — energy consumption and capacity.
The most carbon-efficient data center is one that you don’t build from scratch but refurbish instead. Look to liquid cooling to save energy and improve efficiency. Strive for continuous innovation: Be on the lookout for ways to reduce, reuse and recycle.
Efficiency is a crucial component of current and future power distribution, energy recovery, IT and environmental systems. In short: monitor, manage and don't hesitate to share lessons learned.