Schneider Electric News
Datacenter Dynamics: The Cool Issue
By Peter Judge, August 3, 2015
Kick back, get an ice cream, and let’s talk about how to handle cooling in your data center.
The best way to beat the heat is to do as little as possible; any lizard knows that, and so do the coolest data center managers. Overworking just wastes energy.
But a lot of data centers are still doing just that – overworking and overcooling their spaces. In the process, they are wasting vast quantities of energy and – ironically – contributing to global warming and melting the world’s polar ice caps.
Cooling for the tape era?
Chilly data centers date back to the 1950s, when tape drives could not stand high temperatures, and humidity crashed the system by making punchcards stick together. We are no longer in that era, and yet a lot of data centers still have aggressive and unnecessary cooling regimes, rigidly keeping their ambient temperature at 21°C (70°F).
Things started to change when ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) defined where temperatures should be measured, with a specific recommendation for the server inlet temperature. That recommendation has risen as the improved reliability of IT kit has become more widely accepted, and it now stands at 27°C (80.6°F).
Web-scale data centers pay attention to this, and through a process of trial and error are operating at higher temperatures still. But enterprise data centers are seriously lagging.
In early 2015, IDC surveyed 404 data center managers in the US, all of whom have at least 100 physical servers, and who have an average IT budget of $1.2m. Fully 75 percent of them were operating below 24°C, and only five percent were at 27°C or above.
These facilities have PUE (power usage effectiveness) ratings of around 2.4 to 2.8 – meaning that roughly 58 to 64 percent of the power they consume never reaches their IT equipment.
The result is doubly shocking when you consider two facts. First, IDC found that these IT managers are spending 10 percent of their budget on cooling, out of a 24 percent segment for power and cooling combined. So each of these organizations is spending around $120,000 a year on cooling, much of which may be unnecessary.
The other fact to consider is that, while the efficient web-scale cloud providers get the media attention, they are only a small percentage of the data centers of the world. At least half the world’s racks are in those overcooled enterprise sites. To make an impact on global emissions, these are the data centers that need to change.
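The arithmetic behind those two facts is simple enough to sketch. The following Python snippet uses the PUE range and the budget figures from the IDC survey above; the calculations themselves are just the definitions, not anything from the survey:

```python
# Back-of-the-envelope figures from the IDC survey cited above.

def it_power_fraction(pue):
    """Fraction of total facility power that actually reaches the IT load."""
    return 1.0 / pue

for pue in (2.4, 2.8):
    wasted = 1.0 - it_power_fraction(pue)
    print(f"PUE {pue}: {wasted:.0%} of facility power never reaches the IT equipment")

# Average IT budget and the cooling share reported by IDC.
it_budget = 1_200_000   # $1.2m average IT budget
cooling_share = 0.10    # 10 percent of the budget goes to cooling
annual_cooling_spend = it_budget * cooling_share
print(f"Cooling spend: ${annual_cooling_spend:,.0f} per year")
```

At PUE 2.4, only 1/2.4 ≈ 42 percent of the power drawn does useful computing; the rest goes to cooling and other overhead.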
Paranoia, or just being careful?
So why are data centers still too cool? Ian Bitterlin of Critical Facilities Consulting is in no doubt that fear is what drives it: “It’s paranoia.” People are excessively risk-averse.
But it might be more rational than that. At least one study has found that raising the air inlet temperature actually increased the amount of energy used in cooling.
“We went in fully sure we would be saving energy,” says Victor Avelar, senior research analyst at Schneider Electric, describing a study that compared the cooling energy needed at different temperatures for data centers in Chicago, Miami and Seattle. “But we found that above 27°C cooling took more energy and capital expense.”
The Schneider study – due to be published shortly – compared data centers with a standard chiller unit. The surprising result came about because of the complexity of the system. At higher temperatures, server fans come into play and more energy is used moving air around.
If you look into this, you will need to know your technology options. We are mostly starting – as the Schneider study did – from cooling using a traditional air-conditioning unit with a compressor, often referred to as a “direct expansion” (DX) unit.
Keeping your inlet temperature below 24°C – that’s paranoia
Ian Bitterlin, Critical Facilities Consulting
In most locations, there’s no other way to maintain the ultra-low temperatures that many people still think are necessary, and in many places the DX is in the mix to cover extreme conditions and reassure the service users.
If this is what you have, there are two main things you can do to cut your energy bills before you think of changing your cooling technology. First, as ASHRAE pointed out, you can feed your servers warmer air, thus cutting down the amount of cooling you do. Though Schneider also stresses that if you do this, you should know what the fans in your servers will be doing.
If you let the inlet temperature go up to 27°C, the temperature in the hot aisle at the back of the servers will be around 35°C. You will want to make sure all the connectors are on the front of the system, as no one will want to spend much time in the hot aisle.
Secondly, any cooling system works more efficiently when it is working on a high temperature difference (delta-T). That’s slightly counter-intuitive, but it’s basic thermodynamics: there’s a bigger driving force to move the heat when delta-T is greater.
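A toy calculation makes the delta-T point concrete. Using the standard sensible-heat relation Q = ṁ·cp·ΔT (the heat load and temperature figures below are illustrative, not from the article):

```python
# Sensible-heat relation: Q = m_dot * c_p * delta_T.
# For a fixed heat load, the mass of air you must move falls as delta-T rises.

C_P_AIR = 1.005      # kJ/(kg*K), specific heat of air at room conditions
HEAT_LOAD_KW = 10.0  # illustrative per-rack heat load

def airflow_required(heat_load_kw, delta_t):
    """Air mass flow (kg/s) needed to carry heat_load_kw at a given delta-T (K)."""
    return heat_load_kw / (C_P_AIR * delta_t)

for delta_t in (5, 10, 15):
    print(f"delta-T {delta_t:>2} K -> {airflow_required(HEAT_LOAD_KW, delta_t):.2f} kg/s of air")
```

Doubling the delta-T halves the airflow needed for the same heat load, and since fan power rises steeply with flow rate, that translates into real energy savings.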
This is one reason why it’s good to contain the hot air coming out of the back of your servers and exclude the cool air that slips past the racks. Hot-aisle containment means your cooling system is only working on the air that needs to be cooled.
Once you have done all that, your DX system will be doing less work, and you could have a partial PUE (pPUE) of around 1.16, according to Bitterlin. Alternatively a chilled water system (where the refrigeration unit’s cooling is distributed using water) can get down to a pPUE of 1.12.
Doing without DX
But do you need your DX at all? ASHRAE publishes maps showing where in the world the climate is cool enough for outside air to be used to cool a data center all year round. Most of the US is in this zone, and so is the UK, where the record dry bulb temperature is 34°C and the highest wet bulb temperature (with evaporation) is 23°C.
This is the world of “outside air” cooling, “free” cooling or “adiabatic” cooling – all words that mean cooling without using the air-con. Sometimes filtered outside air is circulated through the data center (“direct” free cooling) and sometimes a secondary circuit is set up (“indirect” free cooling). Adding water evaporation on the cooling coils may be needed when the outside temperature is higher.
This might get you to a pPUE of 1.05, says Bitterlin, but there are some complications. One issue is that PUE depends on the utilization of a data center. If there are unused servers, this can increase the PUE, but adiabatic cooling has an opposite trend: “Under a partial load, adiabatic gets better,” he says. This means that beyond a certain point, chasing a lower PUE can be counter-productive. “We caution against being enslaved to PUE and having all your future strategies dictated by it,” says IDC research manager Kelly Quinn.
PUE isn’t everything
Avelar agrees: “PUE has done great things for the industry, but it is important to not look at that blindly.” In his example, when the server fans switched on, the PUE of the data center would go down – appearing to improve – even while the overall energy used to cool it was going up and its efficiency was getting worse.
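Avelar’s paradox can be illustrated with a toy model (all of the kilowatt figures below are hypothetical): because server fans count as IT load, the PUE ratio can fall even as total facility power rises.

```python
def pue(it_kw, overhead_kw):
    """PUE = total facility power / IT power."""
    return (it_kw + overhead_kw) / it_kw

# Hypothetical facility before and after raising the inlet temperature:
# the server fans (counted as IT load) spin up, and the cooling plant
# works slightly harder as well.
before_it, before_overhead = 500.0, 150.0
after_it, after_overhead = 540.0, 155.0

print(f"PUE before: {pue(before_it, before_overhead):.3f}")  # 1.300
print(f"PUE after:  {pue(after_it, after_overhead):.3f}")    # lower, i.e. 'better'
print(f"Total kW before: {before_it + before_overhead}, after: {after_it + after_overhead}")
```

The PUE improves while the facility actually draws more power overall, which is exactly why Avelar and Quinn warn against chasing the metric blindly.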
Avelar warns that adiabatic cooling kit can raise availability concerns. These might be “paranoid,” but there are physical limits to what outside air can achieve, and in some parts of the world the concern will be justified.
More practically, adiabatic units are big and heavy, and will be tricky to retrofit into older data centers. New sites will try to put them on the roof, although they have to be fairly close to the IT floor.
Sound complicated? It all boils down to keeping a cool head and doing the math while your servers get warmer.
Check out the full article:
CRN: The Top 25 Channel Sales Leaders Of 2015
By Michael Novinson, August 3, 2015
Senior Vice President
APC by Schneider Electric
Rob McKernan, one of the great channel advocates, is driving a managed services channel charge at APC that is destined to forever change the power protection solutions landscape. A big plus for APC partners: McKernan is not only overseeing worldwide channels, but also R&D for products sold through partners. Look for an APC channel charge the likes of which you have never seen before, thanks to McKernan's leadership.
Mission Critical: Schneider Electric’s Uniflair LE Room Cooling Series Receives DOE Certification For Energy Efficiency
August 3, 2015
The cooling series offers any data center environment a sustainable cooling solution and has received DOE certification.
Schneider Electric has announced that the Schneider Electric Uniflair™ LE Room Cooling series of solutions has received the Department of Energy’s (DOE) certification for energy efficiency. In a continued commitment to energy efficiency in the data center environment, Schneider Electric’s Uniflair LE was designed to offer any data center environment a sustainable cooling solution, and it received DOE certification under the new standards that encourage the deployment of green technologies. Schneider Electric is committed to providing products that meet DOE requirements and, in turn, the guidelines held by the California Energy Commission (CEC), which are based on the DOE’s “Appliance Efficiency Standards.”
With the utilization of highly efficient electronically commutated (EC) fans, intelligent controls and optimization during part-load operation, the Uniflair LE exceeds new and future energy standards. By combining cutting-edge technology to optimize energy and environmental sustainability, the Uniflair LE perimeter units provide efficient cooling for any data center environment, with a low cost of ownership through economization and smart operation, all while maintaining a compact footprint. Additionally, the Uniflair LE Precision Cooling units are completely configurable to meet and adapt to any application for continuous and reliable operation.
“As a company founded on the principle that everyone should have access to safe, reliable and efficient energy, we continue to provide the most environmentally conscious and cost-effective solutions to our customers in all facets of our evolving business,” said John Niemann, director of cooling product management, Schneider Electric. “The DOE energy efficiency certification is critical to demonstrate our commitment to providing customers with green solutions. Helping our customers reduce energy waste is at the forefront of what we do, so we’re especially pleased to be working with both the DOE and CEC to do our part and ensure that the industry is moving toward a more sustainable future.”
The DOE has regulated the energy efficiency levels of computer room air conditioners (CRACs) since 2012 to encourage the advancement of green solutions, and has developed uniform test methods for the measurement of energy efficiency. The new DOE energy efficiency certification requirements establish a balanced standard that meets the need to decrease energy consumption while also reducing operating costs for end users.
Chief Home Officer: At Last, a Battery Backup for the Smaller Home Office Components
By Jeff Zbar, July 29, 2015
For anyone working from a South Florida home office, summertime blackouts and brownouts are a common occurrence. See the clouds massing in the distance, and you know you could be hit with a loss of power. You’re confident you can keep working; after all, your PC and monitor are linked to a battery backup, or UPS (uninterruptible power supply). But what about your modem and router, or even your phone or headset? They draw so little power, but they’re so critical to business continuity.
APC by Schneider Electric has created the lithium-ion Network Battery Backup + Mobile Power Pack. Available in September, this product is designed for those often overlooked devices. The UPS is ideal for ensuring power to the products that are often just as critical to ongoing productivity, no matter the conditions at the home’s power junction.
After all, it says so right on the box: “Exclusively for use with low-power devices: Broadband modems, Wireless routers, Voice-over-IP or cordless phones, Connected home hubs, Smartphones and tablets, and Wi-Fi home security cameras.”
Of course, it’s not for the PC, fax or printer (in fact, most UPS makers and subject matter experts discourage using a UPS for peripherals like fax machines or printers, as they tend to draw too much power).
Measuring just 11 inches by seven inches by two inches, the Network Battery Backup is small enough to rest or hang almost anywhere, especially near a cluster of those devices. And it’s powerful enough to keep two such devices running in the short term, at least until power is restored.
The Network Battery Backup isn’t some behemoth created to power ALL those devices at once. It only has two outlets. But it’s a perfect complement to a larger, more robust battery backup, like the Back-UPS Pro 1300 that currently sits on my desk to provide clean power in the event of a brownout and close to an hour of power during full-fledged outages. It also offers computer-grade surge protection and $75,000 in lifetime connected equipment protection.
And with the removable Mobile Power Pack, I never have to worry about my iPhone going dry of power during a busy day spent away from the home office. Once fully charged, just pull it from the slot in the Network Battery Backup and it will charge a smartphone five times or a tablet once.
It’s an ingenious design in that way. How often do you go to leave the home office or head out for the day, only to realize at the last minute that you might need mobile power? In my home office, the portable battery is invariably sitting in my shoe-rack tech holder, bone-dry of power.
With the Network Battery Backup + Mobile Power Pack, I can be assured the power pack is charged and ready.
At five by two by one inches, it’s larger than many other portable batteries; it’s a bit bigger than the iPhone I use it to charge. But with five charges on board, it more than earns its slightly larger heft.
Power in the connected home and home office has been evolving. Finally, some of the most important yet historically overlooked devices, like the modem, router, phone or headset, are getting their due – and power.