Isolation is always key in airflow-management efforts



From the March 2014 Issue of Cabling Installation & Maintenance Magazine

Isolating hot air from cool sometimes means containing an entire aisle; it always means having a strategy that acknowledges change.

By Patrick McLaughlin

Hot and cool air are to a data center administrator what client funds and personal funds are to a real estate broker; commingling them leads to trouble. In the case of real estate, the trouble is of the legal variety. In the data center, the story can be far more complex. The commingling of hot and cool air almost always is unintentional, the trouble caused is not always readily diagnosed, and the remedy does not always have clear "sentencing guidelines," to use the legal analogy one final time.

The costs of inefficient airflow management can be summed up in two words, according to Peter Crook, president and chief executive officer of Upsite Technologies (www.upsite.com): "Energy and uptime." In an interview with Cabling Installation & Maintenance, Crook was asked if, when it comes to airflow management, data center administrators fall into a bell curve-style grouping in which a small percentage are at the forefront of efficiency, another small percentage lag far behind, and a majority fall in the middle. That description does not fit, he explained. Rather, he said, "It has to do with where you are in the data center lifecycle. If you are starting from scratch with a new data center, you'll plan, think through options, and move forward." Administrators with this opportunity can conceive of and implement an airflow-management strategy for a computing system and physical plant that are known. But that description does not apply to everyone. Crook continued, "On the other end of the spectrum are those who have an existing facility and are faced with packing densities within it." In most or all of these cases, an original airflow-management design that well-accommodated the original computing environment simply cannot effectively handle this altered, more-densely packed space.

Photo 1: The KoldLok 2-post rack grommet prevents bypass airflow by sealing off the rectangular area in the 2-post structure of the rack, while also preventing dust and other debris from getting trapped in hard-to-reach areas.


Encountering the unexpected

How data center administrators deal with these additional densities is precisely the challenge. And some administrators facing that challenge may not have planned, and may not be equipped, to do so. Specifically--but by no means exclusively--the multitenant, colocation facility is a current battleground. Increasing numbers of enterprises are outsourcing computing services and, in doing so, shifting the challenges of data center administration to these outsource providers. The shift is happening in such significant numbers that many colo facilities face the proverbial "problem they like to have": an abundance of business, and the administration challenges that come with it. Though it's the "problem they like to have," it is a problem nonetheless.

Photo 2: Simplex Isolation Systems' AirBlock Gap Seals are available in models that seal gaps from one-half inch to three inches in width, and from one-half inch to six inches in width.

The lack of a cohesive, comprehensive airflow-management strategy typically results in locally concentrated, high-energy-consumption cooling efforts. Recently, Upsite Technologies assigned some quantifiable numbers to these types of efforts, illustrating how inefficient they can be. A white paper published by Upsite--co-authored by its founder, the late Kenneth Brill, as well as Upsite senior engineer Lars Strong, P.E.--is titled "Cooling Capacity Factor (CCF) reveals stranded capacity and data center cost savings." In that paper Upsite introduces the CCF concept, a quantification of the amount of cooling capacity in a data center in relation to the facility's heat load. Upsite examined 45 facilities when compiling information for the document.

Addressing CCF

The white paper's executive summary explains, "The average computer room today has cooling capacity that is nearly four times the IT heat load … This white paper will show you how you can calculate, benchmark, interpret and benefit from a simple and practical metric called the Cooling Capacity Factor.

"Of the 45 sites that Upsite reviewed, the average running cooling capacity was an astonishing 3.9 times (390 percent) the IT heat load. In one case, Upsite observed 30 times (3,000 percent) the load. It is hard to believe that sites are this inefficient.

"When running cooling capacity is excessively over-implemented, then potentially large operating cost reductions are possible by turning off cooling units and/or reducing fan speeds for units with directly variable fans or variable frequency drives (VFD). Though a great deal of focus is placed on improving computer room cooling efficiency, the average data center could reduce their operating expense by $32,000 annually simply by improving airflow management (AFM).

"AFM improvements increase cooling efficiency, which could result in immediate operating cost savings and greater IT system reliability. As cooling represents approximately half of infrastructure costs, PUE [Power Usage Effectiveness] improves as well. With a reduction in energy usage, everyone benefits, as carbon emissions are also reduced.

"The same AFM improvements also release stranded cooling capacity, which enables companies to increase server density without the capital cost of additional cooling equipment. Improved cooling utilization may also extend the life of a site, deferring capital expenditure required to build a new data center.

"Numerous solutions are designed to improve cooling efficiency, ranging from something as simple and important as blanking panels to complete containment systems. Hype abounds when it comes to the potential benefits of each new 'best practice.' How can you truly know what potential there is to make a difference? Will you be able to deploy more IT equipment? Will you eliminate hot spots and/or reduce the PUE for your data center? How much of a difference will improved AFM make at your site? To make informed decisions about investing in additional cooling capacity or AFM initiatives, you should first determine how well you are utilizing your current resources.

"Calculating the CCF is the quickest and easiest way to determine cooling infrastructure utilization and potential gains to be realized by AFM improvements."

The nine-page white paper provides a historical perspective on data center layouts, findings from the 45 sites Upsite analyzed, and the benefits of "right-sizing" cooling resources, as well as guidance on calculating and interpreting your own facility's CCF.
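The metric itself lends itself to simple arithmetic. The short Python sketch below computes a CCF as the plain ratio of running cooling capacity to IT heat load; that ratio, along with the function and variable names, is an assumption based only on the article's description of CCF as cooling capacity "in relation to the facility's heat load," not Upsite's exact published formula.

```python
def cooling_capacity_factor(running_cooling_kw: float, it_heat_load_kw: float) -> float:
    """Return the ratio of running cooling capacity to IT heat load.

    Assumption: the article describes CCF only as cooling capacity "in
    relation to the facility's heat load," so this plain ratio is an
    illustrative simplification, not Upsite's exact published formula.
    """
    if it_heat_load_kw <= 0:
        raise ValueError("IT heat load must be positive")
    return running_cooling_kw / it_heat_load_kw


# The average site Upsite reviewed ran cooling at 3.9x the IT heat load;
# the worst case ran at 30x. (The kW figures below are hypothetical.)
print(cooling_capacity_factor(390.0, 100.0))   # average case: 3.9
print(cooling_capacity_factor(3000.0, 100.0))  # worst case: 30.0
```

A CCF well above 1 suggests stranded capacity: cooling units that could be turned off, or fan speeds that could be reduced, without starving the IT load.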

Products and services

As the paper's executive summary stated, a number of solutions are available for improving airflow management. Upsite offers a number of such solutions. Most recently the company introduced the KoldLok 2-Post Rack Grommet. When making the introduction in December, Upsite noted, "The easy-to-install KoldLok 2-Post Rack Grommet prevents bypass airflow by sealing off the rectangular area in the 2-post cabinet, while promoting data center cleaning best practices, preventing dust and other debris from getting trapped in hard-to-reach areas."

Photo 3: Chatsworth Products Inc.'s Build To Spec Hot Aisle Containment Solution arrives on a pallet, as shown here, and contains eight unique components to field-fabricate a duct over a contained aisle.

The company listed among the grommet's features a split design that allows for retrofit installation; a design that works with variations in rack-material thickness; an installation that requires no cutting, modifying or tools; and RoHS compliance.

Prior to that introduction, and in conjunction with its establishment of the CCF metric, Upsite introduced the EnergyLok Cooling Capacity Assessment. "A vendor-neutral service, the Cooling Capacity Assessment works in three steps," Upsite explained in September 2013 when announcing the service. "The first step is to calculate the computer room CCF, which indicates the level of over or under cooling. The on-site review then takes a comprehensive assessment of the data center's current AFM, with a checkout meeting to explain the initial findings. Finally, a full customer report is provided to identify specific improvements for best utilizing the cooling infrastructure, followed by a teleconference to discuss the details of the report."

Upsite's Strong said at the time that the service will enable data center managers "to save money, maximize capacity, and improve IT reliability. Our assessment applies our knowledge of the CCF and AFM best practices, and our checkout meeting and conference call ensure that a data center fully understands the gathered information to implement the best solutions to fit their cooling needs and maximize their capacity."

Management tools

Another set of airflow-management tools was introduced by Simplex Isolation Systems (www.simplexstripdoors.com) in 2013. The AirBlock Flexible Gap Seals (FGS) "increase cooling efficiency in your data center by closing off those hard-to-seal gaps of a half-inch to six inches between server racks," the company explained. "The result--drastically reduced energy costs and added capacity in your data center."

The FGS seals attach to racks with a flexible magnetic side trim, Simplex explains, and they are designed to be easily installable. "Squeeze the two magnet edges together and slide into the gap," the company instructed. "The magnets seal to the sides of the racks, completing the installation. Trim the seal to fit the rack height with a knife or scissors."

They are available in two materials: black PolySim polyurethane and black FlexSim PVC. "Both materials are available only from Simplex and are Class 1 fire-rated to meet NFPA 701 and NFPA 76 standards for data center applications," the company added. "PolySim is a non-outgassing material and has permanent static-dissipative properties."

Another recent introduction in the airflow-management realm comes from Chatsworth Products Inc. (CPI; www.chatsworth.com): an innovation in the deployment of aisle-containment systems. CPI offers an entire set of solutions dedicated to aisle containment, and in February it debuted the Build to Spec (BTS) Hot Aisle Containment (HAC) Solution.

As CPI explained when announcing the BTS HAC Solution, it "arrives on a pallet and includes eight unique components to field-fabricate a duct over a contained aisle. The solution can be used over a mix of cabinets of varying sizes and can be ceiling- or cabinet-supported, making it ideal for retrofit applications."

"The BTS Kit allows you to be really flexible, whether you're in a new or retrofit installation," explained Sam Rodriguez, CPI's product manager of cabinet and thermal management solutions. "The design allows you to custom-fit the containment to the cabinets in a data center environment and addresses a multitude of variables that are site-specific, while still maintaining a very effective seal to maximize performance. The BTS Kit allows you to deploy the containment regardless of the dimensions of the installed cabinet and equipment."

CPI listed the kit's other features as an elevated, single-piece duct that allows cabinets to be removed, omitted or replaced as required; translucent duct and door panels that allow light to enter the contained aisle; and doors that close automatically to maintain containment and reduce recovery time.

Concepts that may seem abstract on the surface--like a numerical quantification of cooling-resource use or the presence of different-height cabinets in a data center--become entirely practical when their efficiency directly affects a facility's energy bill and its ability to stay up and running. As we have stated in these pages many times, cable-management practices like overhead or underfloor routing and proper dressing away from network-equipment exhaust can and do affect airflow management. Likewise, opportunities exist at the rack, aisle and room levels to maximize the efficiency of airflow management.

Patrick McLaughlin is our chief editor.


Peter Crook returns to Upsite as top executive

Peter Crook returned to Upsite Technologies on January 20 as its president and chief executive officer--a position he also held with Upsite from 2003 to 2009 alongside the organization's founder and chief technology officer, the late Kenneth Brill.

"Upsite Technologies is well recognized in the data center space and has a strong base of global customers from which to continue to grow the business," Crook said when his return was announced. "I am excited to take the leadership role once again as Upsite gears up to release a number of new products and services for data center airflow management."

He brings more than 25 years of executive management experience to Upsite, including significant experience as startup CEO, vice president, product manager and sales and marketing manager in software, services and manufacturing spaces. Most recently he was managing director with EA2, an executive management consulting company focusing on business, organizational and product development for small to medium-sized high-technology companies. Prior to his previous tenure with Upsite, Crook was vice president of development for LizardTech, which was acquired by Celartem/Extensis; there he drove the commercial development of MrSID, which is now a global geospatial standard for image storage and retrieval.

Mark Germagian, chief technology officer of Geist Global and an executive board member for Upsite, commented, "We are delighted to have an industry veteran like Peter take the leadership role at Upsite Technologies. Peter's deep knowledge of the data center market and IT product development will greatly benefit the company as it prepares to launch a new suite of products."
