Data center drives full recovery

Dec. 1, 2011
A medical center in Tornado Alley shored itself up against power surges and outages.

From the December 2011 issue of Cabling Installation & Maintenance Magazine

By Carol Everett Oliver, RCDD, ESS

Tift Regional Medical Center in Tifton, GA, located in the heart of Tornado Alley, experienced one too many “lights-out” power surges, which prompted a major network transplant. In 2008 a storm shut down the power and subsequently the network. Although the generator jumped into action, the transfer switch did not trip, resulting in failed uninterruptible power supply (UPS) systems. It was a hard stop that jolted the information technology (IT) department and hospital administrators into immediately assembling a team to research and build a solid network and data center.

Tift General Hospital was built in 1965, but has experienced tremendous growth and became Tift Regional Medical Center 10 years ago. “When I started in 1967 there were 85 beds and today there are almost 200, serving 12 primary counties with an additional six counties as our secondary market,” states Sarah Thompson, vice president of operations and in-house historian. “We service 250,000 people and are also a rescue location when there are major disasters. So to properly serve them well, our doctors need to have up-to-date patient information at their fingertips. Designing and building a reliable data center became our primary focus,” she adds. “The hospital was behind this, as everyone has a vested interest.”

A highway of tiered Cablofil wire baskets was installed above the cabinets to separate the fiber and copper cabling pathways between Tift’s A and B networks.

Fast-forward three years: after arduous and cooperative teamwork that included the board of directors, the IT department, contractors and subcontractors, the new 15,000-square-foot Tift Regional Medical Center data center facility opened. Today it is an exemplary N+1 data center, with plans to double in size and capacity.

Team building

“When I came here 12 years ago we had six servers,” recalls Guy McAllister, assistant vice president and chief information officer (CIO) for Tift. “Today we are over 300. In addition, we’ve gone from 200 PCs to 1600. Our fast growth fueled the need for a data center and was realized by the hospital’s board of directors.”

The concept of the data center started in 2004, but the outages of 2008 became the catalyst. “In 2005 the first round of HIPAA security initiatives came out, and we moved from being a financially driven organization to being more IT-focused, to address electronic media and the importance from a business-continuity aspect,” McAllister adds. The IT department recognized the industry trends and government mandates, such as HIPAA (the Health Insurance Portability and Accountability Act) and the HITECH (Health Information Technology for Economic and Clinical Health) Act, driving total electronic health records (EHR), computerized physician order entry (CPOE) and total Internet Protocol (IP) in the healthcare environment. This led McAllister and his team to coordinate a feasibility study on a disaster recovery site. The conclusion was that they needed an efficient off-site data center and a reliable infrastructure.

For maximum efficiency of airflow and cooling, the Tift Regional Data Center equipment rows are arranged in a hot-aisle/cold-aisle setup.

After a hired consultant conducted the research, the team decided to build a Tier 2, N+1 data center, which turned into a $10-million project. “I took this concept to the board in 2009 and explained the catastrophic effects of losing a compressor, and that an overheated computer room would shut down the entire network. They sat up to listen and the planning began,” notes McAllister. “Usually it is the lone voice of the CIO that tries to get funding for a data center. In this scenario, I was backed by a team who understood the need. Our CEO is a great visionary, and he realizes the importance of technology and its related effects on the care we can provide to our patients by providing better informational tools to our doctors and clinicians,” he explains.

Once the board of directors approved the plans and budget, CooperCraft Communications (www.coopercraftcommunications.com), a major local contractor, was selected as the cabling installer through a bid process. “We are a relational hospital, as everybody has a vested interest,” states Thompson. “Therefore when we chose to build the data center, we selected local contractors because we knew they had the skill sets, but also knew that the services provided by this hospital personally affect them.”

“You could call this facility a ‘team building’ as it was an invested team that built it—from the financial backing of the board of directors to the hands-on work by the contractors and subcontractors,” adds McAllister.

Built for bandwidth

Healthcare facilities are built to last for decades. And although the buildings themselves can withstand time, the network infrastructure needs to be reevaluated periodically to keep up with emerging technologies. Upgrading the infrastructure can mean increasing bandwidth capacity for both backbone and horizontal cabling. The first step is assessing and locating the outside-plant cable that connects the campus and data center.

Tift Regional Medical Center, a 20-building campus, had been operating off of a “shoestring” backbone. The construction team found that the entire outside plant consisted of a single 24-strand fiber cable, installed in 1998, that ran under the street from the old computer room, which sat in a former doctor’s office that had since been converted to a financial building. The cable ran through three pull boxes. Using metal fishtape, CooperCraft located the cable and pull boxes in order to add a 12-strand singlemode outside-plant fiber-optic cable for 10-Gbit/sec connectivity to the storage area network (SAN) located in the old data center. CooperCraft also ran two 72-strand singlemode outside-plant fiber-optic cables over diverse paths to the new offsite data center.

“We installed Berk-Tek’s ArmorTek singlemode cables between the A-side and the B-side of the existing hospital network,” says Jonathan Ouzts, RCDD, structured cabling project manager for CooperCraft. “The armored cable saves time and money in installation because it eliminates the need for innerduct.”

“We are utilizing Cisco Nexus network switching gear that large hosting data centers and central offices utilize,” explains Wade Brewer, CSE, CTS, CSS, LVUC, director of technology services at Tift Regional Medical Center. “For these new core switches, we have nothing less than 10G routing in the facility. We have multiple 10G ports on each switch, which now gives us the capability to push 10G from our core out to our other facilities.

Tift Regional Medical Center serves 12 primary counties comprising an estimated 250,000 residents in southern Georgia.

“As for storage, we have several SAN environments and are probably at a minimum of 100-terabyte capacity. Our main Internet bandwidth will be two 50-Mbit/sec circuits, with capacity to expand to 100 Mbits/sec on the fly if need be. We are built for network redundancy, from the cabling to the switching.”

Total redundancy

The N+1 designation equates to total redundancy in the critical areas of cooling, data cabling, storage and power: for each system there is at least one more unit than the load requires, so any single component can fail or be serviced without taking the room down. The new Tift Regional data center consists of 15,000 square feet in a standalone building, with plans to double the size on the same piece of property by knocking through the existing back wall. Because power was the main thrust behind building this data center, there are two backup generators with a 500-kW transfer switch. Cooling is provided through 10 Liebert computer room air conditioning (CRAC) units, five per side.
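As a rough, back-of-the-envelope illustration of that N+1 principle, the short Python sketch below checks whether a pool of CRAC units can still carry a room’s heat load after any single unit fails. The per-unit capacity and heat load in the example are assumed figures for illustration only, not Tift’s actual numbers.

# Illustrative N+1 check for cooling (or any other redundant system).
def covers_load_n_plus_1(unit_capacity_kw, unit_count, load_kw):
    """True if the remaining units still cover the load after one unit fails."""
    return (unit_count - 1) * unit_capacity_kw >= load_kw

# Hypothetical example: 10 CRAC units, five per side, each assumed to deliver
# 30 kW of sensible cooling against an assumed 250-kW room heat load.
print(covers_load_n_plus_1(unit_capacity_kw=30, unit_count=10, load_kw=250))  # True: 9 x 30 kW = 270 kW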

The data center has two redundant networks, referred to as “A” and “B,” tied to the main network core. There are currently five rows of server and storage cabinets, with about 15 cabinets per row lined up back-to-back in a hot-aisle/cold-aisle formation. The network core that houses the core switches is located against the wall rather than in the middle of the SAN area. “The reason for this layout is so that when we expand out to a total of 4,000 square feet, the network core will be located in the middle,” says Jerry Cooper, chief executive officer of CooperCraft Communications. “This is currently in a five-year growth plan, but they may be expanding faster than anticipated,” he adds.

A highway of tiered Cablofil (www.cablofil.com) wire baskets was installed above the cabinets to separate the fiber and copper cabling pathways between the A and B networks. Berk-Tek’s (www.berktek.com) 24-strand Adventum multimode cables, built with OM3 50-micron fiber, connect the core network to the two edge switches on the ends of each server row for both networks.

“All of the fiber was field-terminated, which totaled 5,088 anaerobic LC terminations for the horizontal multimode fiber, plus an additional 630 multimode and 530 singlemode terminations for the backbone,” states Ouzts. “We connectorized all the fiber cable at our facility and prebuilt the assemblies in fire-retardant mesh pulling eyes, which acted like socks, to make it easier to pull through the pathways and into the cabinets,” he describes. “In addition, every fiber cable was tested bidirectionally with an optical power meter, and the backbone was additionally referenced and documented with OTDR traces,” he adds. All the fiber cable was terminated into Ortronics (www.ortronics.com) Rack Mount Fiber Enclosures with six-port LC duplex adapter panels, which were mounted into the cabinets.
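Those bidirectional power-meter results are typically judged against a calculated loss budget for each link. The Python sketch below shows the general idea; the attenuation and connector-loss allowances are typical published values used here as assumptions, not the figures CooperCraft tested against.

# Minimal sketch of a link loss-budget check for a field-terminated fiber run.
def loss_budget_db(length_km, connector_pairs, splices,
                   fiber_db_per_km=3.5,   # assumed multimode attenuation at 850 nm
                   connector_db=0.75,     # assumed allowance per mated connector pair
                   splice_db=0.3):        # assumed allowance per splice
    return length_km * fiber_db_per_km + connector_pairs * connector_db + splices * splice_db

def link_passes(loss_a_to_b_db, loss_b_to_a_db, budget_db):
    """Average the two test directions and compare against the budget."""
    return (loss_a_to_b_db + loss_b_to_a_db) / 2 <= budget_db

budget = loss_budget_db(length_km=0.1, connector_pairs=2, splices=0)  # a 100-m horizontal run
print(round(budget, 2), link_passes(0.9, 1.1, budget))                # 1.85 True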

Copper connectivity consisted of 10 Berk-Tek LANmark-1000 enhanced Category 6 cables to each server cabinet, or 20 per cabinet across the A and B sides, totaling 1,060 horizontal connections. The copper cabling runs in separate pathways and terminates into Ortronics Clarity6 24-port patch panels, which are also segregated by port for the two networks. The Berk-Tek cables and corresponding Ortronics patch cords were color-coded to differentiate the two networks: green for A and blue for B.
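The drop counts roll up in straightforward fashion. The short sketch below reproduces the arithmetic under an assumed count of roughly 53 server cabinets, a figure inferred only to make the article’s 1,060 total work out; the cabinet count itself is not stated.

# Back-of-the-envelope tally of the horizontal copper plant (illustrative assumptions).
runs_per_cabinet_per_network = 10     # from the article: 10 Category 6 runs per cabinet, per network
networks = 2                          # the redundant A and B sides
server_cabinets = 53                  # assumed; the article gives only the 1,060 total

horizontal_runs = runs_per_cabinet_per_network * networks * server_cabinets
panels_needed = -(-horizontal_runs // 24)   # 24-port Clarity6 panels, rounded up

print(horizontal_runs, panels_needed)  # 1060 runs -> 45 panels under these assumptions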

Additional Category 6 cabling was used for the 150 telecommunications outlets in the administration, IT and help-desk offices located in the data center building. The IP access-control and security-camera systems were also connected to the network over the LANmark-1000 cable, and designated purple.

Assuring efficiency

“Selecting a cabling system vendor involved looking at quality, workmanship and experience,” explains Brewer. “We narrowed the selection down to four different brands and did a ‘bake-off’ to compare the parts and pieces. We selected Berk-Tek and Legrand Ortronics due to their products and reputation in the industry.”

Jonathan Ouzts, RCDD, structured cabling project manager (left), and Jerry Cooper, CEO of CooperCraft Communications (right), pose inside the Tift Regional Data Center.

“Together Berk-Tek and Legrand provided comprehensive 25-year warranties for their copper and fiber cabling systems—NetClear SM for the outside-plant singlemode fiber, NetClear MM10 for the 50-micron fiber backbone and NetClear GT2 for the enhanced Category 6 copper horizontal cabling. In addition, the workmanship is also guaranteed since CooperCraft is one of NetClear’s trained and certified installers,” states Don Brady, manufacturers’ representative with Linc Inc. “As a result, the Berk-Tek and Legrand cabling products have been adopted as the hospital standard going forward.”

“Since the power failure was our wake-up call, we wanted to make sure that this data center can withstand all of the environmental elements that we face in this part of the country, referred to as Tornado Alley,” notes McAllister. “To assure maximum operational efficiency, we hired an outside firm to do a full commissioning of the data center before we opened, even though commissioning is not required.

“The commissioning firm, Working Buildings (www.workingbuildings.com), had been engaged through most of the construction. They looked at the blueprints at the early stages of construction and periodically inspected the work of the subcontractors. They took pictures of anything under question and met with the subcontractors to clarify, which either led to a design change, fix or clarification.”

Thompson adds, “Commissioning is not just a precaution, but a necessity. We need to make sure that we have a very secure environment for our data center so that we can deliver our services.”

The commissioning process and testing included stressing all the power generators, UPS units, CRAC units and other facility operations, inside and out. The power tests included applying load banks to the UPS units and testing the circuits. The generators were run for at least four hours under load. To test the CRAC units, heating units were installed under the raised floor to elevate the temperature. The water tests included applying high-pressure water, known as a rain curtain, to the roof and against the building. The building has also been tested to withstand 200-mph winds.
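A commissioning agent typically reduces those tests to explicit acceptance criteria. The Python sketch below is a simplified, hypothetical way of recording and scoring such results; apart from the four-hour generator run mentioned above, the pass criteria are assumptions for illustration.

# Hypothetical pass/fail roll-up for the commissioning tests described above.
from dataclasses import dataclass

@dataclass
class CommissioningResult:
    generator_run_hours: float   # runtime under load from the load-bank test
    ups_load_test_passed: bool   # UPS circuits held up under the applied load banks
    crac_held_setpoint: bool     # CRAC units recovered the under-floor heat test
    rain_curtain_leaks: int      # leaks observed during the high-pressure water test

def commissioning_passes(r: CommissioningResult) -> bool:
    return (r.generator_run_hours >= 4.0 and r.ups_load_test_passed
            and r.crac_held_setpoint and r.rain_curtain_leaks == 0)

print(commissioning_passes(CommissioningResult(4.5, True, True, 0)))  # True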

Network lifeline

Tift Regional Medical Center has a progressive growth plan for its data center and IT services. With the new data center, IT is now moving toward a fully implemented CPOE system and total EHR, which will improve health-care quality and efficiency through information exchange. “With our new network and reliable data center, we expect that our infrastructure will support greater requirements for using health information exchange and implement a certified EHR system,” notes McAllister.

Wade Brewer, CSE, CTS, CSS, LVUC, director of technology services at Tift Regional Medical Center (left), poses with Jerry Cooper, CEO of CooperCraft Communications (right).

The HITECH Act of 2009 offers incentives to physicians (from Medicare and Medicaid) for “meaningful use” of certified EHR technology, starting in 2011. Certified EHR technology gives assurance that an EHR system or module offers the necessary technological capability, functionality and security to help the healthcare provider meet the meaningful-use criteria. Certification also gives patients peace of mind that the electronic health IT systems they use are secure, can maintain data confidentially, and can work with other systems to share information.

“Three years ago if I wanted to implement CPOE, I couldn’t do it without an efficient data center,” notes McAllister. “There are three stages outlined by the Centers for Medicare and Medicaid Services that define meaningful use of EHR. We will complete the requirements of Stage 1 by next year, and the other two stages will follow.”

Another critical accreditation for all healthcare facilities comes from The Joint Commission (formerly JCAHO), which sets operational standards and conducts performance evaluations every three years for reaccreditation. The accreditation is based on criteria outlined for specific healthcare organizations such as ambulatory care, home care, hospitals and others. “While we were being evaluated this year, the Joint Commission representative actually reviewed quality measures of our network to see if we are working toward becoming totally electronic,” notes Thompson.

From the outset, the IT goal was to provide a protective environment for all data for the life of the hospital. “We have a saying that every time we add a server, it immediately gets outdated,” remarks McAllister. “In fact our fire drill in IT is that everyone grabs a server and runs out. It is important that we have the infrastructure to give us the capability to store records. With electronic storage, we can create a database that will allow comparative studies for research that will help disease control not only for our region, but nationally and possibly globally. The importance of this project cannot be overstated.”

Carol Everett Oliver, RCDD, ESS is a marketing analyst with Berk-Tek, a Nexans company (www.berktek.com). Beginning in 2012 she will serve as BICSI’s U.S. Northeast Region Director for the 2012-2014 term.
