State-of-the-art Toronto data centre includes redundant critical cooling infrastructure
March 1, 2010
PEER 1 Hosting Inc. spent more than two years designing and building its US$40 million state-of-the-art facility in Toronto, which opened for business earlier this year and offers co-location, dedicated and managed hosting to small- and mid-sized enterprises.
Ryan Murphey, vice president of facilities and data centre operations for the Vancouver-based firm, explains that the goal all along was to get it right the first time. It marks the company’s 17th and largest data centre to date and, according to PEER 1, is more than twice the size of a professional hockey rink.
“This represents our design of the future,” says Murphey. “We went through an extensive search on every piece of infrastructure that was going in.”
Vendors include Eaton Corp., which supplied the majority of the low-voltage and medium-voltage switchgear; Liebert Corp. for power distribution units (PDUs) and UPS equipment; and Leviton Manufacturing Co. Inc., which is the primary structured cabling supplier.
The 12,190 square metre facility was built in four sections with each containing what PEER 1 calls a Performance Optimized Data Centre or POD.
Each of the four pods has the capacity for upwards of 270 cabinets, which the company says is equivalent to an estimated 7,500 servers.
The first POD opened in late January and includes 2,285 square metres of space, plus an additional 2,438 square metres of office, inventory, network and storage areas to support the facility.
In terms of the cabling infrastructure, there is currently 25.8 kilometres of fibre and 13.5 kilometres of Category 6A copper installed. Those numbers will rise to 92.7 kilometres and 43.8 kilometres, respectively, when all four PODs are completed and in use.
“We started with a shell building,” says Robert Miggins, senior vice president of business development with PEER 1. “This is the first time we have done a build of this scale and that created opportunities for us to embrace more of the green energy initiatives that we typically would not see in past projects.”
This includes a redundant cooling infrastructure that uses both a local well for primary water supply and a connection to the city’s water system.
“We are also using redundant high-efficiency Variable Frequency Drive centrifugal chillers to reduce cooling costs, and condenser towers with economizers,” says Murphey. “This enables PEER 1 to lessen the environmental impact of the data centre by taking advantage of free cooling when the temperature drops below 10°C.”
He adds that when it comes to data centre builds, “it’s becoming more and more common that a customer will question you on the green aspects of your operation.”