Modern data centers look entirely different than they did just a decade ago, and many of them now host cloud-based services. Design strategies for these centers continue to evolve under the load of mobility, big data, cloud computing, and a host of other technology trends.
A data center’s infrastructure has expanded beyond the brick-and-mortar walls where it is housed to include public clouds, which now host data that was once thought too mission-critical to ever leave the building.
Every data center has to be extremely scalable, and its component systems have to get faster every year. The IT pros who run it have to be able to take on any task, and the business will, of course, insist that all of this magic be accomplished on a minimal budget.
Certain features, however, are (or should be) common to every data center.
Physical Security
The first is security: if someone gains physical access to your servers, it’s all over from the get-go. They can walk off with millions of credit card transactions, mission-critical data, cancer research, or whatever else happens to be stored on those hard drives. Somewhere on the Eastern Seaboard, the credit card giant Visa has built its Operations Center East, where millions of transactions are processed and analyzed, sometimes at a rate of more than 10,000 transactions per second. Opened in 2009, the center’s approach road is guarded by hydraulic bollards that can be raised at a moment’s notice if a vehicle approaches at a threatening speed. If those don’t stop it, a hairpin turn will send it into a drainage ditch that doubles as a moat.
Failover Power and Sturdiness
A data center also needs its own power source in case municipal power fails, or worse, a natural disaster strikes. The Visa center is built to withstand hurricane winds of 170 miles an hour or more, as well as earthquakes; in other words, it can shake off a Category 5 hurricane. In the event of a power failure, onsite diesel generators can produce enough electricity for 25,000 households, keeping the center alive for up to nine days.
You should also consider uninterruptible power supplies (basically big batteries) for your servers, to ensure no power surges threaten the system. They also condition the incoming power for a smooth, clean flow.
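A UPS only has to bridge the gap until the generators spin up, so sizing it is simple arithmetic. Here is a back-of-the-envelope runtime estimate; the capacities, loads, and the ~90% inverter efficiency are illustrative assumptions, not vendor specifications:

```python
def ups_runtime_minutes(capacity_wh: float, load_w: float,
                        efficiency: float = 0.9) -> float:
    """Estimate minutes of battery runtime for a steady IT load.

    capacity_wh -- usable battery capacity in watt-hours
    load_w      -- steady-state load in watts
    efficiency  -- assumed inverter efficiency (~90%)
    """
    if load_w <= 0:
        raise ValueError("load must be positive")
    return capacity_wh * efficiency / load_w * 60

# Example: a hypothetical 5 kWh UPS feeding a 10 kW rack gives about
# 27 minutes of runtime -- far more than the seconds a diesel generator
# typically needs to come online.
print(round(ups_runtime_minutes(5_000, 10_000), 1))
```

The point of the margin is that generators occasionally fail to start on the first attempt; a few minutes of battery beats a few seconds.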
Of course, you’re going to have to keep all of that equipment connected; otherwise, there’s little point in having a data center. In Visa’s center, all of the mainframes, switches, and storage arrays are linked by roughly 3,000 miles of cabling. That’s a lot of planning, and a lot of Cat 6.
The computing machinery in your data center generates a lot of heat, so you’ll have to cool it down before it all melts into a useless heap of slag. One way is to build the data center next to a river and use its water for cooling, avoiding the typical electric refrigeration units or chillers. Another is evaporative cooling, which is what Facebook did with its data center in Prineville, Ore., east of the Cascade Mountains: warm ambient air passes over water seeping through a membrane or through a sprayed mist, the water evaporates easily in the dry air, and the evaporation carries the heat away.
Lastly, when building a data center, you have to be concerned with its capacity to handle data. Here again the Visa model is instructive. As noted above, the system routinely handles around 10,000 transactions a second, but it has an upper limit of roughly 24,000 transactions a second. Push past that limit and the network doesn’t just drop a single message; it brings the entire system to a grinding halt, and no messages get processed at all.
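When a system has a hard throughput ceiling like that, it is usually safer to shed excess load at the edge than to let the whole pipeline stall. A token bucket is one common way to enforce an admission rate; this sketch and its numbers are purely illustrative, not a description of Visa’s actual design:

```python
import time

class TokenBucket:
    """Admit requests at a sustained rate, allowing short bursts."""

    def __init__(self, rate_per_s: float, burst: float):
        self.rate = rate_per_s       # sustained admission rate
        self.capacity = burst        # maximum burst size
        self.tokens = burst          # bucket starts full
        self.last = time.monotonic()

    def try_admit(self) -> bool:
        """Admit one transaction if capacity allows; otherwise shed it."""
        now = time.monotonic()
        # Refill tokens in proportion to elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False  # shed this transaction; the system stays up

# Cap admissions well below the ceiling, e.g. a hypothetical 20,000 TPS
# limit with a 2,000-transaction burst allowance.
limiter = TokenBucket(rate_per_s=20_000, burst=2_000)
admitted = sum(limiter.try_admit() for _ in range(5_000))
print(admitted)  # the initial burst plus whatever refills during the loop
```

Rejecting a few transactions gracefully is a far better failure mode than the grinding halt described above.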
That’s why Visa is working on expanding its processing facilities, why each major system has a backup, and why the facility is built to withstand a Midwestern tornado, a Florida hurricane, or an earthquake.
What’s the only thing that can take the center down? Us: our own rampant transaction growth.