the newsletter of tbd consultants - 3rd qtr 2013


In this Edition

Data Centers
Keeping on Top of Data
Market Data

Construction Management Specialists

111 Pine Street, Suite 1315
San Francisco, CA 94111
(415) 981-9430 (San Francisco office)
 
9705 Cymbal Drive
Vienna, VA 22182
(703) 268-0852 (Washington, DC office)
 
4361 35th Street
San Diego, CA 92104
(619) 550-1187 (San Diego office)
 
8538 173rd Avenue NE
Redmond, WA 98052
(206) 571-0128 (Seattle office)

 
www.TBDconsultants.com

 

Data Centers

Back in the really early days of computing, whole rooms or buildings were filled with electrical equipment to carry out the calculations, accompanied by massive cooling systems to extract the heat generated. Then, by the 1970s and '80s, microprocessors had made it possible for computers to sit on your desktop, and the days of those rooms full of electronic equipment seemed numbered. The military, research institutions, and even some large companies might still have needed the massive calculating power of a supercomputer or the combined power of banks of smaller computers, but the smaller PC could handle most needs.

But then the Internet grew, and banks of servers were needed to dish up the Web pages in increasing volume, and those pages started to turn into Web applications, requiring more processing power behind the scenes. Streaming video started to fill the bandwidth, and business of all kinds started to go online, from selling books and anything else you could think of, to online meetings and banking. Now the Internet has become the primary communications medium and also the place where people are storing and backing up all their photos, music and data in general. All that online activity has made the need for those rooms and buildings packed with electronic equipment more essential than ever, and data centers are being built in increasing numbers as our lives are going online.

Data centers are categorized into four tiers, ranging from Tier 1, which can be a single server room, through Tier 4, which would normally be one or more dedicated buildings. A Tier 1 data center can be expected to provide around 99.671% availability (in other words, to be offline no more than 0.329% of the time), while a Tier 4 provides around 99.995% availability.
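To see what those percentages mean in practice, a quick back-of-the-envelope calculation (a simple sketch using only the Tier 1 and Tier 4 figures quoted above) converts availability into maximum downtime per year:

```python
# Convert an availability percentage into maximum downtime per year.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours in a (non-leap) year

def annual_downtime_hours(availability_pct):
    """Maximum hours offline per year at a given availability percentage."""
    return HOURS_PER_YEAR * (100.0 - availability_pct) / 100.0

for tier, pct in [("Tier 1", 99.671), ("Tier 4", 99.995)]:
    print(f"{tier}: {pct}% availability -> "
          f"up to {annual_downtime_hours(pct):.1f} hours offline per year")
```

So a Tier 1 facility could be offline for roughly 28.8 hours a year, while a Tier 4 facility is limited to well under an hour.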

The simple way for a company or university to add a data center is to have one delivered ready-made. These can arrive looking like a large cargo container, but one that is packed to capacity with computing power; all that's needed is to provide a foundation for it and connect it up to a power source and the data network. That is not the most aesthetic solution, nor the most secure, and it will not suffice for all situations, but the idea of standardized infrastructure normally carries through to other data centers as well.

Security is normally a big consideration at data centers. Apart from containing a lot of valuable equipment, they also store a lot of valuable information, including company secrets, bank account details, and other personal information. In addition, it damages the reputation of the company operating a data center if service goes down for any reason, or if confidential information gets stolen. Physical security, in the form of fences, security guards, fire control, and access control on the building, is one style of security. Similar measures, but in electronic form, are of equal or probably even greater importance in guarding against hackers (or, to be more pedantically correct, 'crackers').

The security of the building itself can be at risk from natural disasters, so the location of data centers can be very important in minimizing these kinds of risks, as well as the building design. To ensure that service can continue uninterrupted, siting the data center in a location with a reliable power supply is also of primary concern. Alternative power sources, and emergency generators, would normally be expected to be part of the design as well. Redundant data connections are also normally provided to try to ensure continuous connectivity, along with redundant HVAC capacity to ensure a correct environment can be maintained despite breakdowns.

Having anyone around the building can be a security concern, but some data centers are automated to the point that they become ‘dark data centers’. These operate with no staff in the building, except for maintenance or in emergency situations. Management of the server and other equipment is handled remotely.

Some data centers are part of an office building; others are dedicated structures rather like giant warehouses, stacked with electronic equipment instead of goods for shipping; still others are housed in converted buildings.

Ideally they should be geographically reasonably close to the end users they are serving, to minimize any delay between requests being sent from a user’s computer and the requested information being returned. People are not very patient when it comes to waiting for a Web page to be returned, and in the case of a business application that is housed in the proverbial cloud, such delays are not expected or generally tolerated. Consequently the big service providers have built data centers around the world, and are continuing to add to them.

A building that runs almost entirely on electricity might sound like an ideal green building, but some of these data centers can use as much energy as a not-too-small town, and generating that much power normally adds substantial pollutants to the environment. In addition, the banks of servers that fill a data center generate large quantities of heat that must be removed. Building 'green' data centers has become a high priority, and there are LEED Gold data centers in operation. Since these buildings generate far more heat than they need, siting them where the climate allows natural cooling is an advantage (Sweden and Alaska, for example, have been selected as locations for a number of large data centers). Temperature is not the only issue: humidity must also be controlled, because low humidity encourages static electricity, which can be a serious problem for electronic equipment.

Another location issue is finding a site that can supply reliable and inexpensive power from renewable sources. Energy reductions are also being achieved by using virtualization techniques to make more effective use of the physical servers. It is also important to have a site large enough for future expansion, and, because of the frequent need to expedite construction on very tight schedules, being located in a region where planning restrictions are more relaxed and less subject to delay is advantageous.

With the continuing move to mobile devices for computing, and the related requirement for accessing your data from any location, the need for new and upgraded data centers will continue to grow.

     
 

Keeping on Top of Data

Data doesn't only reside in data centers; it also fills our ever-growing hard drives and other storage media. In this article we look at some of the tools for managing your own data.

    
 

Market Data

The stock market has been going through wild swings as it reacts (or over-reacts) to the latest snippet of information. Here we take a look at the economic data from around the world, and try to make sense of what it all adds up to.

    

 

Design consultant: Katie Levine of Vallance, Inc.