Data Centers

 

A data center is a facility used to house computer systems and associated components, such as telecommunications and storage systems. It generally includes redundant or backup power supplies, redundant data communications connections, environmental controls (air conditioning, fire suppression, etc.), and special security devices.

 


History

Data centers have their roots in the huge computer rooms of the early days of the computing industry. Early computer systems were complex to operate and maintain, and needed a special environment to keep working. Many cables were necessary to connect all the parts. Old computers also required a great deal of power and had to be cooled to avoid overheating. Security was important as well; computers were expensive and were often used for military purposes. For these reasons, engineering practices were developed from the start of the computing industry. Basic design guidelines for controlling access to the computer room were devised. Elements such as standard racks to mount equipment, elevated floors, and cable trays (installed overhead or under the elevated floor) were introduced in this early era and have changed relatively little compared to the computer systems themselves.

 

During the boom of the microcomputer industry, and especially during the 1980s, computers started to be deployed everywhere, in many cases with little or no care about operating requirements. However, as IT operations grew in complexity, companies became aware of the need to control IT resources. With the advent of client-server computing in the 1990s, microcomputers, now labeled as servers, started to find their place in the old computer rooms. The availability of inexpensive networking equipment, coupled with new standards for network cabling, made it possible to use a hierarchical design that put the servers in a specific room inside the company. The use of the term 'data center', as applied to specially designed computer rooms, started to gain popular recognition around this time.

 

The boom of data centers came during the dot-com boom. Companies needed fast Internet connectivity and non-stop operation to deploy systems and establish a presence on the Internet. Installing such equipment was not viable for many smaller companies. Many companies started building very large facilities, called "Internet Data Centers", or IDCs for short, which provide businesses with a range of solutions for systems deployment and operation. New technologies and practices were designed to handle the scale and operational requirements of such large-scale operations. These practices eventually migrated toward private data centers and were adopted largely because of their practical results.

 

Today, data center design, construction, and operation is a well-known discipline. Standard documents from accredited professional groups, such as the TIA, specify the requirements for data center design. There are well-known operational metrics for data center availability, which can be used to evaluate the business impact of a disruption. There is still a lot of development being done in operational practice, and also in environmentally friendly data center design.

 

Requirements for modern data centers

IT operations are a crucial aspect of most organizations. One of the main concerns is business continuity; companies rely on their information systems to run their operations. If a system becomes unavailable, company operations may be impaired or stopped completely. It is necessary to provide a reliable infrastructure for IT operations in order to minimize any chance of disruption. Information security is also a concern, and for this reason a data center has to offer a secure environment that minimizes the chance of a security breach. A data center must therefore maintain high standards for assuring the integrity and functionality of its hosted computer environment.

 

Data center classification

The TIA-942 Data Center Standards Overview describes the requirements for data center infrastructure and defines four tiers. The simplest is a Tier 1 data center, which is basically a computer room following basic guidelines for the installation of computer systems. The most stringent level is a Tier 4 data center, which is designed to host mission-critical computer systems, with fully redundant subsystems and compartmentalized security zones controlled by biometric access control methods.
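Each tier is commonly associated with an expected availability figure, and a simple way to relate availability to business impact is to convert it into allowed downtime per year. The Python sketch below illustrates that conversion; the percentages used are the figures usually quoted for the four tiers, not values taken from this text.

    # Illustrative availability figures commonly quoted for the four tiers
    # (approximate values, assumed for this example rather than defined above).
    TIER_AVAILABILITY = {
        "Tier 1": 99.671,
        "Tier 2": 99.741,
        "Tier 3": 99.982,
        "Tier 4": 99.995,
    }

    HOURS_PER_YEAR = 365 * 24  # 8760 hours, ignoring leap years

    def annual_downtime_hours(availability_percent: float) -> float:
        """Convert an availability percentage into allowed downtime per year."""
        return HOURS_PER_YEAR * (1 - availability_percent / 100)

    for tier, availability in TIER_AVAILABILITY.items():
        print(f"{tier}: {availability}% -> {annual_downtime_hours(availability):.1f} hours of downtime per year")

Run as-is, this prints roughly 28.8 hours for Tier 1 down to about 0.4 hours for Tier 4, which is why the higher tiers are reserved for mission-critical systems.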

 

Physical layout

A data center can occupy one room of a building, one or more floors, or an entire building. Most of the equipment is often in the form of servers mounted in 19-inch rack cabinets, which are usually placed in single rows, forming corridors between them that allow people access to the front and rear of each cabinet. Servers differ greatly in size, from 1U servers to huge storage silos that occupy many tiles on the floor. Some equipment, such as mainframe computers and storage devices, is often as big as the racks themselves and is placed alongside them.

 

1U represents one rack unit of space. A rack unit is 1.75 inches (44.45 mm) in height. The sizes are believed to have been derived from telecommunication equipment used in the Second World War.
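As a quick illustration of the arithmetic, the sketch below converts rack units into physical height; the 42U cabinet is just a commonly used size chosen as an example, not something specified in this article.

    # One rack unit (1U) is 1.75 inches, i.e. 44.45 mm.
    RACK_UNIT_IN = 1.75
    MM_PER_INCH = 25.4

    def rack_height_mm(units: int) -> float:
        """Height of equipment occupying the given number of rack units, in millimetres."""
        return units * RACK_UNIT_IN * MM_PER_INCH

    print(rack_height_mm(1))   # 44.45 mm for a 1U server
    print(rack_height_mm(42))  # about 1867 mm of mounting space in a typical 42U cabinet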

 

The physical environment of the data center is usually under strict control:

  • Air conditioning is used to keep the room cool; it may also be used for humidity control. Generally, the temperature is kept around 20-22 degrees Celsius (about 68-72 degrees Fahrenheit). The primary goal of data center air conditioning systems is to keep the server components at the board level within the manufacturer's specified temperature/humidity range. This is crucial, since electronic equipment in a confined space generates much excess heat and tends to malfunction if not adequately cooled. Air conditioning systems also help keep humidity within acceptable parameters, usually between 35% and 65% relative humidity. With too much humidity, water may begin to condense on internal components; with too little, static electricity may damage components.

ASHRAE recommends a temperature range of 20-25 degrees Celsius and a humidity range of 40-60% relative humidity as optimal data center conditions (see the sketch after this list).

  • Backup power is provided by one or more uninterruptible power supplies (UPS) and/or diesel generators.
  • To prevent single points of failure, all elements of the electrical systems, including backup systems, are typically fully duplicated, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve N+1 redundancy in the systems. Static switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.
  • Data centers typically have raised flooring made up of 60 cm (2 ft) removable square tiles. These provide a plenum for air to circulate below the floor, as part of the air conditioning system, as well as providing space for power cabling. Data cabling is typically routed through overhead cable trays in modern data centers. Smaller/less expensive data centers without raised flooring may use anti-static tiles for a flooring surface.
  • Data centers often have elaborate fire prevention and fire extinguishing systems. Modern data centers tend to have two kinds of fire alarm systems: a first system designed to spot the slightest sign of particles given off by hot components, so that a potential fire can be investigated and extinguished locally before it takes hold (sometimes just by turning smoldering equipment off), and a second system designed to take full-scale action if the fire does take hold. Fire prevention and detection systems are also typically zoned, and high-quality fire doors and other physical fire breaks are used, so that even if a fire does break out it can be contained and extinguished within a small part of the facility.
  • Using conventional water sprinkler systems on operational electrical equipment can do just as much damage as a fire. Originally, Halon gas, a halogenated organic compound that chemically stops combustion, was used to extinguish flames. However, the use of Halon has been banned by the Montreal Protocol because of the danger Halon poses to the ozone layer. Unlike fire extinguishing agents that displace oxygen, Halon did not pose a great risk to people caught in the data center when it was discharged. More environmentally friendly alternatives include Argonite and FM-200, and even systems based on mists of tiny particles of ultra-pure water. There are also systems available that control the gas mixture of the air so as to lower the oxygen content below the level at which combustion can take place, but still high enough to support human life (similar to very high altitudes).
  • Physical security also plays a large role in data centers. Physical access to the site is usually restricted to selected personnel. Video camera surveillance and permanent security guards are almost always present if the data center is large or contains sensitive information on any of the systems.
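As a minimal illustration of how such environmental limits might be monitored, the sketch below checks temperature and humidity readings against the ASHRAE ranges quoted above; the sensor readings and the function name are hypothetical, not part of any particular monitoring product.

    # ASHRAE-recommended ranges quoted in the list above.
    TEMP_RANGE_C = (20.0, 25.0)        # degrees Celsius
    HUMIDITY_RANGE_PCT = (40.0, 60.0)  # percent relative humidity

    def check_environment(temp_c: float, humidity_pct: float) -> list[str]:
        """Return a list of warnings for readings outside the recommended ranges."""
        warnings = []
        if not TEMP_RANGE_C[0] <= temp_c <= TEMP_RANGE_C[1]:
            warnings.append(f"temperature {temp_c:.1f} C outside {TEMP_RANGE_C}")
        if not HUMIDITY_RANGE_PCT[0] <= humidity_pct <= HUMIDITY_RANGE_PCT[1]:
            warnings.append(f"humidity {humidity_pct:.0f}% outside {HUMIDITY_RANGE_PCT}")
        return warnings

    # Hypothetical readings from two sensors: one in range, one too hot and too dry.
    for reading in [(22.5, 48.0), (27.1, 33.0)]:
        print(reading, check_environment(*reading) or "OK")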

 

Network infrastructure

Communications in data centers today are most often based on networks running the IP protocol suite. Data centers contain a set of routers and switches that transport traffic between the servers and to the outside world. Redundancy of the Internet connection is often provided by using two or more upstream service providers.
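One way to see the value of multiple upstream providers is to compare the combined availability of independent links with that of a single link. The sketch below uses illustrative per-provider availability figures, which are assumptions for the example rather than values from this text, and it assumes the links fail independently.

    # Illustrative per-uplink availability figures (assumed, not from the article).
    uplink_availability = [0.999, 0.995]

    def combined_availability(availabilities: list[float]) -> float:
        """Probability that at least one independently failing uplink is working."""
        p_all_down = 1.0
        for a in availabilities:
            p_all_down *= (1.0 - a)
        return 1.0 - p_all_down

    print(combined_availability(uplink_availability[:1]))  # single provider: 0.999
    print(combined_availability(uplink_availability))      # two providers: 0.999995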

 

Some of the servers at the data center are used for running the basic Internet and intranet services needed by internal users in the organization: email servers, proxy servers, DNS servers, etc.

 

Network security elements are also usually deployed: firewalls, VPN gateways, intrusion detection systems, etc. Also common are monitoring systems for the network and some of the applications. Additional off-site monitoring systems are also typical, in case communications inside the data center fail.

 

Applications

The main purpose of a data center is to run the applications that handle the core business and operational data of the organization. Such systems may be proprietary and developed internally by the organization, or bought from enterprise software vendors. Common examples of such applications are ERP and CRM systems.

 

Often these applications will be composed of multiple hosts, each running a single component. Common components of such applications are databases, file servers, application servers, middleware and various others.

 

Data centers are also used for off-site backups. Companies may subscribe to backup services provided by a data center. This is often used in conjunction with backup tapes. Backups can be taken of servers locally onto tapes; however, tapes stored on site pose a security threat and are also susceptible to fire and flooding. Larger companies may also send their backups off site for added security, which can be done by backing up to a data center. Encrypted backups can be sent over the Internet to a data center, where they can be stored securely.
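As a minimal sketch of the "encrypt before sending off site" idea, the example below encrypts a backup archive with a symmetric key using the Python cryptography package. The file names are hypothetical, and how the encrypted file and the key are actually transferred and stored is left out of the example.

    from cryptography.fernet import Fernet  # assumes the 'cryptography' package is installed

    # Generate a symmetric key; in practice this key must be kept safe,
    # separately from the backup itself, or restores become impossible.
    key = Fernet.generate_key()
    cipher = Fernet(key)

    # Hypothetical backup archive produced by some local backup job.
    with open("backup.tar", "rb") as f:
        plaintext = f.read()

    ciphertext = cipher.encrypt(plaintext)

    # The encrypted file is what would be sent over the Internet to the data center.
    with open("backup.tar.enc", "wb") as f:
        f.write(ciphertext)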

 

Energy consumption in data centers

Due to rapid advances in computing, particularly blade servers and high-speed switching, power consumption in data centers began to rise rapidly around 2005. This caused the U.S. Congress to issue Public Law 109-431, requesting the EPA to assess the situation regarding energy use in U.S. data centers by the end of July 2007. On 2 August 2007, the EPA published its findings in the "EPA Report to Congress on Server and Data Center Energy Efficiency" [1], showing that servers and data centers accounted for 61 billion kWh in 2006 (about 1.5% of total U.S. electricity usage), and that this figure is expected to rise to more than 100 billion kWh by 2011 if present trends continue. Overall, in the U.S., computer use has been estimated to consume 9.4% of total electricity, with the bulk of consumption coming from PCs and monitors, and substantial additional portions from data centers and networking equipment.
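To put figures like 61 billion kWh per year in perspective, the short sketch below converts annual energy consumption into an average continuous power draw, using only the numbers quoted above.

    # Figures quoted from the EPA report summary above.
    annual_energy_kwh_2006 = 61e9       # 61 billion kWh consumed in 2006
    projected_energy_kwh_2011 = 100e9   # projected figure for 2011
    hours_per_year = 365 * 24           # 8760 hours

    def average_power_gw(annual_kwh: float) -> float:
        """Average continuous power draw, in gigawatts, for a given annual energy use."""
        return annual_kwh / hours_per_year / 1e6  # kW -> GW

    print(f"2006: about {average_power_gw(annual_energy_kwh_2006):.1f} GW on average")
    print(f"2011 (projected): about {average_power_gw(projected_energy_kwh_2011):.1f} GW on average")

This works out to roughly 7 GW of continuous demand in 2006, rising to more than 11 GW if the projected 2011 figure is reached.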