When picking a hosting provider, one of the checklist items is its "home" data center and that facility's Tier level, as this tells you a lot about the expected availability of your servers and virtual machines.
When it comes to data center certification, just ask the Uptime Institute. The Uptime Institute's Tier Classification for Data Centers has already "celebrated" its twentieth anniversary. Since its inception in the mid-1990s, the Tiers have evolved from common industry terminology into a global standard for validating critical data center infrastructure.
During this time, the industry has changed, and the Tier system has evolved with it, remaining as relevant and important as when the Uptime Institute first developed and published its classification. At the same time, experts at the Uptime Institute note that the industry's understanding of the Tier system has been clouded by many myths and misconceptions that have accumulated over the years.
The Uptime Institute has long been aware that not everyone fully understands the ideas described in the Tier standards, and some disagree with certain definitions. Both situations lead to classic misunderstandings in which people choose what they want to believe instead of seeking accurate information.
In other cases, marketers build a kind of legend on top of the Tier system. Inventions such as "Tier III+", used by advertisers in conversations with potential customers, have no basis in the Uptime Institute classification, but can mislead procurement specialists, tender committees, real estate managers, CFOs, and even IT staff who lack the appropriate technical background.
Other myths spread because some industry professionals rely on outdated publications and explanatory materials that no longer reflect the current standards. There may be other sources of error as well, but it is important to understand that the Uptime Institute is the only authoritative source of information about its classification system.
Uptime Institute specialists conduct numerous classes throughout the year, publish many articles, and respond to inquiries to keep industry representatives abreast of the current Tier classification.
Essentially, the Uptime Institute created the Tier classification system to systematically evaluate facilities and data center equipment in terms of potential infrastructure performance, or uptime. The system consists of four Tiers; each Tier incorporates the requirements of all the Tiers below it.
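The cumulative nature of the Tiers can be sketched as a small data structure. The one-phrase requirement labels below are shorthand summaries for illustration, not the official certification criteria:

```python
# Sketch of the cumulative Tier structure: each Tier inherits the
# requirements of the Tiers below it. Labels are illustrative shorthand,
# not the Uptime Institute's official criteria.
TIER_REQUIREMENTS = {
    1: {"dedicated site infrastructure"},
    2: {"redundant capacity components"},
    3: {"concurrent maintainability"},
    4: {"fault tolerance"},
}

def requirements(tier: int) -> set:
    """Combined requirements for a given Tier (1..4)."""
    combined = set()
    for level in range(1, tier + 1):
        combined |= TIER_REQUIREMENTS[level]
    return combined

# A Tier III facility must also satisfy the Tier I and Tier II requirements:
print(sorted(requirements(3)))
```

So asking for "Tier III" implicitly asks for everything Tier I and Tier II demand, plus concurrent maintainability on top.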
The Uptime Institute is the only organization authorized to certify data centers against the Tier classification system. The Uptime Institute does not design, build, or operate data centers; its role is to assess a data center's infrastructure, operations, and management.
Guided by this experience, we have collected and reviewed the most common myths and misconceptions.
Incorrect. The Tier system benchmarks data centers against the needs of the business, based on business performance requirements. The level of risk acceptable to an organization determines the Tier it requires; in other words, the appropriate Tier depends on a company's business model. Companies that cannot articulate their performance and reliability needs before selecting a Tier misuse the classification and sidestep the internal dialogue that needs to take place.
Incorrect. The acceptable risk level for an organization determines the appropriate Tier. Tier IV is not the best option for every organization, nor is Tier II. Data center owners must perform comprehensive due diligence on their objectives before setting a target Tier. If business goals are not defined and the wrong Tier is chosen, the result can be unnecessary overinvestment.
Tier I and Tier II are tactical choices, often driven more by initial cost and time to market than by lifecycle cost and uptime requirements. Organizations choosing Tier I or II typically do not derive their primary revenue from selling products or services online, and are usually contractually protected from damage caused by IT system breakdowns or outages.
Strict uptime requirements and long-term resiliency are generally strategic choices that call for Tier III or Tier IV infrastructure. In a Tier III data center, each infrastructure component required to support IT operations can be shut down on a planned basis without affecting the critical environment. Tier IV facilities are more resilient still: all components and distribution paths are fully duplicated and can withstand a failure, an error, or another unplanned event without harming the critical environment.
Tier IV is not inherently better than Tier II. The performance and capabilities of the data center infrastructure must match the business objectives; otherwise, a company is either overinvesting or taking on too much risk.
For example, before building a Tier II data center, which by definition does not provide concurrent maintainability across all critical subsystems, the owner must consider whether the business can tolerate a planned or maintenance-related shutdown, and plan how the operations team will coordinate it.
It is business goals that should drive the decision to build a data center at a particular Tier level.
Incorrect. Certification of the design documents is indeed the first step. Uptime Institute consultants verify 100% of the design documentation, ensuring that all electrical and mechanical systems, monitoring systems, and numerous automation subsystems are consistent with the fundamental concepts and that there are no weak links in the chain. Design certification is an important milestone, allowing data center owners to start construction knowing that the proposed design can meet the target Tier level.
The design documents receive their own Tier rating, which serves as a preliminary check before Tier Certification of Constructed Facility. If the Uptime Institute does not inspect the constructed facility, it cannot say whether everything was implemented according to the design. To underscore this point, the Uptime Institute puts an expiration date on Tier Certification of Design Documents: certifications awarded after January 1, 2014 expire two years after the award date.
During certification of the constructed facility, a team of Uptime Institute consultants travels to the site to identify discrepancies between the design drawings and the installed equipment. The consultants observe tests and demonstrations to verify compliance with the target level of fault tolerance. An essential value of certification is its ability to uncover these blind spots and weak links. Uptime Institute consultants report that almost every time they visit a data center, they find that changes have been made to a design that passed Tier Certification of Design Documents, and that one or more systems or subsystems do not work as required for the selected Tier.
More recently, the Uptime Institute introduced Tier Certification of Operational Sustainability to evaluate data center operators and their ability to manage a critical facility. Even data centers designed and built to the most fault-tolerant Tier can suffer outages without well-designed, comprehensive emergency procedures. Certification at all three levels is a way for data center owners to gain confidence that they are maximizing the potential and reliability of their facilities.
Incorrect. The Uptime Institute removed the "expected downtime per year" figures from the Tier Standard in 2009; they were never part of the Tier definitions. A Tier is determined by specific performance outcomes demonstrating that a facility has achieved specific goals, such as redundant components, concurrent maintainability (broadly, the ability to remove any component or distribution path on a planned basis without affecting IT operations), or fault tolerance (broadly, the ability to continue operating through any unplanned failure in the data center infrastructure). However, even a fault-tolerant Tier IV data center can experience downtime if it is not supported by trained personnel and comprehensive emergency procedures.
There are statistical tools for predicting failure rates and recovery times. Availability itself is simply an arithmetic calculation: the fraction of time the data center has been operational since launch, or over a specific period. The number, frequency, and duration of outages change the availability figure. But statistical tools should be used with care: human actions are often not captured in the model, and a statistical forecast of a hundred-year storm, for example, may omit the possibility that several hundred-year storms occur in the same year.
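The arithmetic described above can be made concrete. A minimal sketch, where the outage durations are hypothetical numbers chosen purely for illustration:

```python
# Availability as an arithmetic calculation: operational time divided by
# total time in the period, expressed as a percentage.
HOURS_PER_YEAR = 365 * 24  # 8760

def availability(outage_hours, period_hours=HOURS_PER_YEAR):
    """Percentage of the period the facility was operational.

    outage_hours: durations of individual outages, in hours (hypothetical).
    """
    downtime = sum(outage_hours)
    return 100.0 * (period_hours - downtime) / period_hours

# One 4-hour outage plus one 30-minute outage in a year:
print(round(availability([4.0, 0.5]), 4))  # 99.9486
```

Note how a single extra outage shifts the figure: the calculation is trivial, but, as the paragraph above warns, predicting the *inputs* (how many outages, how long) is where statistical models can mislead.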
Incorrect. According to the standard, topologically the only truly reliable source of power for a data center is the engine-generator set, because utility power can be interrupted even in regions with reliable grids. Consequently, the number of utility feeds, substations, or power systems supplying electricity to the facility neither determines nor affects the Tier rating. In fact, utility power is not even required for certification at any Tier level. Most certified data centers do use utility power for normal operation as the economical option, but that choice has no bearing on achieving the target Tier during certification.