DATA CENTER
PERFORMANCE INDEX
A new proposed data center rating system to help owners, operators, providers and site
selection professionals measure actual data center performance.
Distribution: Public
Released: April 20, 2017
Version: 1.4.2
©2017 Thought Authors & Infrastructure Masons, Inc. All rights reserved. No part of this publication may be used, reproduced, photocopied, transmitted,
or stored in any retrieval system of any nature without proper credits given to the copyright owner.
ABOUT IM THOUGHTS
IM Thoughts are papers written by members to share problems, concepts and ideas. Our End
User and Partner Members have unique perspectives and insight built from their extensive
experience. The intent of these papers is to stimulate discussion and debate to help advance the
industry.
Content is owned by the author(s), who assume all liability for opinions expressed within the
papers. http://imasons.org/pubs/thoughts
CREDITS
AUTHOR:
Dean Nelson, iMasons Founder & Chairman, 3rd Degree Master
CONTRIBUTORS / REVIEWERS:
Eddie Schutter, iMasons End User Advisory Council, 3rd Degree Master
Peter Gross, iMasons Partner Advisory Council, 2nd Degree Master
Rob Roy, iMasons End User Advisory Council, 3rd Degree Master
Mark Monroe, Executive Director, iMasons, 1st Degree Master
Jan Wiersma, 1st Degree Master
Richard Donaldson, 2nd Degree Fellow
Sarah Keller, 1st Degree Fellow
Bob Culver, 2nd Degree Journey
Winston A. Saunders, iMasons Education Workgroup Chair, 2nd Degree Journey
Jim Weinheimer, End User Member
IMASONS LOCAL EDITION CONTRIBUTORS
The release of the DCPI Thoughts paper went through three members-only working sessions, in
California, Colorado, and New York. The following members provided valuable feedback.
CALIFORNIA LOCAL EDITION CONTRIBUTORS
Kelly Aaron, Kevin Ansell, James Alvers, Gabriella Ascione, Rajendran Avadaiappan, Mike Bangoli,
Charles Bennington, Smarak Bhuyan, Melanie Bird, Andreas Bovopoulos, Jack Bray, Jessica Brooks,
Chris Brown, Maricel Cerruti, Michael Coleman, Dennis Cronin, Pei Chang, Cindy Choboian, James Coe,
Darin Daskarolis, Ankush Dham, Richard Donaldson, Bill Dougherty, Erik Eeg, Brandon Ewing, Dan
Ephraim, Will Eggleston, Hazem Elmeleegy, Tom Elowson, Azlan Ezaddin, Cary Frame, Louis Frost,
Tony Greenberg, Peter Gross, Anoop Grover, Peter Harrison, Jeffrey Hodge, Kris Holla, Steve Holland,
Korey Hyde, Sridhar Iyer, Sherrie Jackson, Kimble Jarrold, Oliver Jones, Sarah Keller, Mukesh Khattar,
Dheeraj Khanna, Vu Le, Stan Levenstone, Diana Li, Nelson Luhrsen, Tommy Nalley, Calvin Nicholson,
Charles McBride, Mark Monroe, Bruce Myatt, Dean Nelson, Conleth O’Flynn, Henrique Oliveira, Walt
Otis, Joe Parrino, Kirill Pertsev, Nick Peterson, Phil Reese, Tarun Raisoni, Shankar Ramamurthy, Sean
Roberts, Sam Rudek, Igor Runets, Dave Runyon, Jamie Saguindel, Anthony Salinas, Winston
Saunders, Alex St. John, Greg Stover, Eric Stromberg, Andrew Thompson, Jathin Ullal, Alec Valencia,
Jon Vanhoose, Spencer Viernes, Herb Villa, Jan Wiersma, Patrick Yantz
COLORADO LOCAL EDITION CONTRIBUTORS
Rachel Barrett, Cindy Choboian, Peter Citarella, Brandon Day, Steve Gaede, Jeremy Gigliotti, Doug
Hodges, Doug Kindig, Ernie Krauth, Rob McClary, Matthew Mescall, Dean Nelson, Kristen Sanderson,
Shawn Tugwell, Eric Woodell
NEW YORK LOCAL EDITION CONTRIBUTORS
Alfonso Aranda Arias, Raj Avadaiappan, Joshua Bonaventura-Sparagna, Dany Bouchedid, Ben
Cammarata, Jakob Carnemark, Dennis Cronin, Terence Deneny, Marc Donner, Joe
Dornetto, Joshua Feldman, Daniel Gaffney, Svein Atle Hagaseth, Robert Ioanna, Gordon Kellerman,
Gerry Lagro, Michael Lahoud, Steve Lerner, Bill McHenry, Mark Monroe, Tommy Nalley, Dean Nelson,
Julius Neudorfer, Duncan Ng, Shlomo Novotny, John O’Connor, Alex Para, Christian Pastrana, Peter
Sacco, Anthony Salinas, David Schirmacher, Zachary Smith, Andrew Stevens, Christopher Trapp, Todd
Traver, Ron Vokoun, Jim Weinheimer, Daniel Youssef, Edward Zemaitis
ABOUT INFRASTRUCTURE MASONS
Infrastructure Masons is a group of industry professionals who design, build and operate the
technical infrastructure of the digital age. The IM community is where professionals connect,
grow and give back.
iMasons was established in April of 2016, achieving 501(c)(6), non-profit status in September of
2016. Membership levels are established based on criteria in three categories - Experience,
Economics and Stewardship. iMasons is operated by the Board and guided by the End User and
Partner Advisory Councils consisting of leaders of some of the largest and most advanced
technical infrastructure portfolios in the world. As of April 2017, membership exceeded 1,200
individuals representing over $100Bn in Infrastructure projects in over 130 countries.
http://imasons.org.
ABOUT FOUNDING PARTNERS
iMasons efforts are funded by membership dues, events, corporate donations, and Founding
Partner sponsorships. As of this publication, iMasons' Founding Partners include:
[Founding Partner logos]
TABLE OF CONTENTS

ABOUT IM THOUGHTS
CREDITS
  Author
  Contributors / Reviewers
iMASONS LOCAL EDITION CONTRIBUTORS
  California Local Edition Contributors
  Colorado Local Edition Contributors
  New York Local Edition Contributors
ABOUT INFRASTRUCTURE MASONS
ABOUT FOUNDING PARTNERS
INTRODUCTION
  New Cars
  Used Cars
  Homes
  Restaurants
LACK OF A DATA CENTER PERFORMANCE RATING SYSTEM
PROPOSAL
AVAILABILITY
  Availability Impact
  Availability Index Scores
EFFICIENCY
  Efficiency Impact
  Efficiency Index Scores
ENVIRONMENTAL
  Environmental Impact
  Environmental Index Score
SCOPE & FREQUENCY
VALIDATION & CONTESTING PERFORMANCE AWARDS
APPLICABILITY
MINIMUM MEASURES
FUTURE CATEGORY CONSIDERATIONS
OTHER CONSIDERATIONS
CONCLUSION
FURTHER CONSIDERATIONS
APPENDIX
  CarFax
  Department of Health
  Consumer Reports
  National Highway Traffic Safety Administration
  HomeTrackr
ABOUT THE AUTHOR
REFERENCES
INTRODUCTION
When you choose a restaurant, what criteria do you use? For me it usually includes the selection
and quality of the food, the wait time to be seated, the atmosphere, the friendliness and
attentiveness of the serving staff, and of course the cost of the meal. This opinion is usually
formed through recommendations by friends or sites like Yelp and Zagat. This got me thinking.
NEW CARS
Would you buy a new car just by reading the sticker on the window? No, you would go to the
Consumer Reports automotive product reviews page and see the ratings based on their 10 criteria.
You might also go to the NHTSA (National Highway Traffic Safety Administration) ratings page to
see how that car performs against their four criteria.
USED CARS
Would you buy a used car just by reading the listing description? No, you would go to CarFax
and run the VIN to see how the car performs across their 10 criteria.
HOMES
Would you buy a house just by reading the listing description? No, you would go to HomeTrackr
and check the address against their 8 performance criteria. You would also require inspections
before you close on that house.
RESTAURANTS
Would you choose a restaurant just by reading a review? Ahh, yes…
The difference here is subjective versus objective decision making. We all have preferences on
what we like and want (subjective), but most of the time we use facts to finalize that decision
(objective). For example, buying a car or a house can impact your safety and/or your budget. You
know that if you don't consider the facts when making the decision, it will cost you. So why isn't
the same logic followed when selecting a restaurant? Eating at a restaurant can impact your
health. If it isn't clean, it can make you sick and in some cases kill you.
The difference? Most people assume that the restaurant has clean and healthy practices. They
assume that there are people who check that. As a matter of fact, there are. The health
departments in most cities and counties are required to do regular inspections of establishments
serving food, posting those results publicly through a grading system.
Unfortunately, very few consumers ever look at that information.
I don't know about you, but there have been many times when I have had food poisoning because
of what or where I ate. By then it's too late: you're already sick. You may choose not to go there
again, but in essence the damage is done.
The point is that in each of these analogies there is a standard way to measure
performance. This is not the case for data centers.
LACK OF A DATA CENTER PERFORMANCE RATING SYSTEM
Let's apply the restaurant analogy here. If you choose the wrong restaurant, you impact your
body, making you sick. If you choose the wrong data center, you impact your business, your
customers, and potentially your revenue. I would venture to say that most of us old timers in
the data center industry have experienced our share of outages, from small glitches to
catastrophic faults causing millions in losses and damage to the brand. The old adage is true:
you can't truly understand it unless you've lived it.
In December of 2016 I spent an afternoon with Eddie Schutter, an IM Advisory Council member,
discussing and debating this topic. We agreed that while some standards exist, they don’t
represent the needs of today’s infrastructure professionals.
There is no standard way to objectively evaluate the performance of a data center
portfolio including owned and leased locations.
To compound this problem, most data center site selections are based on checklists written by
procurement teams or consultants. They verify expected performance, not actual performance.
Throughout our careers we've both contracted "Tier IV" data centers. Some delivered perfect
performance, achieving 100% availability; others, guaranteeing 99.9999%, delivered far less.
While the contracts carried financial penalties for missing the availability SLA, they were
pennies compared to the revenue impact of a single second of power loss. Bottom line: there
is no standardized system that can objectively rate the actual performance of data centers.
This brings us to the point of this paper.
PROPOSAL
The iMasons Data Center Performance Index is based on the annual performance of a data
center in three primary categories - Availability, Efficiency and Environmental. Data Center facility
age will also be required for completeness. Initial roll out of DCPI is targeted at Colocation
facilities. The belief is that this system could also be utilized by cloud and enterprise data centers.
DATA CENTER PERFORMANCE INDEX

AVAILABILITY
  A:  100% (0 incidents, 0 seconds)
  B:  99.9999% (1 incident or < 32 sec)
  C:  99.999% (<= 2 incidents or < 316 sec)
  D:  99.99% (<= 3 incidents or < 3,156 sec)
  NR: < 99.99% (> 3 incidents or > 3,156 sec)

EFFICIENCY
  A:  PUE & WUE 95th percentile @ >50% or <50% IT load, by climate zone
  B:  PUE & WUE 90th percentile @ >50% or <50% IT load, by climate zone
  C:  PUE & WUE 80th percentile @ >50% or <50% IT load, by climate zone
  D:  PUE & WUE 70th percentile @ >50% or <50% IT load, by climate zone
  NR: PUE & WUE below the 70th percentile @ >50% or <50% IT load, by climate zone

ENVIRONMENTAL
  A:  GHG 0 metric tonnes CO2E per MWh
  B:  GHG XX-XX metric tonnes CO2E per MWh
  C:  GHG XX-XX metric tonnes CO2E per MWh
  D:  GHG XX-XX metric tonnes CO2E per MWh
  NR: GHG > XX metric tonnes CO2E per MWh
AVAILABILITY
Availability is defined as “the quality of being able to be used”. Data Centers are built to house,
power, cool, and connect IT equipment. Loss of power, cooling or fiber network connectivity that
causes an interruption of service on this equipment is considered downtime. The measurement of
this downtime is what constitutes an availability score in the DCPI index.
AVAILABILITY IMPACT
When there is a disruption to both power feeds, thermal overload in an area, or concurrent
network signal loss in a data center, the IT equipment services stop. A single second of downtime
in the data center, and the resulting service recovery time for the IT equipment, can cause
millions of dollars of lost productivity and revenue to data center customers.
AVAILABILITY INDEX SCORES
Data Center availability is measured by the number of incidents in an availability zone or the
corresponding number of seconds of lost service in that calendar year. The rating is determined
by the worse of the two measures. Example: 3 incidents with a total of 20 seconds of interruption
is a D rating. Any data center with more than 3 incidents or less than 99.99% availability does
not qualify for a DCPI rating for that calendar year.
AVAILABILITY INDEX

RATING  INCIDENT COUNT  SECONDS OF SERVICE INTERRUPTION  EQUIV 9's
A       0               0                                100%
B       1               < 32                             99.9999%
C       <= 2            < 316                            99.999%
D       <= 3            < 3,156                          99.99%
NR      > 3             > 3,156                          < 99.99%
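The scoring rule above can be expressed as a short sketch. This is illustrative only (the function name and band encoding are ours, not part of the proposal); it applies the rating implied by the worse of the two measures, exactly per the thresholds in the Availability Index table:

```python
def availability_rating(incidents: int, downtime_seconds: float) -> str:
    """Return the DCPI Availability rating for one availability zone in a
    calendar year, applying the worse (larger) of the two measures."""
    if incidents == 0 and downtime_seconds == 0:
        return "A"
    # (rating, max incident count, seconds of interruption strictly below)
    bands = [("B", 1, 32.0), ("C", 2, 316.0), ("D", 3, 3156.0)]
    for rating, max_incidents, seconds_limit in bands:
        if incidents <= max_incidents and downtime_seconds < seconds_limit:
            return rating
    return "NR"  # more than 3 incidents or 3,156+ seconds: not rated

# The example from the text: 3 incidents totaling 20 seconds is a D.
print(availability_rating(3, 20))  # D
```

Note that a low downtime total cannot rescue a high incident count: the loop only returns a rating once both measures fall inside a band.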
• IT Gear = Customer machinery and equipment that is dependent upon power, cooling and
connectivity within an availability zone.
o Note: Gear that does not have redundant power or connectivity connections is not
considered within this performance index. Example: single-corded IT gear being
impacted by scheduled maintenance on its only power feed.
• Availability Zone = Dedicated power, cooling and network connectivity shared by IT gear
in a physically defined area of the data center.
o AZ Example: A building that is served by a common cooling plant, generator/UPS
systems and WAN connectivity. Each of these systems must have at least N+1
redundancy. An impact to any of these redundant systems that causes IT gear
service impact is an impact to that Availability Zone. Incident examples: simultaneous
loss of power to IT gear; loss of continuous cooling causing thermal shutdown of IT
gear; a fiber cut causing simultaneous loss of network connectivity to IT gear.
• Incidents = Any interruption to power, cooling or connectivity that causes a service
disruption to IT gear housed within an availability zone. Maintenance windows are NOT
excluded from this measure. Single corded devices are NOT included in incident counts.
EFFICIENCY
Data Centers are built to power and cool IT gear. The efficiency with which they deliver this
capacity is measured in Power Usage Effectiveness (PUE) and Water Usage Effectiveness
(WUE), aligned with specific climate zone limitations. How the data center is designed and tuned
over time will directly influence the DCPI efficiency rating. DCPI efficiency measures the annual
result of these design and operational decisions.
Metric Equations
• PUE = Total Facility Energy / IT Equipment Energy
• WUE = Annual Water Usage / IT Equipment Energy Usage (liters/kilowatt-hour).
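Both metrics are simple annual ratios of metered totals. As a sketch (function and variable names are illustrative, and the inputs are assumed to be already-metered annual figures):

```python
def pue(total_facility_energy_kwh: float, it_equipment_energy_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy over IT equipment energy."""
    return total_facility_energy_kwh / it_equipment_energy_kwh

def wue(annual_water_liters: float, it_equipment_energy_kwh: float) -> float:
    """Water Usage Effectiveness: liters of water per kWh of IT energy."""
    return annual_water_liters / it_equipment_energy_kwh

# A facility consuming 13 GWh in total to deliver 10 GWh of IT load:
print(pue(13_000_000, 10_000_000))  # 1.3
```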
Climate Zones
CLIMATE ZONE   DESCRIPTION
1A & 1B        Very Hot – Humid (1A), Dry (1B)
2A & 2B        Hot – Humid (2A), Dry (2B)
3A & 3B        Warm – Humid (3A), Dry (3B)
3C             Warm – Marine
4A & 4B        Mixed – Humid (4A), Dry (4B)
4C             Mixed – Marine
5A, 5B & 5C    Cool – Humid (5A), Dry (5B), Marine (5C)
6A & 6B        Cold – Humid (6A), Dry (6B)
7              Very Cold
8              Subarctic
EFFICIENCY IMPACT
IT equipment housed in data centers consumes significant amounts of energy. Customers are usually
charged rent for kW capacity and a kWh cost for power consumed. Power costs are usually multiplied
by the PUE efficiency of the data center. A data center with an average PUE of 1.4 will have higher
costs to customers than a data center with an average PUE of 1.3. This means the efficiency of the
data center has a direct impact on customer operating expenses. While water is not currently a
significant impact to data center costs, the liters of water consumed per kWh of IT load (WUE) has a
direct correlation to the data center’s cooling design efficiency.
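The cost effect of the PUE multiplier can be illustrated with a simple sketch. All numbers here (load, rate, billing model) are hypothetical, not taken from the proposal:

```python
def annual_power_cost(it_load_kw: float, pue: float, rate_per_kwh: float) -> float:
    """A customer's annual power bill when metered IT consumption is
    grossed up by the facility's PUE (hypothetical billing model)."""
    hours_per_year = 8760
    return it_load_kw * hours_per_year * pue * rate_per_kwh

# 500 kW of IT load at $0.08/kWh: what the 0.1 PUE difference alone costs
difference = annual_power_cost(500, 1.4, 0.08) - annual_power_cost(500, 1.3, 0.08)
print(round(difference, 2))  # 35040.0 dollars per year
```

Even a 0.1 improvement in average PUE is worth tens of thousands of dollars a year at this modest scale, which is why the efficiency category is measured on actual annual performance.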
EFFICIENCY INDEX SCORES
DCPI Efficiency ratings are based on the annual PUE and WUE performance of an availability zone at
>50% or <50% IT load in their respective climate zone as defined in IECC/ASHRAE 90.4.
EFFICIENCY INDEX

RATING  PERCENTILE                 IT LOAD        PUE & WUE BY CLIMATE ZONE
                                                  (1A/B, 2A/B, 3A/B, 3C, 4A/B, 4C, 5A/B, 6A/B, 7/8)
A       95th                       >50% / <50%    TBD for each zone
B       90th                       >50% / <50%    TBD for each zone
C       80th                       >50% / <50%    TBD for each zone
D       70th                       >50% / <50%    TBD for each zone
NR      below the 70th percentile
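Because the per-zone thresholds are still TBD, the rating logic can only be sketched in outline. The approach below is hypothetical, not part of the proposal: given the annual PUE values of a climate-zone/load-band cohort, a site's rating follows from the share of the cohort it matches or outperforms (lower PUE is better). The cohort data and cut-offs are illustrative only:

```python
def efficiency_rating(site_pue: float, cohort_pues: list[float]) -> str:
    """Illustrative only: rate a site by the share of its climate-zone /
    load-band cohort whose annual PUE it matches or beats."""
    beaten = sum(1 for p in cohort_pues if site_pue <= p)
    percentile = 100 * beaten / len(cohort_pues)
    # DCPI percentile bands: A >= 95th, B >= 90th, C >= 80th, D >= 70th
    for rating, threshold in [("A", 95), ("B", 90), ("C", 80), ("D", 70)]:
        if percentile >= threshold:
            return rating
    return "NR"

# A made-up cohort of ten annual PUE values for one zone and load band:
cohort = [1.15, 1.2, 1.25, 1.3, 1.35, 1.4, 1.5, 1.6, 1.8, 2.0]
print(efficiency_rating(1.15, cohort))  # A: at or below every cohort value
```

A production version would need the official per-zone PUE and WUE thresholds once published, rather than computing percentiles from a live cohort.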
ENVIRONMENTAL
Data Centers consume large amounts of resources and generate numerous waste streams that
can be detrimental to the environment.
ENVIRONMENTAL IMPACT
Data centers are the digital factories of the 21st century. IT equipment housed in these data centers
consumes significant amounts of energy and water, resulting in an annual greenhouse gas (GHG)
emission value. The choices data center providers make about how they source energy have a direct
impact on the GHG emissions attributed to that data center's customers in that calendar year.
ENVIRONMENTAL INDEX SCORE
Phase 1 DCPI Environmental ratings will be based on Scope 2 greenhouse gas (GHG)
emissions, measured in metric tonnes of CO2E per megawatt-hour of consumption within each
availability zone in a given year, as defined by the Carbon Disclosure Project (CDP).
Future environmental considerations could include Renewable Energy Intensity, Net New Grid
Energy, Reclaimed Water Usage, Recycling, and Waste byproducts (examples: lead acid,
sulfuric acid, salt, magnesium, and chlorine).
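The Phase 1 measure itself reduces to a simple ratio. A sketch, assuming the annual Scope 2 emissions and energy consumption for the availability zone are already known (names and figures illustrative):

```python
def carbon_intensity(scope2_tonnes_co2e: float, consumption_mwh: float) -> float:
    """Scope 2 carbon intensity in metric tonnes CO2E per MWh consumed,
    as used by the DCPI environmental category."""
    return scope2_tonnes_co2e / consumption_mwh

# An availability zone consuming 20,000 MWh with 7,000 t CO2E of Scope 2 emissions:
print(carbon_intensity(7_000, 20_000))  # 0.35
```

A zone running entirely on zero-carbon supply would score 0 and earn an A; the band edges for B through NR remain TBD in the table below.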
ENVIRONMENTAL INDEX

RATING  CARBON INTENSITY (metric tonnes Scope 2 CO2E/MWh)
A       0
B       XX-XX
C       XX-XX
D       XX-XX
NR      > XX
SCOPE & FREQUENCY
• Ratings are applied to an availability zone with a minimum of one data center building or
enclosure. It can include more buildings if they share common mechanical and electrical
support systems.
• Sites can receive an overall performance rating if all of the buildings or enclosures perform
at the same level.
• Buildings can score in one or more categories. For example, a data center can achieve
Class A Efficiency, Class B Availability and Class C Environmental for 2016. In 2017, they
could achieve Class A Efficiency and Availability and Class C Environmental based on
their performance.
• Official DC Performance Ratings would be awarded annually, based on the prior calendar
year's performance. Once the baseline year is established, monthly updates can be
provided on progress towards the current year's rating.
VALIDATION & CONTESTING PERFORMANCE AWARDS
• Availability, Efficiency and Environmental scores will be based on content provided by the
data center owner/operators.
• The system will allow customers of a data center being evaluated to report availability
impacts during a given year.
• The system should allow the data center providers a process to contest claims if they
deem them inaccurate.
• We are considering two methods to validate annual data center performance ratings. First,
is to utilize independent, external auditors to validate claims before they are officially
awarded. Second is to require colo executives to provide Representation and Warranty
letters guaranteeing the validity of the performance measures provided.
• Once the ratings have been established, a 30-day waiting period will allow the provider
or customers to supply additional information before the final rating is secured.
APPLICABILITY
• It is important to note that the performance rating does not depend upon the data center
resiliency design. For example, a data center designed with N+1 redundancy could
achieve years of Class A level performance by effectively maintaining the facility. In
contrast, a 2N+1 redundant design could achieve a Class C rating if it is not maintained
appropriately. This rating system cares about actual performance, not expected
performance.
• Data Center Availability, Efficiency and Environmental claims are already marketed by
many providers through their websites and brochures. The goal of this rating system is to
provide objective criteria for evaluating a data center's actual performance and to share
those scores with potential customers.
• In the case of the Data Center Performance Index, historical performance is an indicator of
future performance. Proper operation is the key contributor to facility uptime, and an
important consideration for customers. For example: does the provider have a handle on
human error? Do they perform predictive and preventative maintenance? Are they upgrading
to meet standards?
• Thorough data center commissioning is a critical step in achieving high availability. The
lack of proper commissioning will ultimately impact availability grades, but may not be
seen for many years. While commissioning is not considered in the DCPI grade, it is a
factor customers should consider when making their selection.
MINIMUM MEASURES
• Availability, Efficiency and Environmental categories were selected as the minimum key
performance indicators for all data center owners, operators and lessees.
• Cost was not included in the performance rating system. This system provides quantitative
performance measures that each customer can weigh in their decision making. It should
allow them to make a value-based decision based on actual performance and the potential
risk associated with that data center.
FUTURE CATEGORY CONSIDERATIONS
• Additional categories are being considered for future enhancements of the rating system.
• Risk: natural and man-made disaster implications for the data center, including incidents
caused by tornadoes, floods, earthquakes, lightning strikes, terrorism, and other events.
OTHER CONSIDERATIONS
• Members have asked why we don’t use the Energy Star for Data Centers as the efficiency
measure for the Data Center Performance Index. While the Energy Star program is an
effective way to measure and reward exemplary efficiency performance, it is using a
smaller set of data points with much higher PUE values. Their program document shows
PUE mean at 1.924, minimum of 1.362 and maximum of 3.598 from 61 facilities. We don’t
believe these are representative of the exemplary performance seen in data centers
deployed in the last 5 years. Energy Star for Data Centers also excludes WUE
performance and is a US-only standard, while the Data Center Performance Index is
targeting worldwide adoption.
CONCLUSION
Infrastructure Masons is proposing a new Data Center Performance Rating system focusing on
three key performance categories critical to data center customers – Availability, Efficiency, and
Environmental.
Each category is measured on actual performance rather than hypothetical performance. The
metrics are, by design, high level, easily interpreted by the customer, and simple for the data
center provider to implement. These ratings will provide a level of confidence in data center
performance that is currently unavailable.
Once formalized, the intent is to have our members request these rating scores in response to
their companies’ data center RFPs.
As with all iMasons thought papers, we look forward to your comments.
FURTHER CONSIDERATIONS
Our belief is that less than 5% of companies can absorb a full data center fault - meaning loss of
power, cooling or network to an availability zone - without impact. These more advanced
companies usually build and manage their own software stacks and facilities (examples: Amazon,
Google, Microsoft, Facebook). What they put in colocation data centers is already software
resilient, and the move is usually driven by three primary factors: unplanned growth, location-specific
performance, or minimizing investment risk in less established markets. It’s important to
note that even if these big players choose a higher-availability colocation data center in these
locations, they still expect wholesale prices.
This means that the remaining 95% of customers cannot absorb a data center fault. Less than a
second of downtime in a data center building could significantly impact their applications and
hence their business. Before they move into a colocation facility, they want to know their risk,
and how the data center has performed in the past is a good indicator of how it will perform in
the future. Ultimately, they need to balance their risk tolerance against cost. A rating system that
shows them designed availability against actual availability makes that value decision much
easier. It also removes the marketing fluff: a data center designed for high availability may not
have achieved it, and unless you know someone who has had an outage in that data center, or
the provider has disclosed it (highly unlikely), you are walking in blind. I have experienced both
sides of this coin. Over the last 5 years, two of my data center providers achieved perfect
performance (100% availability), while others significantly underperformed. If I had had the
performance data for these data centers, I might have made different decisions, or applied more
scrutiny to ensure that past outages had been properly mitigated before we signed a contract.
Bottom line, I would know what I was getting into.
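The designed-versus-actual comparison described above can be illustrated with a minimal sketch. The design target and downtime figures below are hypothetical examples, not values from the proposed DCPI:

```python
# Compare a facility's designed availability target against its measured
# availability over a reporting period. All figures are illustrative only.

SECONDS_PER_YEAR = 365 * 24 * 3600

def actual_availability(downtime_seconds: float,
                        period_seconds: float = SECONDS_PER_YEAR) -> float:
    """Fraction of the period the facility was actually up."""
    return 1.0 - downtime_seconds / period_seconds

# A 99.995% design target versus 45 minutes of actual annual downtime:
designed = 0.99995
actual = actual_availability(45 * 60)

print(f"designed: {designed:.5%}")
print(f"actual:   {actual:.5%}")
print("met design target" if actual >= designed else "missed design target")
```

Publishing both numbers side by side is exactly the kind of comparison that would let a customer see past the designed figure to what was actually delivered.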
Over the last few months, a number of our members have provided additional thoughts,
challenges and opinions about the Data Center Performance Index. Member feedback has been
incorporated into the latest paper, but additional feedback and considerations are now captured
within our online content management system, available to iMasons members.
APPENDIX
CARFAX
Risk-level rating system based on 10 categories: registration, titles, odometer readings, accident
history, frame/structural damage, accident indicators, service/repair information, usage and any recalls.
DEPARTMENT OF HEALTH
Letter-grade system based on overall point scores in categories covering food handling, food
temperature, personal hygiene, facility and equipment maintenance, and vermin control. Each
violation earns a certain number of points at one of three levels (weighted violations): Public
Health Hazard (7 points), Critical Violation (5 points) or General Violation (2 points). A score of
0 to 13 points earns an A, 14 to 27 points earns a B, and 28 or more points earns a C.
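The weighted-points-to-letter-grade mapping above translates directly into a short sketch; the specific violations in the example are hypothetical:

```python
# NYC-style letter grading: each violation carries a weight, the weights sum
# to a point score, and the score maps to a letter grade.

WEIGHTS = {
    "public_health_hazard": 7,
    "critical_violation": 5,
    "general_violation": 2,
}

def letter_grade(violations: list) -> str:
    """Sum weighted violation points and map the total to a letter grade."""
    points = sum(WEIGHTS[v] for v in violations)
    if points <= 13:
        return "A"
    if points <= 27:
        return "B"
    return "C"

# Two general violations plus one critical: 2 + 2 + 5 = 9 points
print(letter_grade(["general_violation", "general_violation",
                    "critical_violation"]))  # A
```

The appeal of this model for a data center rating is the same simplicity: a handful of weighted inputs, one score, one grade a customer can read at a glance.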
CONSUMER REPORTS
Performance- and opinion-based rating across ten categories, each with more than 50
subcategories: Road Test, Predicted Reliability, Owner Satisfaction, Overview, Safety,
Performance, Comfort, Fuel Economy, Specification, and Warranty.
NATIONAL HIGHWAY TRAFFIC SAFETY ADMINISTRATION
Performance-based star safety rating across four categories: Frontal Crash, Side Crash,
Rollover, and Overall Rating.
HOMETRACKR
Performance-based rating covering Fire/Water/Pest Damage, Maintenance Alerts, Home
Values, Permit Records, Inspections, and the Contractors that worked on the home.
ABOUT THE AUTHOR
Dean Nelson is the Founder and Chairman of Infrastructure Masons, an independent industry group of
executive and technology professionals entrusted with building and operating the physical and logical
structures of the Digital Age.
Dean has led over $7B in infrastructure projects in 9 countries. His extensive architecture, engineering and
operations experience includes 28 years in Hardware, 21 years in Network, 16 years in Infrastructure Software
and 16 years in Data Centers. He has produced numerous award-winning innovations in mission-critical
facilities and compute environments. He also holds four US patents.
Dean currently works at Uber as Head of Uber Compute. His team is responsible for technical infrastructure
(data center, compute, storage, network and infrastructure software) and business functions supporting
Uber’s industry-leading global ridesharing business, as well as UberEats, UberRush, UberForBusiness, and
autonomous vehicle development. Prior to Uber, Dean worked at eBay Inc for 6 ½ years as the Vice President
of Global Foundation Services, which served over 300 million active users and delivered over $250B of
enabled commerce volume annually. During the last two years at eBay, his team successfully integrated, then
split, the eBay and PayPal infrastructures into two independent internet companies. Prior to eBay, Dean
worked at Sun Microsystems for 17 years in various technical, management and executive leadership roles in
Manufacturing, Engineering, IT and Real Estate. His final project was the consolidation of Sun’s multi-billion
dollar global technical infrastructure portfolio of over 1,500 facilities.
Dean is the creator of the Digital Service Efficiency methodology, the first miles-per-gallon measurement for
technical infrastructure, used to measure eBay.com as a single system. He served as Chair of the Technology
Business Management Council High Tech Workgroup, and in 2009 founded and chaired Data Center Pulse, an
exclusive data center owner community with over 9,000 members in 100 countries.
Dean founded Infrastructure Masons, an industry association where infrastructure professionals connect,
grow and give back. He was identified by SearchDatacenter.com as one of the top five people who changed
the data center. Dean is also the recipient of Sun Microsystems’ prestigious Innovation Award, the Modular
DC Deployment and Best DC Design awards from Uptime Institute, and the Operational Excellence and
Infrastructure Trailblazer awards from the TBM Council.
In his personal time, Dean gives back by building schools and dorms to provide access to education for
impoverished children through his mother and son’s Just Let Me Learn Foundation. He also enjoys spending
time with his wife and performing with his daughter.
REFERENCES
1. U.S. Department of Housing and Urban Development, Overview of Healthy Home Rating System
(HHRS), and Worked Example.
2. HomeTrackr – Knowing a home’s history before you buy is important.
3. Consumer Reports – Knowing a car’s performance before you buy is important.
4. CarFax Vehicle History Data Sources.
5. CarFax Vehicle History Report Example.
6. New York City Department of Health and Mental Hygiene, Restaurant Inspection Information Search.
7. New York City Department of Health and Mental Hygiene, Letter Grading System.
8. Digital Trends – The new service from Porch is like CarFax for houses.
9. Untappedcities.com – Cities 101: How Does the NYC Restaurant Sanitation Grade Work?
10. Future of Internet Power, Corporate Colocation and Cloud Buyers’ Principles.
11. Energy Star for Data Centers, Energy Star Portfolio Manager.
12. ANSI/ASHRAE/IESNA Standard 90.1-2007, Normative Appendix B – Building Envelope Climate
Criteria.
13. Google Environmental Report 2016.
14. ISO 14000 Family of International Standards.