The Big Crunch
The Future of Data Centres?
12th March 2015 – Peter Hopton BEng, MBCS, FRSA – Founder and Chief Visionary Officer, Iceotope
Denser Is Coming
• Processor Efficiency Has Doubled Every 18 Months
• Data Transmission Efficiency Has Not
• IT Hardware Has Been Getting Denser, Data Centres Have Struggled to Keep Up
IT Hardware is Getting Denser
• Past – packing a rack to 5 kW was unheard of in the year 2000
• Present – off-the-shelf air-cooled servers in a 48U cabinet: up to 50 kW, which is uncoolable with air
• Future – in 3-5 years the densest motherboard will measure 3" x 5" and draw 350 W
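As a quick sanity check on that projection, a back-of-envelope sketch (the 3" x 5" and 350 W figures come from the slide above; the rest is standard unit conversion, and the sketch is illustrative only):

```python
# Back-of-envelope power density for the projected 3" x 5", 350 W board.
# Figures are from the slide; this is an illustrative calculation, not a spec.
CM_PER_INCH = 2.54

area_cm2 = (3 * CM_PER_INCH) * (5 * CM_PER_INCH)  # ~96.8 cm^2 of board area
density_w_per_cm2 = 350 / area_cm2

print(round(density_w_per_cm2, 1))  # 3.6
```

Roughly 3.6 W per square centimetre across the whole board – a heat flux that air struggles to remove at cabinet scale.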
But DCs Struggle to Keep Up
Average Density is Very Low in Colocation
• Source: Uptime Institute Survey 2013
• Growth In Average Density – Evident But Much Slower than Server Growth
• Explanation: Colocation Business Model, Low Bandwidth Interconnect, Air Cooling & Energy Efficiency Drive
• Maybe DCs are Underpopulated?
What Does Average Density Look Like?
• Literally this: 2x 2U high-density units
• Imagine that in a 48U cabinet!
Why?
Koomey's Law
• IT Doubles in Efficiency Every 18 Months
• Remains Unbroken
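A minimal sketch of the doubling claim above, assuming a clean 18-month doubling period (an idealised model of the trend, not measured data):

```python
# Koomey's law as stated above: computations per joule double roughly
# every 18 months. This helper projects the cumulative efficiency factor.
def efficiency_multiplier(years, doubling_months=18):
    """Factor by which efficiency grows over `years` at the given doubling period."""
    return 2 ** (years * 12 / doubling_months)

print(efficiency_multiplier(1.5))        # 2.0 (one doubling)
print(round(efficiency_multiplier(10)))  # 102 (~two orders of magnitude per decade)
```

An 18-month doubling compounds to roughly 100x per decade, which is why server density keeps outrunning facility design.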
And Interconnect?
Struggling to Keep Up
• Copper Interconnect is "Maxed Out" – laws of physics
• Optical Interconnect is expensive, bulky and only breaks even on energy at 50 cm
• Picowatts per bit have increased from DDR3 to DDR4, rather than decreased
• IT Hardware vendors are moving RAM and storage closer to the CPU, increasing density
Remember the 3" x 5"
Denser is Coming
But Can We Keep on Spreading the Heat Out?
10 little chips in a big 48U cabinet?
• Simply Put, Yes, But It Will Be Expensive
• The High-Speed Interconnects Will All Have to Be Optical, NOT Copper
• Active Interconnects Will Use More Power and Cost More
• Cost Economics of Dense Liquid-Cooled Facilities Will Win Big
The Commercial Opportunity in Density
Denser IT = More in the Same Facility
• Liquid Cooling Designs Already Offer 60 kW Densities
• Total Liquid Cooling Offers Reduced Infrastructure Costs
• Total Liquid Cooling Offers Cooling PUEs of 1.02
• Placing Servers Closer Together Reduces Interconnect Costs
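For context, PUE (Power Usage Effectiveness) is total facility power divided by IT power, so a cooling pPUE of 1.02 means the cooling plant adds only 2% on top of the IT load. A tiny illustrative sketch (the kW figures below are hypothetical examples, not measurements):

```python
# Partial PUE for cooling: (IT power + cooling power) / IT power.
# A cooling pPUE of 1.02, as quoted above, corresponds to 2 kW of
# cooling per 100 kW of IT load. Example figures are hypothetical.
def cooling_ppue(it_kw, cooling_kw):
    return (it_kw + cooling_kw) / it_kw

print(cooling_ppue(100.0, 2.0))  # 1.02
```

Compare that with legacy air-cooled facilities, where cooling alone can add tens of percent to the IT load.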
Where’s This Going to Happen?
• HPC: Interconnect Speeds and Processing Intensities in HPC are Driving Density
• Cloud: Cloud and Virtualised Environments Will Follow – Why?
• Because they have high utilisations like HPC, and high-speed connectivity is increasing
• Liquid Cooling is Already Key in New HPC Installs Today
Who Are Iceotope?
• British Company based in Sheffield, UK
• Backed By:
• Schneider Electric
• Solvay Specialist Plastics
• Additional partnerships with:
• Intel
• 3M
• >10,000,000 CPU Hours of Use
Real World Results
• Up to 60 kW Cabinet Density – 72 Blades Per Cabinet
• Cooling Overhead (internal): as low as 0.33%
• Cooling Overhead (external): 1.7%-4% (pPUE 1.017-1.04 @ 30 kW/cabinet)
• Poznan National Supercomputing Centre indicated 1.7% (2% overall) average
• Romonet indicated 4% cooling overhead in a 2N model in Houston, Texas
• IT Energy Reduction vs Fan-Cooled Servers: 8%
• Based on measurements at Leeds University (Chi et al., 2013)
• Overall performance-per-watt benefit vs equivalent servers in a high-density, close-coupled cooling environment: 40.8%
• Based on studies at Leeds University (Chi et al., 2013)
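The overhead and pPUE figures quoted above are two views of the same quantity: pPUE = 1 + cooling overhead (expressed as a fraction). A small sketch of that conversion, using the percentages from the results:

```python
# Convert a cooling overhead percentage to a partial PUE: pPUE = 1 + overhead.
# The 1.7% and 4% inputs are the figures quoted in the results above.
def overhead_to_ppue(overhead_pct):
    return 1 + overhead_pct / 100

print(round(overhead_to_ppue(1.7), 3))  # 1.017
print(round(overhead_to_ppue(4.0), 3))  # 1.04
```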
Our Ethos: The Elimination Of Waste
1. Waste infrastructure required to support servers
2. Waste power consumed in supporting the electronics – and running the same electronics with greater speed and efficiency
3. Waste product – producing a viable waste product in the form of hot water
[Diagram: a traditional facility – air conditioners, IT cabinets AKA "racks" (each containing 10-20 servers), raised floor, back-up batteries AKA "UPS", generators, external coolers and power distribution – compared with a Total Liquid Cooling (TLC) facility]
1. TLC needs no A/C
2. No airflow, no raised floor
3. TLC servers have no fans – use less power
4. No A/C, average and peak power consumption reduced
5. At least 2x as many servers per cabinet
Iceotope’s Vision
PetaGen Blades (Common Blade Ecosystem)
The Iceotope Blade is sealed; the electronics are immersed
Exotic Coolant
• High Thermal Expansivity
• Electrical Insulator
• Certified Non-Flammable
• Non Ozone-Depleting
• Clean & Safe
PetaGen – 72 Blade Cabinet
• Standard 800 mm x 1200 mm footprint
• Takes 72 Iceotope Blades
• Zero Airflow
• Deploy in "grey space" not "white space"
• Just add (hot) coolant loop and power
• Liquid-Cooled 415 V to 48 V Power Supplies
• Fully redundant at 30 kW
• Maximum Capacity 60 kW
Thanks For Listening
@petehopton
@Iceotope