Welcome To The 31st HPC User Forum Meeting
October 16, 2008
Special Thanks To:
Imperial College London: Simon Burbidge, Sue Pritchett
Thank You To Our Sponsors:
Altair
Fujitsu
HP
IBM
Panasas
Steve Finn And Steve Conway
HPC User Forum Update
Introduction: HPC User Forum Mission
Assist HPC users in solving their ongoing computing, technical and business problems
A forum for exchanging information, identifying areas of common interest, and developing unified positions on requirements
Provide members with a continual supply of information on:
Uses of high end computers, high end best practices, market dynamics, computer systems and tools, vendor activities and strategies
Provide members with a channel to present their achievements and requirements to outside interested parties
Introduction: HPC User Forum Mission
European Meeting Goal: Maintain a dialogue between US and European HPC users/buyers
• Recognizing that European market dynamics are not identical to U.S. market dynamics
• And that dynamics in Europe vary from country to country and region to region
Introduction: HPC User Forum Steering Committee
Steve Finn, BAE Systems, Chairman
Sharan Kalwani, General Motors Corporation, Vice Chairman
Earl Joseph, IDC, Executive Director
Vijay Agarwala, Penn State University
Alex Akkerman, Ford Motor Company
Doug Ball, The Boeing Company
Paul Buerger, Ohio Supercomputer Center
Steve Conway, IDC Research Vice President
Jack Collins, National Cancer Institute
James Kasdorf, Pittsburgh Supercomputing Center
Doug Kothe, Oak Ridge National Laboratory
Paul Muzio, City University of New York
Michael Resch, HLRS, University of Stuttgart
Vince Scarafino, Industry Expert
Suresh Shukla, The Boeing Company
Robert Singleterry, NASA/Langley
Allan Snavely, San Diego Supercomputer Center
Important Dates For Your Calendar
HPC User Forum Meetings:
– October 13 and 14, 2008: HLRS/University of Stuttgart
– October 16, 2008: Imperial College London
– April 20 to 22, 2009: The Hotel Roanoke & Conference Center, Roanoke, VA
– September 8 to 10, 2009: Omni Interlocken Resort, Broomfield, CO
SC08 in Austin, Texas, November 15 to 21, 2008
ISC'09, Hamburg, June 23 to 26, 2009
IDC HPC
Market Update
IDC's HPC Team
Earl Joseph: IDC HPC research studies, HPC User Forum, and strategic consulting
Steve Conway: HPC User Forum, consulting, primary user research and events
Richard Walsh: In-depth technical analysis, special studies, processor trends, and data center issues
Jie Wu: HPC research specialist, census and forecasts, China research, interconnects and grids
Lloyd Cohen: Director Worldwide Market Analysis, data analysis, workstations
Beth Throckmorton: Government account support, special projects
Charlie Hayes: Government HPC issues, DOE, and special studies
Mary Rolph: Conference planning and logistics
Top Trends in HPC
HPC continues to show strong growth:
– 10% this quarter
– 19% yearly growth over the last 4 years
– We are forecasting 9.2% growth for the next 5 years
Blades are making inroads into all segments
Major challenges for datacenters: power, cooling, real estate, system management
Storage and data management continue to grow in importance
Software hurdles will rise to the top for most users, driven heavily by multi-core processors and hybrid systems
Why Has Technical Computing Grown So Quickly?
1. Price and price/peak performance of clusters has redefined the cost of technical computing
– >6x better than RISC, >70x better than vectors
2. At the same time, “live” science and “live” engineering costs have escalated
– Plus time-to-solution is months faster with simulations
3. Global competitiveness is driving R&D and better product designs
4. At the same time, x86 performance on technical applications is weak
– Driving buyers to purchase a much larger number of processors
5. New materials and approaches require rewriting the “books and tables,” which takes years – making simulations a faster solution
2007 HPC Market Size By Competitive Segments
Departmental ($100K - $250K): $3.4B
Divisional ($250K - $500K): $1.6B
Supercomputers (over $500K): $2.7B
Workgroup (under $100K): $2.4B
HPC Servers total: $10B
Vendor HPC Market Shares In 2Q08: All HPC Segments
HP: 36.7%
IBM: 26.8%
Dell: 15.7%
Sun: 5.4%
SGI: 2.1%
Cray: 1.3%
Bull: 1.1%
NEC: 0.6%
Dawning: 0.1%
Other: 10.2%
Total HPC Revenue by Processor Type
[Chart: HPC revenue share by processor type, 2000-2007; categories: x86-64, x86-32, Vector, RISC, Prop, EPIC]
Why Is Commodity Hot? .. Price!
HPC All Servers Processor Summary, 2007

Processor | Average CPUs/System | System ASP ($K) | $/CPU | CPUs/$M
X86 | 18 | $39 | $2,151 | 465
RISC | 9 | $70 | $7,869 | 127
EPIC | 7 | $65 | $8,966 | 112
Vector | 11 | $605 | $54,177 | 18
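The $/CPU and CPUs-per-$M columns follow directly from the other two. A minimal sanity check (the function names here are ours, not IDC's; small gaps versus the table come from rounding in the published per-system averages):

```python
# Derive $/CPU and CPUs per $1M spent from average system price and CPU count.
def dollars_per_cpu(asp_k: float, cpus: float) -> float:
    """asp_k is the average system price in $K; cpus is CPUs per system."""
    return asp_k * 1000 / cpus

def cpus_per_million(asp_k: float, cpus: float) -> float:
    """How many CPUs $1M buys at this price point."""
    return 1_000_000 / dollars_per_cpu(asp_k, cpus)

# x86: $39K systems averaging 18 CPUs
print(round(dollars_per_cpu(39, 18)))    # ~2,167 (table shows $2,151)
print(round(cpus_per_million(39, 18)))   # ~462 (table shows 465)
# Vector: $605K systems averaging 11 CPUs
print(round(dollars_per_cpu(605, 11)))   # 55,000 (table shows $54,177)
```

The ordering is what matters: commodity x86 delivers roughly 25x more CPUs per dollar than vector systems, which is the slide's point.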
HPC
Cluster Update
HPC Cluster Revenue Growth Rates
[Chart: HPC cluster revenue growth rates by year, 2001-2007]
Growth Has Averaged Over 74%/yr Since 2002
Growth In HPC Clusters
[Chart: cluster vs. non-cluster revenue share by quarter, Q1 2003 - Q4 2007]
Cluster Revenue Share by Processor
[Chart: cluster revenue share by processor type, 2001-2007; categories: x86, EPIC, RISC]
HPC Cluster Processor Shipments

Competitive Segment | 2003 | 2004 | 2005 | 2006 | 2007 | CAGR
cs1-Supercomputer | 157,384 | 178,982 | 277,224 | 375,553 | 535,338 | 36%
cs2-Divisional | 96,929 | 241,254 | 367,002 | 516,137 | 610,142 | 58%
cs3-Departmental | 189,091 | 273,344 | 968,013 | 1,545,991 | 1,442,150 | 66%
cs4-Workgroup | 209,980 | 406,452 | 229,601 | 189,048 | 224,974 | 2%
Grand Total | 653,384 | 1,100,032 | 1,841,840 | 2,626,728 | 2,812,604 | 44%
HPC cluster processors are now shipping at a rate of over 2.8 million a year
Average yearly growth has been 44%
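The CAGR column is the standard compound annual growth rate over the four compounding years from 2003 to 2007. A quick check (the helper name is ours):

```python
# Compound annual growth rate between two endpoint values.
def cagr(start: float, end: float, years: int) -> float:
    return (end / start) ** (1 / years) - 1

# cs1-Supercomputer shipments: 157,384 (2003) -> 535,338 (2007)
print(f"{cagr(157_384, 535_338, 4):.0%}")    # 36%, matching the table
# Grand total: 653,384 (2003) -> 2,812,604 (2007)
print(f"{cagr(653_384, 2_812_604, 4):.0%}")  # 44%, matching the table
```

The same formula (with years = 5) reproduces the 9.2% figure in the 2007-2012 forecast tables that follow.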
HPC Market Forecasts
HPC Forecast: Strong Growth Over Next Five Years ($ Millions)

Segment | 2007 | 2012 | CAGR
Supercomputer | $2,682 | $3,512 | 5.5%
Technical Divisional | $1,610 | $3,092 | 13.9%
Technical Departmental | $3,384 | $5,763 | 11.2%
Technical Workgroup | $2,400 | $3,193 | 5.9%
Total | $10,076 | $15,617 | 9.2%
Source: IDC, 2008
HPC Application Forecast, 2007 - 2012 ($ Thousands)

HPC Application Segment | 2007 | 2012 | 5-yr CAGR
Bio-Sciences | $1,558,368 | $2,454,715 | 9.5%
CAE | $1,268,038 | $2,321,580 | 12.9%
Chemical Engineering | $259,506 | $367,561 | 7.2%
DCC & Distribution | $585,391 | $1,081,443 | 13.1%
Economics/Financial | $305,325 | $510,675 | 10.8%
EDA | $717,481 | $931,569 | 5.4%
Geosciences and Geo-engineering | $589,343 | $1,001,070 | 11.2%
Mechanical Design and Drafting | $139,851 | $262,105 | 13.4%
Defense | $917,577 | $1,413,607 | 9.0%
Government Lab | $1,376,058 | $1,657,796 | 3.8%
University/Academic | $1,858,705 | $2,764,522 | 8.3%
Weather | $399,228 | $732,349 | 12.9%
Other | $101,559 | $118,179 | 3.1%
Total Revenue | $10,076,430 | $15,617,170 | 9.2%
Summary Thoughts
Major Customer Pain Points
– Clusters are still hard to use and manage
– System management & growing cluster complexity
– Power, cooling and floor space are major issues
– Third-party software costs
– Weak interconnect performance at all levels
– Applications & programming: hard to scale beyond a node
– RAS is a growing issue
– Storage and data management are becoming new bottlenecks
– Lack of support for heterogeneous environments and accelerators
Major Customer Pain Points
Software is becoming the #1 roadblock
Better management software is needed:
– HPC clusters are hard to set up and operate
– New buyers require “ease-of-everything”
Parallel software is lacking for most users:
– Many applications will need a major redesign
– Multi-core will cause many issues to “hit the wall”
Software – ISV Scaling Limitations
TABLE 20
Typical Number of Processors the ISV Applications Use for Single Jobs

CPU Range | Number of Applications | Percent
1 | 19 | 24.4%
2-8 | 25 | 32.1%
9-32 | 20 | 25.6%
33-128 | 9 | 11.5%
129-1024 | 4 | 5.1%
Unlimited | 1 | 1.3%
Total | 78 | 100.0%
New Challenges Affecting IT Datacenters
The increase in CPUs and server units is creating significant IT challenges in:
Managing complexity
– How to best manage a complex cluster
– How to install/set up a new cluster without having to buy a large number of separate pieces
Power/cooling and space
Application scaling and hardware utilization
– How to deliver strong performance to users on YOUR applications
– How to make optimal use of new processor and system designs
Agenda: Thursday Morning
9:00 Imperial College Welcome
9:15 HPC User Forum Welcome/Introductions, Steve Finn and Steve Conway
9:30 Jamil Appa, BAE Systems, HPC in Aerospace
10:00 Doug Ball, Boeing, HPC Trends in Aerospace
10:30 IBM Technology Update
10:45 Isabella Weger, ECMWF, HPC and Weather Prediction
11:15 Break
11:30 Gerard Gorman, Imperial College, Software Engineering & Support
12:00 Dr. Frank Baetke, HP, HP's Scalable Computing Strategy
12:15 Lunch
Lunch Logistics
Welcome To The 31st HPC User Forum Meeting
Agenda: Thursday Afternoon
13:15 Panasas Technology Update
13:30 Terry Hewitt, EDS, Automotive Work for Rolls-Royce
14:00 Fujitsu R&D Technology Update, Motoi Okuda
14:30 Andrew Jones, NAG, HPC Trends and Issues
14:45 Peter Haynes, Imperial College, Materials and Physics
15:15 Vince Scarafino, HPC User Forum HPC Technology Panel Results
15:45 Irene Qualters, SGI, Industrial Strength Linux
16:15 Break
16:30 Mark Parsons, EPCC Site Update and HECToR Program
17:00 Bill Butcher, Altair Engineering, Update
17:15 HPC trends at NASA, Robert Singleterry
17:45 Wrap up and plans for future HPC User Forum meetings, Steve Conway and Steve Finn
18:00 Guided Tour of Imperial College
Important Dates For Your Calendar
HPC User Forum Meetings:
– October 13 and 14, 2008: Stuttgart, Germany
– October 16, 2008: Imperial College, London
– April 20 to 22, 2009: The Hotel Roanoke & Conference Center, Roanoke, VA
– September 8 to 10, 2009: Omni Interlocken Resort, Broomfield, CO
SC08 in Austin, Texas, November 15 to 21, 2008
ISC'09, Hamburg, June 23 to 26, 2009
Thank You For Attending The 31st HPC User Forum Meeting