HP-EMEA May 20, 2003 Lennart Johnsson
Grids: Drivers (Applications) and Challenges
Lennart Johnsson
Department of Numerical Analysis and Computer Science; Director, PDC
Royal Institute of Technology, Stockholm
and
Department of Computer Science; Director, Texas Learning and Computation Center
University of Houston, Houston
What are Grids?
• A collection of networked resources with a shared software environment providing
– A common access mechanism
– Tools and services for assembling and using distributed resources
– Tools and services for managing the infrastructure
– Security (authentication, authorization) and integrity
– Accounting services
Grid Evolution
Courtesy of Ian Foster
Grids
“We will perhaps see the spread of ‘computer utilities’, which, like present
electric and telephone utilities, will service individual homes and offices
across the country”
(Len Kleinrock, 1969)
• Early 90s
– Gigabit testbeds, metacomputing
• Mid to late 90s
– Early experiments (e.g., I-WAY ’95), academic software projects (e.g., Globus), applications
• 2003
– Dozens of application communities & projects
– Significant technology base (Globus Toolkit™)
– Global Grid Forum: ~500 people, 20+ countries
The Grid: A Brief History
Slide courtesy Ian Foster
GUSTO testbed for Grid applications demonstrated at the Supercomputing ’97 exhibition
Globus ’97
LHC - A Grid Application
1800 Physicists, 150 Institutes, 32 Countries
50 - 100 PB of data by 2010; 50,000 CPUs?
Les Robertson, CERN-IT (last update 2000-11-27)
The LHC Detectors: CMS, ATLAS, LHCb
Raw recording rate 0.1 – 1 GB/sec
3.5 PetaBytes/year, ~10^8 events/year
LHC
SETI@Home
Next Step: Allen Telescope Array
Larry Smarr, Computer Science & Engineering
Grids and Industry: Early Examples
Butterfly.net: Grid for multi-player games
Entropia: Distributed computing (BMS, Novartis, …)
Casino-21: Large Scale Monte Carlo Climate Simulations
• Ensemble computing varying model parameters
• Evaluate model against current climate
• Home in on most realistic models by natural selection
• Then model 21st century climate evolution
• One climate model per PC
• Currently about 20,000 PCs
Rutherford Appleton Laboratory
www.climate-dynamics.rl.ac.uk/index.html
Larry Smarr, Computer Science & Engineering
Gordon Moore, 1965
Electronics, Vol 38, no 8, April 19, 1965
NSFnet, vBNS, Internet2 Abilene, TeraGrid
[Chart: U.S. research backbone capacity in Mbit/s, 1986–2001 (0–10,000): OC-3, OC-12, OC-48, OC-192; doubling every year.]
[Map: TeraGrid / DTF 40 Gb network: SDSC (San Diego), NCSA, Chicago, NYC, PSC, Atlanta, IU, U Wisconsin, with connections to Seattle, Portland, San Francisco, Los Angeles, Vancouver, SURFnet, CA*net4, AMPATH, and NTON.]
HPN
DoE Energy Sciences Network
NASA Research and Education Network (NREN)
GEANT
HPN: Nordic Networks
Alliance ’98 infrastructure impact:
- 75-fold capacity increase over 40 months
- UCAID and Eurolink agreements
SURFnet4
Global Crossing Wavelength Deployment (06/01)
[Map: SAC, PAC, MAC, AC-1, AC-2, PEC, PC-1, GAL, EAC, North American Crossing.]
Global Crossing: 100,000 route miles
Transpacific Cables Up To June 2001: TPC-5, China-US, Japan-US, PC-1
The Battle of the Atlantic
• Capacity coming online (Gbps*), RFS:
– Level 3/Global Crossing (Project Yellow): 1,280, 3Q00
– TAT-14 (Club): 640, 4Q00
– FLAG Atlantic-1 (FLAG/GTS): 2,560**, 2Q01
– Hibernia (360networks, Inc.): 1,920, 2Q01
– Atlantic Crossing-2 (Global Crossing): 2,560***, 1Q01
– TyCom Global Network: 2,560, 4Q01
– Oxygen: No Go!
– Total: 8,960
* = design capacity; ** = Teleglobe buying 2 fibers; *** = cancelled, AC-2 joining Level 3
Does not include C&W Apollo cable (RFS 2003)
LHC - DataTag
[Map legend: Tier0/1 facility, Tier2 facility, Tier3 facility; 10 Gbps, 2.5 Gbps, 622 Mbps, and other links.]
Wavelength Disk Drives
Computer data continuously circulates around the WDD
[Map: CA*net 3/4 WDD nodes: Vancouver, Calgary, Regina, Winnipeg, Ottawa, Montreal, Toronto, Halifax, St. John’s, Fredericton, Charlottetown.]
Exponentials (and Coefficients)
• Network vs. computer performance
– Computer speed doubles every 18 months
– Network speed doubles every 9 months
– Difference = order of magnitude per 5 years
• 1986 to 2000
– Computers: ×500
– Networks: ×340,000
• 2001 to 2010
– Computers: ×60
– Networks: ×4000
Scientific American (Jan 2001); slide courtesy of Ian Foster
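The "order of magnitude per 5 years" figure follows directly from the two doubling times; a quick sketch of the arithmetic (assuming the 18- and 9-month doubling times quoted above):

```python
# Growth factors implied by the doubling times on this slide:
# computers double every 18 months, networks every 9 months.
def growth(doubling_months, years):
    """Growth factor over a span of years, given a doubling time in months."""
    return 2 ** (years * 12 / doubling_months)

computers = growth(18, 5)           # ~10x over 5 years
networks = growth(9, 5)             # ~100x over 5 years
print(round(networks / computers))  # networks gain ~10x on computers
```

So over any 5-year window the network advantage compounds by roughly another factor of ten, which is what makes moving work to remote resources increasingly attractive.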
Wireless
WLAN
– 802.11b: 11 Mbps (2000)
– 802.11a: 54 Mbps (2002)
– …
Cellular
– 9.6 kbps (today)
– 3G: up to 2 Mbps (2001/2002 – 2005)
– 4G: up to 200 Mbps (2010)
Satellite
– kbps – Gbps
Access Technologies
Growth of Cell vs. Internet
[Chart: cell subscriptions vs. Internet hosts, 1992–2001, in millions (0–800).]
Wireless …
[Chart: US Appliance Shipment Statistics 1994–2001, number of units (0–120,000,000): home electronics, fixed home appliances, portable home appliances.]
[Chart: Global Assembly of Light Vehicles, 2001–2009, units in thousands (0–70,000): light truck, car, global total.]
Polymer Radio Frequency Identification Transponder
http://www.research.philips.com/pressmedia/pictures/polelec.html
Smart Dust – UCB
RF Mote, RF Mini Mote I, RF Mini Mote II, Laser Mote, Laser Mote with CCD, IrDA Mote, Sensor
http://robotics.eecs.berkeley.edu/~pister/SmartDust/
Instrumentation of Critical Civil Infrastructure
New Bay Bridge tower with lateral shear links
Cal-(IT)² will develop and install wireless sensor arrays linked to crisis-management control rooms
Source: UCSD Structural Engineering Dept.
Sensor Networks
Neptune Undersea Grid
Air Quality Measurement and Control
Real-time data: surface data, radar data, balloon data, satellite data
NCAR
Your Body On-Line
Next Step: Putting You On-Line!
– Wireless Internet transmission
– Key metabolic and physical variables
– Model: the dozens of processors and 60 sensors/actuators inside our cars
Post-Genomic Individualized Medicine
– Combine
• genetic code
• body data flow
– Use powerful AI data-mining techniques
www.bodymedia.com
Courtesy of Larry Smarr
March 28, 2000 – Fort Worth Tornado
Courtesy Kelvin Droegemeier
In 1988 … NEXRAD Was Becoming a Reality
Courtesy Kelvin Droegemeier
Bioimaging
Photooxidation of a neuron filled with Lucifer Yellow; after photooxidation; HVEM tomographic volume
Branching tree in 3D with density measures at the tips
– primary data: arborization (tool: Neurolucida)
– more detailed measurements: branches are classified; each twig is a surface object, to be spatially related to the original cell
THE BIOMEDICAL INFORMATICS RESEARCH NETWORK
LAYOUT FOR PHASE 1
Distributed Databases
• Collection and processing of PET and fMRI data from neurological experiments
• Partners: KI Neuroscience, KTH (PDC, TCS, CVAP), UU, Forewiss, Active Knowledge
www.neurogenerator.org
Neurogenerator
[Diagram: Neurogenerator workflow at PDC/KI: submission interfaces (Internet, DAT tape, etc.) feed raw data and metadata into a database (schema and population); after manual inspection, processing chains (format conversion, segmentation, normalization, statistical analysis) run under workflow management and fill result databases; user interfaces support workflow control and visualization.]
Neurogenerator
8.5 Å Structure of the HSV-1 Capsid
JEOL3000-FEG, liquid-He stage, NSF support (500 Å scale bar)
No. of particles needed for 3-D reconstruction:
Resolution:   8.5 Å    4.5 Å
B = 100 Å²:   6,000    5,000,000
B = 50 Å²:    3,000    150,000
EM imaging
EMEN Database: archival, data mining, management
EMAN processing: vitrification robot, micrographs, particle selection, power-spectrum analysis, initial 3D model, classify particles, reproject 3D model, align/average/deconvolute, build new 3D model
Micrographs
• 4 – 64 Mpixels, 16-bit (8 – 128 MB)
• 100 – 200/day per lab
• 10 – 1,000 particles per micrograph
• Several TB/yr
Project
• 200 – 10,000+ micrographs
• 10,000 – 10,000,000 particles
• 10k – 1,000k pixels/particle
• Up to hundreds of PFlops
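The "several TB/yr" figure can be sanity-checked from the per-lab micrograph numbers above (a rough sketch using the slide's ranges; the helper name is ours):

```python
# Rough yearly data volume for one EM lab, from the slide's ranges:
# 100-200 micrographs/day at 8-128 MB each.
def tb_per_year(micrographs_per_day, mb_each, days=365):
    """Terabytes per year, treating 1 TB as 1e6 MB."""
    return micrographs_per_day * mb_each * days / 1e6

low = tb_per_year(100, 8)      # ~0.3 TB/yr at the low end
high = tb_per_year(200, 128)   # ~9.3 TB/yr at the high end
```

Even the midpoint of these ranges lands in the single-digit-TB-per-year regime, consistent with the bullet above.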
Astrophysics
Crab Nebula in 4 spectral regions: X-ray, optical, infrared, radio
Ongoing Astronomical Mega-Surveys
• Large number of new surveys
– Multi-TB in size, 100M objects or larger
– In databases
– Individual archives planned and under way
• Multi-wavelength view of the sky
– > 13 wavelength coverage within 5 years
• Impressive early discoveries
– Finding exotic objects by unusual colors
• L, T dwarfs; high-redshift quasars
– Finding objects by time variability
• Gravitational micro-lensing
Surveys: MACHO, 2MASS, SDSS, DPOSS, GSC-II, COBE, MAP, NVSS, FIRST, GALEX, ROSAT, OGLE, …
Astronomy: Coming Floods of Data
• The planned Large Synoptic Survey Telescope will produce over 10 petabytes per year by 2008!
– All-sky survey every few days, so it will provide fine-grained time series for the first time
Swedish Space Corporation: ODIN research satellite
Esrange
PDC
Remote Storage
High Energy Physics
There is a “bunch crossing” every 25 nsecs. There are 100 “triggers” per second. Each triggered event is ~1 MByte in size.
[Diagram: LHC tiered computing model]
Tier 0: the Online System (fed at ~PBytes/sec; physics data cache) sends ~100 MBytes/sec to the CERN Computer Centre and its Offline Processor Farm (~20 TIPS).
Tier 1: regional centres, e.g. FermiLab (~4 TIPS) and the France, Italy, and Germany Regional Centres, connected at ~622 Mbits/sec or air freight (deprecated).
Tier 2: centres of ~1 TIPS each (e.g., Caltech ~1 TIPS), connected at ~622 Mbits/sec.
Institutes (~0.25 TIPS): physicists work on analysis “channels”; each institute will have ~10 physicists working on one or more channels, and data for these channels should be cached by the institute server.
Tier 4: physicist workstations, at ~1 MBytes/sec.
1 TIPS is approximately 25,000 SpecInt95 equivalents
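The Tier-0 input rate is easy to cross-check against the earlier "3.5 PetaBytes/year" detector figure (a quick arithmetic sketch, ignoring the accelerator duty cycle):

```python
# 100 triggers/sec x ~1 MByte/event = ~100 MBytes/sec into Tier 0.
SECONDS_PER_YEAR = 365 * 24 * 3600   # ~3.15e7 s, ignoring duty cycle

rate_mb_per_s = 100 * 1              # triggers/sec x MByte/event
pb_per_year = rate_mb_per_s * SECONDS_PER_YEAR / 1e9   # MB -> PB
print(round(pb_per_year, 1))         # ~3.2 PB/yr, close to the quoted 3.5 PB/yr
```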
Telemicroscopy (slide courtesy Mark Ellisman @ UCSD)
[Diagram: Globus-based telemicroscopy linking NCMIR (San Diego, UCSD/SDSC, via vBNS) and the UHVEM (Osaka, Japan, CRL/MPT) over TransPAC, the Tokyo XP, and STAR TAP (Chicago); 1st and 2nd demonstrations shown.]
Distributed Visualization
EnVis @ SC98
GEMSviz at iGRID 2000
(NORDUnet 2000, 28 Sep 00)
STAR TAP
NORDUnet
APAN
INET
Parallelldatorcentrum, KTH Stockholm
University of Houston
Computational Steering
Grid Application Software
Grid Application Development Software (GrADS)
Goals
• Develop methodologies and tools for building libraries and mathematical software that adapt to applications and execution environments
• Produce software usable by ASC program developers
• Develop new algorithms and provide code for them
Challenges
• Algorithmic
– Multiple data structures and their interaction
– Unfavorable data access patterns (big 2^n strides)
– High efficiency of the algorithm
• low floating-point vs. load/store ratio
– Addition/multiplication imbalance
• Version explosion
– Verification
– Maintenance
Approach
• Automatic algorithm selection – polyalgorithmic functions
– Entirely different algorithms, exploiting decomposition of operations, …
• Code generation from high-level descriptions
• Extensive application-independent compile-time analysis
• Integrated performance modeling and analysis
• Run-time application- and execution-environment-dependent composition
• Automated installation process
GrADS Library Sequence
User → Library Routine
The user makes a sequential call to a numerical library routine. The library routine has “crafted code” which invokes other components.
Slide courtesy Jack Dongarra
GrADS Library Sequence
User → Library Routine → Resource Selector
The library routine calls a grid-based routine to determine which resources are possible for use. The Resource Selector returns a “bag of processors” (coarse grid) that are available.
Slide courtesy Jack Dongarra
GrADS Library Sequence
User → Library Routine → Resource Selector → Performance Model
The library routine calls the Performance Modeler to determine the best set of processors to use for the given problem. This may be done by evaluating a formula or running a simulation, and may assign a number of processes to a processor. At this point we have a fine grid.
Slide courtesy Jack Dongarra
GrADS Library Sequence
User → Library Routine → Resource Selector → Performance Model → Contract Development
The library routine calls the Contract Development routine to commit the fine grid for this call. A performance guarantee is generated.
Slide courtesy Jack Dongarra
GrADS Library Sequence
User → Library Routine → Resource Selector → Performance Model → Contract Development → App Launcher
“mpirun –machinefile fine_grid grid_linear_solve”
Slide courtesy Jack Dongarra
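The sequence on these slides can be sketched as a single call chain. This is an illustrative sketch only: every function name and the selection heuristic below are hypothetical, not the real GrADS API.

```python
# Illustrative sketch of the GrADS library sequence (hypothetical names).
def grid_linear_solve(problem):
    coarse_grid = resource_selector()                    # "bag of processors"
    fine_grid = performance_model(problem, coarse_grid)  # best subset for this problem
    contract = contract_development(problem, fine_grid)  # performance guarantee
    return app_launcher(fine_grid, problem, contract)    # e.g. via mpirun

def resource_selector():
    # Stand-in for querying a grid information service for available machines.
    return ["node%d" % i for i in range(16)]

def performance_model(problem, coarse_grid):
    # Toy model: keep only as many nodes as the problem size can use efficiently.
    return coarse_grid[: max(1, problem["size"] // 1000)]

def contract_development(problem, fine_grid):
    # Commit the fine grid and record a (toy) predicted time as the "guarantee".
    return {"nodes": fine_grid, "predicted_time_s": problem["size"] / (1e6 * len(fine_grid))}

def app_launcher(fine_grid, problem, contract):
    # In the real system: "mpirun -machinefile fine_grid grid_linear_solve".
    return "launched on %d nodes" % len(fine_grid)
```

Calling `grid_linear_solve({"size": 4000})` walks all four stages and reports the launch, mirroring the slide sequence end to end.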
GrADS – ScaLAPACK
GrADS – ScaLAPACK
Sample Application: Cactus
Cactus – Migration
UHFFT Architecture
[Diagram: the UHFFT Library comprises an FFT code generator (unparser, scheduler, optimizer, and initializer with algorithm abstraction) that produces a library of FFT modules; initialization routines covering mixed-radix (Cooley-Tukey), prime-factor, split-radix, and Rader's algorithms; execution routines; and utilities. Key: fixed library code, generated code, code generator.]
Funded in part by the Alliance (NSF) and LACSI (DoE)
Performance Tuning Methodology
Installation: input parameters (system specifics, user options) drive the UHFFT code generator, which produces the library of FFT modules and a performance database.
Run-time: given input parameters (size, dimensions, …), initialization selects the best plan, and execution calculates one or more FFTs.
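The two-phase scheme can be sketched as follows (a toy illustration, not the actual UHFFT internals): installation times each generated codelet and stores the result in a performance database; initialization at run time picks the best recorded plan for the requested size.

```python
# Toy sketch of install-time benchmarking plus run-time plan selection.
import time

performance_db = {}  # (codelet_name, size) -> measured seconds

def install_benchmark(codelet_name, fn, size):
    """Installation phase: time one codelet on one problem size."""
    data = [0.0] * size
    t0 = time.perf_counter()
    fn(data)
    performance_db[(codelet_name, size)] = time.perf_counter() - t0

def select_plan(size, codelets):
    """Run-time initialization: choose the codelet with the best recorded time."""
    return min(codelets, key=lambda c: performance_db.get((c, size), float("inf")))
```

The design point is that the (expensive) measurements happen once per machine at installation, so run-time plan selection is just a database lookup.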
UHFFT Codelet Performance
UHFFT Codelet Performance
GrADS
GrADS team
• Ken Kennedy
• Ruth Aydt / Celso Mendez
• Francine Berman / Henry Cassanova
• Andrew Chien
• Keith Cooper
• Jack Dongarra
• Ian Foster
• Dennis Gannon
• Lennart Johnsson
• Carl Kesselman
• Dan Reed
• Richard Wolski
• Linda Torczon
• Many students
NSF funded under NGS
PDC Grid Deployment Projects
SweGrid – The First Swedish National Grid Project
• HPC2N • UPPMAX • PDC • NSC • UNICC • LUNARC
• A PC cluster with data cache at each site
• Shared backup and archival storage
Funded by the K.A. Wallenberg Foundation and the Swedish Research Council (SNIC)
NGC – Nordic Grid Consortium
Grid Collaboration Between Nordic HPC Centres
Joint Grid middleware and application development
Networked computation, storage and visualization facilities as well as scientific instruments
Production grid deployment
Cycle sharing
Common Portal for job submission
Diversity of Platforms
Founding Centres:
www.nordicgrid.net
Collective resources:
Processing 5 TFlop
Primary storage 2 TB
Disk 10 TB
Tape 100 TB
Visualization facilities
The Power of Collaboration
Harnessing our Strengths
NGC Resources

GFlop/s / GByte
Type of System   CSC           Parallab    PDC         Total
DM               523 / 90                  84 / 36     607 / 127
SMP              2,445 / 590   499 / 197   187 / 106   3,131 / 892
DSM              79 / 168      25 / 12     4 / 4       108 / 184
V/P                                        7 / 6       7 / 6
PC-Cluster                     81 / 66     273 / 82    354 / 147
Total            3,047 / 848   605 / 274   554 / 234   4,207 / 1,356

GByte Disk (global / local)
Type of System   CSC     Parallab   PDC             Total
DM and SMP       2,100   5,800      1,584 / 704     9,484 / 704
DSM              560     35         200 / 594       795 / 594
V/P                                 128 / 128       128 / 128
PC-Cluster                          --- / 3,288     --- / 3,288
Total            2,660   5,835      1,912 / 4,586   10,407 / 4,586

TByte on Tape
Type of System   CSC   Parallab   PDC   Total
Archive          10    32         5     47
Backup           18               20    38
Total            28    32         25    85
Security – Kerberos
• KTH-krb
– Free implementation of Kerberos 4
– http://www.pdc.kth.se/kth-krb
• Heimdal
– Free implementation of Kerberos 5
– http://www.pdc.kth.se/heimdal
– Distributed with Debian and FreeBSD
PDC contribution
• Added GSI (Globus Security Infrastructure) authentication to CASTOR
– Public-key encryption
• X.509 certificates
– SSL as transport protocol
– Adheres to the GSS-API standard
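At its core, GSI-style authentication is X.509 certificates exchanged over SSL. A minimal sketch of that idea with Python's standard `ssl` module (illustrative only; the commented-out file paths are placeholders, and real GSI adds proxy-certificate delegation on top of plain SSL):

```python
import ssl

# Build a client context that verifies the peer's X.509 certificate,
# as a Grid CA-based deployment would require.
context = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
context.verify_mode = ssl.CERT_REQUIRED  # insist on the peer's certificate
# In a real deployment one would also load credentials (placeholder paths):
# context.load_verify_locations(cafile="ca-cert.pem")
# context.load_cert_chain(certfile="user-cert.pem", keyfile="user-key.pem")
```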
Problem Solving Environments
ECCE – Pacific Northwest National Laboratory
• Examples:
– Problem-solving environment for computational chemistry
– Application web portals
• Issues:
– Remote job submission, monitoring, and control
– Resource discovery
– Distributed data archive
– Security
– Accounting
• Course by PNNL at PDC, Fall 1999
Meet the EGSC
Provide support, training, and outreach programs aimed at ensuring the success of European deployment and use of Grids. Operate a dedicated, professional operations capability for essential infrastructure elements and applications.

Why the EGSC?
International cooperation is critical for the success of transnational Grids. Successful Grid operations require the cooperation of multiple organizations for
– Problem resolution
– System support
– User support
– Software validation
as well as sharing of scarce human expertise, and education, training, and outreach.

EGSC Strengths and Services
Help desk
– Communication methods: web, e-mail, phone
– Middleware expertise
– Application expertise
– Regional presence
Integrated bug tracking; monitoring and statistics; certification of software/sites; interaction with key middleware R&D projects, such as Globus, NMI, …

To determine the most effective way to reach the Grid community, a survey was taken to identify a range of technologies and support important to users. There were two parts to the UK Grid Support Centre survey:
A) Importance of various e-technologies (13 of them + 1 for others)
B) Importance of existing support activities in the GSC (11 of them + 1 for others)
Respondents were asked to score each for:
A) Use within the project (Use)
B) Need for support from the Grid Support Centre (Need)
In the responses there was a high correlation between Use and Need.

The technologies rated were the following:
01 GT-2
02 Web Services
03 GT-3
04 OGSA-DAI
05 Condor
06 J2EE
07 .Net
08 Windows
09 Linux
10 Other OS
11 RMS
12 Data Repository
13 HPC
14 Others

The support activities rated were the following:
01 Starter kits
02 Evaluation Reports
03 Web Site
04 Help desk
05 Technical support
06 Reference systems
07 Training
08 Certificate Authority
09 Software development
10 Technical Liaisons
11 Engineering Task Force (GT-2 grid)
12 Others
What the community wants
The importance users of the UK Grid Support Centre attach to Grid technologies:
[Chart: share of responses (0–25%): GT-3, OGSA-DAI, Web Services, Data Repository, Linux, GT-2, Others, HPC, Condor, J2EE, RMS, Windows, Other OS, .Net.]
The primary contacts for the EGSC are at PDC (Sweden), CERN (Switzerland), and CLRC for the e-Science Programme (UK).
[Map: UK e-Science centres: Edinburgh, Glasgow, Belfast, Newcastle, Manchester, DL, Cambridge, Oxford, Cardiff, RAL, Hinxton, London, Southampton.]
The importance users of the UK Grid Support Centre attach to Grid support tasks within its remit:
[Chart: share of responses (0–25%): Certificate Authority, Training, Technical support, Help desk, Web Site, Starter kits, ETF, Reference systems, Technical Liaisons, Evaluation Reports, Software development, Others.]
www.grid-support.org
IA-64 Linux Clusters
PDC, 90 nodes
TLC2, 60 nodes
Astrophysics
[Chart: relative performance of an astrophysics code: HP rx2600 1 GHz IA-64, ecc 6.0 (2 procs): 364; AMD MP1800+: 100; Fujitsu VX: 314.]
• Astrophysics code; 3.6× speedup
• Code written for a vector supercomputer
• Itanium2 outperforms the vector machine
Bertil Dorch, Stockholm Observatory
Materials Science
• Two times as fast as the present fastest Swedish system
Shiwu Gao: Theoretical Materials and Surface Physics, Chalmers
[Chart: relative performance: IBM SP pwr2 160 MHz (W): 23; HP 900 MHz Itanium2: 100; IBM SP pwr3 375 MHz (K): 47; AMD Athlon XP1600+: 38; Intel Xeon 2.2 GHz: 56.]
Computational Electromagnetics
• Simulation of lightning striking an airliner
– Sustained performance in the 1 GFlop/s range
– With 0.5 TByte of memory: order-of-magnitude larger models than the previous record of 1 billion cells
Parallel and Scientific Computing Institute, KTH
Benchmarks – Stream
[Chart: relative performance (0–1) on the STREAM kernels copy, scale, add, triad: AMD Athlon XP1800+, Pentium4 @ 2400 MHz, NSC P4 Xeon @ 2200 (icc 7.0), IBM SP pwr4 1.1 GHz, Itanium2 @ 900 MHz (zx1, Linux), Itanium2 @ 1000 MHz (E8870).]
Benchmarks – Stream
[Chart: copy, scale, add, triad, normalized to Itanium2 900 MHz (zx1, Linux): Athlon XP1800+, Pentium 4 Xeon @ 2200 MHz, Power4 @ 1100 MHz, Itanium2 @ 900 MHz.]
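The four STREAM kernels measured above are simple memory-bandwidth loops; a plain-Python sketch for illustration (real STREAM is C/Fortran over arrays far larger than cache, and `q` is the standard scalar constant):

```python
# The four STREAM kernels: copy, scale, add, triad.
def stream_kernels(a, b, c, q=3.0):
    n = len(a)
    for j in range(n):      # copy:  c = a
        c[j] = a[j]
    for j in range(n):      # scale: b = q*c
        b[j] = q * c[j]
    for j in range(n):      # add:   c = a + b
        c[j] = a[j] + b[j]
    for j in range(n):      # triad: a = b + q*c
        a[j] = b[j] + q * c[j]
    return a, b, c
```

Because each kernel does at most one multiply-add per pair of memory operations, STREAM scores track sustainable memory bandwidth rather than peak FLOPs, which is why the memory systems of the compared machines dominate this chart.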
Benchmarks – NPB
[Chart: relative performance (0–1.4) on the NAS Parallel Benchmarks BT, SP, LU, FT, MG, IS, CG, EP: AMD Athlon XP1800+, AMD Athlon MP 2000+, Pentium4 @ 2400 MHz, NSC P4 Xeon @ 2200 (icc 7.0), IBM SP pwr4 1.1 GHz, Itanium2 @ 900 MHz (zx1, Linux), Itanium2 @ 1000 MHz (E8870), MIPS R14000 @ 600 MHz.]
Benchmarks – NPB
[Chart: NPB benchmarks subset BT, SP, LU, FT, MG, IS, CG, EP, normalized to Itanium2 900 MHz (zx1, Linux): Athlon XP1800+, Athlon MP2000+, Pentium 4 Xeon @ 2200 MHz, Power4 @ 1100 MHz, Itanium2 @ 900 MHz.]
Benchmarks – CEM
[Chart: relative performance: AMD Athlon XP1600+: 0.375; AMD Athlon MP2000+: 0.3971; IBM SP pwr2 160 MHz (W): 0.2279; IBM SP pwr3 375 MHz (K): 0.4632; Itanium2 @ 900 MHz (zx1, Linux): 1.]
Benchmarks – CEM
[Chart: relative performance (0–1): AMD Athlon XP1600+, AMD Athlon MP2000+, NSC P4 Xeon @ 2200 (icc 7.0), Pentium4 @ 2400 MHz, IBM SP pwr4 1.1 GHz, Itanium2 @ 900 MHz (zx1, Linux), Itanium2 @ 1000 MHz (E8870), MIPS R14000 @ 600 MHz.]
Benchmarks – CEM
[Chart: benchmarks for the Yee subset, normalized to Itanium2 900 MHz (zx1, Linux): Athlon MP2000+, Pentium4 Xeon @ 2200 MHz, Power4 @ 1100 MHz, Itanium2 @ 900 MHz.]
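The "Yee subset" refers to kernels based on Yee's FDTD scheme, the leapfrog update at the heart of this kind of computational-electromagnetics code. A minimal 1-D sketch of the update (illustrative only; the benchmarked code is a full 3-D solver):

```python
# One 1-D Yee/FDTD leapfrog step on staggered E and H grids.
def yee_step(ez, hy, c=0.5):
    """ez has len(hy)+1 points; c is the Courant number (<= 1 in 1-D)."""
    for i in range(len(hy)):          # H update from the curl of E
        hy[i] += c * (ez[i + 1] - ez[i])
    for i in range(1, len(ez) - 1):   # E update from the curl of H
        ez[i] += c * (hy[i] - hy[i - 1])
    return ez, hy
```

Each cell update touches only nearest neighbours with a handful of adds and one multiply per field component, so, much like STREAM's triad, sustained performance on Yee kernels is governed largely by memory bandwidth.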
The End