Dynamic Grid Simulations for Science and Engineering
Ed Seidel, Max-Planck-Institut für Gravitationsphysik (Albert Einstein Institute), NCSA, U of Illinois
[email protected]




Page 1

Dynamic Grid Simulations for Science and Engineering

Ed Seidel
Max-Planck-Institut für Gravitationsphysik (Albert Einstein Institute)
NCSA, U of Illinois
[email protected]

Page 2

Einstein's Equations and Gravitational Waves
Two major motivations for numerical relativity

Exploring Einstein's General Relativity
 Want to develop a theoretical laboratory to probe this fundamental theory
 Fundamental theory of physics (gravity)
 Among the most complex equations of physics
 – Dozens of coupled, nonlinear hyperbolic-elliptic equations with 1000's of terms
 – Barely have the capability to solve them after a century
 Predict black holes, gravitational waves, etc., but we want much more

Exciting new field about to be born: Gravitational Wave Astronomy
 LIGO, VIRGO, GEO, LISA, … ~ $1 billion worldwide!
 Fundamentally new information about the Universe
 A last major test of Einstein's theory: do they exist?

– Eddington: “Gravitational waves propagate at the speed of thought”

One century later, both of these developments are happening at the same time: a very exciting coincidence!

Page 3

Gravitational Wave Astronomy: New Field, Fundamental New Information about the Universe

[Diagram: Gravitational Wave Astronomy and Numerical Relativity driving each other through multi-teraflop computation, AMR, and elliptic-hyperbolic solvers.]

Page 4

Computational Needs for 3D Numerical Relativity:
Can't fulfill them now, but about to change...

[Figure: evolution snapshots at t=0 and t=100]

Multi-TFlop, TByte machine essential: coming!

Explicit finite difference codes
 ~ 10^4 Flops/zone/time step
 ~ 100 3D arrays
Require 1000^3 zones or more: ~1000 GBytes
Double resolution: 8x memory, 16x Flops
Parallel AMR, I/O essential
A code that can do this could be useful to other projects (we said this in all our grant proposals)!
Last few years devoted to making this useful across disciplines…
All tools used for these complex simulations available for other branches of science, engineering...
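As a back-of-the-envelope check of those figures, a minimal sketch (the 8 bytes/value assumes double precision; everything else is the numbers quoted above):

    #include <stdio.h>

    /* Resource estimate for a 3D finite-difference evolution:
     * ~100 3D arrays of 8-byte values on a 1000^3 grid,
     * ~10^4 flops per zone per time step. */
    int main(void)
    {
        double zones  = 1000.0 * 1000.0 * 1000.0;  /* 1000^3 zones      */
        double arrays = 100.0;                     /* 3D grid functions */
        double bytes  = zones * arrays * 8.0;      /* double precision  */
        double flops  = zones * 1e4;               /* per time step     */

        printf("memory : %.0f GB\n", bytes / 1e9);            /* ~800 GB */
        printf("flops  : %.1f TFlop per step\n", flops / 1e12); /* ~10   */

        /* Doubling resolution: 2x zones in each of 3 dimensions = 8x
         * memory; the time step also halves (CFL), so 8x * 2x = 16x flops. */
        printf("2x res : 8x memory, 16x flops\n");
        return 0;
    }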

• Initial data: 4 coupled nonlinear elliptics
• Evolution
 • Hyperbolic evolution
 • Coupled with elliptic eqs.
• Choose gauge
• Interpret physics
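For orientation, this is the standard 3+1 (ADM-type) structure of Einstein's equations: four elliptic constraints solved for the initial data, then hyperbolic evolution equations. Schematically (a textbook form, not specific to any one code; the second evolution equation and gauge terms are omitted):

    \[
    {}^{(3)}R + K^2 - K_{ij}K^{ij} = 16\pi\rho , \qquad
    \nabla_j \left( K^{ij} - \gamma^{ij} K \right) = 8\pi S^i ,
    \]
    \[
    \partial_t \gamma_{ij} = -2\alpha K_{ij} + \nabla_i \beta_j + \nabla_j \beta_i .
    \]

The first line gives the Hamiltonian and momentum constraints (the 4 coupled elliptics); the second is the hyperbolic evolution of the 3-metric, with lapse α and shift β as the gauge choices.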

Page 5

Any Such Computation Requires Incredible Mix of Varied Technologies and Expertise!

Many scientific/engineering components
 Physics, astrophysics, CFD, engineering, ...

Many numerical algorithm components
 Finite difference methods?
 Elliptic equations: multigrid, Krylov subspace, preconditioners, ...
 Mesh refinement?

Many different computational components
 Parallelism (HPF, MPI, PVM, ???)
 Architecture efficiency (MPP, DSM, vector, PC clusters, ???)
 I/O bottlenecks (gigabytes per simulation, checkpointing…)
 Visualization of all that comes out!

The scientist/engineer wants to focus on the first item, but all of them are required for results...

Such work cuts across many disciplines and areas of CS…

Page 6

Grand Challenge Simulations
Science and engineering go large scale: needs dwarf capabilities

NSF Black Hole Grand Challenge
 8 US institutions, 5 years
 Solve the problem of colliding black holes (try…)

NASA Neutron Star Grand Challenge
 5 US institutions
 Solve the problem of colliding neutron stars (try…)

EU Network Astrophysics
 10 EU institutions, 3 years
 Try to finish these problems…

Entire community becoming Grid-enabled

Examples of the future of science & engineering
 Require large-scale simulations, beyond the reach of any single machine
 Require large geo-distributed cross-disciplinary collaborations
 Require Grid technologies, but not yet using them!
 Both apps and Grids are dynamic…

Page 7

Collaboration technology needed. A scientist's view of a large-scale computational problem:

[Diagram: Initial Data feeding very efficient evolution algorithms, feeding complex analysis routines, annotated with the scientist's wishes:]
 "Big mesh sizes"
 "Easy job submission"
 "Large data output"
 "Parallel would be great"
 "(Better be Fortran)"

Scientists cannot be required to become experts in computer science.

Page 8

Collaboration technology needed. A computer scientist's view of the same problem:

[Diagram: the same pipeline, annotated instead with technology layers:]
 Next-generation high-speed communication layers
 High-performance parallel I/O
 Code instrumentation + steering
 Load scheduling
 Interactive visualization
 Metacomputing
 "Programmers, use this!"

Computer scientists will not write the applications that make use of their technology.

Page 9

Cactus
A new concept in community-developed simulation code infrastructure

Developed in response to the needs of large-scale projects
 Numerical/computational infrastructure to solve PDEs
 Freely available, open-source community framework: the spirit of GNU/Linux

Many communities contributing to Cactus
 Cactus divided into "Flesh" (core) and "Thorns" (modules, or collections of subroutines)
 Multilingual: user apps can be Fortran, C, C++; automated interface between them

Abstraction: the Cactus Flesh provides an API for virtually all CS-type operations
 Storage, parallelization, communication between processors, etc.
 Interpolation, reduction
 IO (traditional, socket-based, remote viz and steering…)
 Checkpointing, coordinates

"Grid Computing": Cactus team and many collaborators worldwide, especially NCSA, Argonne/Chicago, LBL. Revolution coming...
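To make "user apps plug into the Flesh" concrete, here is a minimal sketch of what a thorn routine looks like in C, using the standard CCTK macros (simplified; the thorn name and the grid function phi are illustrative, and a real thorn also needs its CCL declaration files):

    #include "cctk.h"
    #include "cctk_Arguments.h"
    #include "cctk_Parameters.h"

    /* A thorn routine: the Flesh hands in the grid via CCTK_ARGUMENTS;
     * the driver thorn has already decomposed the global grid across
     * processors and filled the ghost zones. */
    void MyThorn_Update(CCTK_ARGUMENTS)
    {
      DECLARE_CCTK_ARGUMENTS;    /* grid functions, cctk_lsh, ... */
      DECLARE_CCTK_PARAMETERS;   /* parameters from param.ccl     */
      int i, j, k;

      /* Loop over processor-local interior points only. */
      for (k = 1; k < cctk_lsh[2]-1; k++)
        for (j = 1; j < cctk_lsh[1]-1; j++)
          for (i = 1; i < cctk_lsh[0]-1; i++)
          {
            int idx = CCTK_GFINDEX3D(cctkGH, i, j, k);
            /* ... update phi[idx] from neighbouring points ... */
            (void)idx;
          }
    }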

Page 10

Modularity of Cactus...

[Diagram: applications (Application 1, Application 2, sub-apps, legacy apps, symbolic manipulation apps) plug into the Cactus Flesh; beneath it sit interchangeable layers such as AMR (GrACE, etc.), MPI layer, I/O layer, unstructured meshes, remote steering, MDS/remote spawn, all resting on Globus metacomputing services. The user selects the desired functionality and the code is created, via abstractions.]
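In practice the "user selects desired functionality" step is a compile-time thorn list; a hypothetical one might look like this (the first three thorn names are real Cactus thorns, to the best of my recollection; the last is illustrative):

    # ThornList sketch: pick the modules, the Flesh binds them together
    CactusBase/CartGrid3D     # coordinates / grid
    CactusPUGH/PUGH           # driver: MPI parallelism
    CactusPUGHIO/IOHDF5       # I/O layer
    MyProject/MyPhysics       # application thorn (hypothetical)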

Page 11

Cactus Driver API

Cactus provides standard interfaces for parallelization, interpolation, reduction, I/O, etc. (e.g. CCTK_MyProc, CCTK_Reduce, ...)

A reduction operation across processors, CCTK_Reduce(...), can be served by any of several underlying layers:
 MPI/Globus (thorn PUGH)
 PVM
 OpenMP
 Nothing (single processor)...
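The value of the abstraction is that application code calls one interface and whichever layer is active supplies the implementation. A framework-free sketch of that dispatch idea in C (names hypothetical, not the actual Cactus internals):

    #include <stdio.h>

    typedef double (*reduce_sum_fn)(const double *data, int n);

    /* Backend 1: single-processor fallback ("Nothing" above). */
    static double serial_sum(const double *data, int n)
    {
        double s = 0.0;
        for (int i = 0; i < n; i++) s += data[i];
        return s;
    }

    /* Backend 2: an MPI driver would wrap MPI_Allreduce here;
     * omitted so this sketch compiles without MPI. */

    static reduce_sum_fn active_reduce = serial_sum;  /* set at startup */

    /* The one call the application sees, regardless of backend. */
    double Reduce_Sum(const double *data, int n)
    {
        return active_reduce(data, n);
    }

    int main(void)
    {
        double v[4] = {1.0, 2.0, 3.0, 4.0};
        printf("sum = %g\n", Reduce_Sum(v, 4));  /* prints: sum = 10 */
        return 0;
    }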

Page 12

Cactus community (partial map of users and collaborators):
 DLR; Geophysics (Bosl); Numerical Relativity community; Cornell; crack propagation; NASA NS GC; Livermore; SDSS (Szalay); Intel; Microsoft; Clemson; "Egrid" (NCSA, ANL, SDSC); AEI Cactus Group (Allen); NSF KDI (Suen, Foster, ES); EU Network (Seidel); Astrophysics (Zeus); US Grid Forum; DFN Gigabit (Seidel); "GrADS" (Kennedy, Foster, Dongarra, et al.); ChemEng (Bishop); San Diego, GMD, Cornell; Berkeley

Page 13

Future view: much of it here already...

Scale of computations much larger
 Complexity approaching that of Nature
 Simulations of the Universe and its constituents
 – Black holes, neutron stars, supernovae
 – Airflow around advanced planes, spacecraft
 – Human genome, human behavior

Teams of computational scientists working together
 Must support efficient, high-level problem description
 Must support collaborative computational science
 Must support all different languages

Ubiquitous Grid computing
 Very dynamic simulations, deciding their own future
 Apps find the resources themselves: distributed, spawned, etc...
 Must be tolerant of dynamic infrastructure (variable networks, processor availability, etc…)
 Monitored, viz'ed, controlled from anywhere, with colleagues anywhere else...

Page 14

Our Team Requires Grid Technologies, Big Machines for Big Runs

[Map: team sites at WashU, NCSA, Hong Kong, AEI, ZIB, Thessaloniki, Paris]

How do we:
• Maintain/develop the code?
• Manage computer resources?
• Carry out/monitor the simulation?

Page 15

Grid Simulations: a new paradigm

Computational resources scattered across the world
 Compute servers, handhelds, file servers, networks, Playstations, cell phones, etc…

How to take advantage of this for scientific/engineering simulations?
 Harness multiple sites and devices
 Simulations at a new level of complexity and scale


Page 16

Many Components for Grid Computing: all have to work for real applications

1. Resources: Egrid (www.egrid.org)
 A "Virtual Organization" in Europe for Grid computing
 Over a dozen sites across Europe
 Many different machines

2. Infrastructure: Globus Metacomputing Toolkit
 Develops fundamental technologies needed to build computational grids
 Security: logins, data transfer
 Communication
 Information (GRIS, GIIS)


Page 17

Components for Grid Computing, cont.

3. Grid-aware applications (Cactus example): grid-enabled modular toolkits for parallel computation, provided to the scientist/engineer…
 Plug your science/engineering applications in!
 Must provide many Grid services
 – Ease of use: automatically find resources, given some need!
 – Distributed simulations: use as many machines as needed!
 – Remote viz and steering, tracking: watch what happens!

Collaborations of groups with different expertise: no single group can do it! The Grid is natural for this…

Page 18

Cactus & the Grid

Cactus application thorns
 Distribution information hidden from the programmer
 Initial data, evolution, analysis, etc.

Grid-aware application thorns
 Drivers for parallelism, IO, communication, data mapping
 PUGH: parallelism via MPI (MPICH-G2, a grid-enabled message-passing library)

Grid-enabled communication library
 MPICH-G2 implementation of MPI can run MPI programs across heterogeneous computing resources; the same thorn also runs on standard MPI or on a single processor
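The key point is that the application is written against standard MPI, so relinking against a grid-enabled implementation such as MPICH-G2 lets the identical source span machines. A minimal standard-MPI example of the kind of code this enables (nothing Cactus-specific):

    #include <stdio.h>
    #include <mpi.h>

    /* Under a conventional MPI this runs on one machine; linked
     * against a grid-enabled MPI such as MPICH-G2, the very same
     * source can run with ranks on different machines. */
    int main(int argc, char **argv)
    {
        int rank, size, len;
        char host[MPI_MAX_PROCESSOR_NAME];

        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        MPI_Get_processor_name(host, &len);

        printf("rank %d of %d on %s\n", rank, size, host);

        MPI_Finalize();
        return 0;
    }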

Page 19

Cactus Computational Toolkit: science, Autopilot, AMR, PETSc, HDF, MPI, GrACE, Globus, remote steering...

A Portal to Computational Science: the Cactus Collaboratory
 1. User has a science idea...
 2. Composes/builds code components with the interface...
 3. Selects appropriate resources...
 4. Steers the simulation, monitors performance...
 5. Collaborators log in to monitor...

Want to integrate and migrate this technology to the generic user…

Page 20

Grid applications so far... SC93 - SC2000

Typical scenario
 Find remote resource (often using multiple computers)
 Launch job (usually static, tightly coupled)
 Visualize results (usually in-line, fixed)

Need to go far beyond this
 Make it much, much easier
 – Portals, Globus, standards
 Make it much more dynamic, adaptive, fault-tolerant
 Migrate this technology to the general user

Metacomputing the Einstein equations: connecting T3Es in Berlin, Garching, San Diego

Page 21

Supercomputing super difficult
Consider the simplest case: sit here, compute there

Accounts for one AEI user (real case): berte.zib.de, denali.mcs.anl.gov, golden.sdsc.edu, gseaborg.nersc.gov, harpo.wustl.edu, horizon.npaci.edu, loslobos.alliance.unm.edu, mcurie.nersc.gov, modi4.ncsa.uiuc.edu, ntsc1.ncsa.uiuc.edu, origin.aei-potsdam.mpg.de, pc.rzg.mpg.de, pitcairn.mcs.anl.gov, quad.mcs.anl.gov, rr.alliance.unm.edu, sr8000.lrz-muenchen.de

16 machines, 6 different usernames, 16 passwords, ...

Page 22

Cactus Portal (Michael Russell, et al.)
 KDI ASC project
 Technology: Globus, GSI, Java, DHTML, Java CoG, MyProxy, GPDK, Tomcat, Stronghold
 Allows submission of distributed runs
 Used for the ASC Grid Testbed (SDSC, NCSA, Argonne, ZIB, LRZ, AEI)
 Driven by the need for easy access to machines

Page 23

Distributed Computation: Harnessing Multiple Computers

Why would anyone want to do this?
 Capacity
 Throughput

Issues
 Bandwidth
 Latency
 Communication needs
 Topology
 Communication/computation ratio

Techniques to be developed (see the sketch below)
 Overlapping comm/comp
 Extra ghost zones
 Compression
 Algorithms to do this for the scientist…
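A minimal illustration of the ghost-zone trick, the simplest of these techniques (logic only; a real code would do an MPI halo exchange where the comment sits):

    #include <stdio.h>

    /* With ghost width G, a width-1 stencil can advance G steps
     * before the ghost data goes stale, so the wide-area exchange
     * runs on every G-th step instead of every step: extra memory
     * and a little redundant computation traded for Gx fewer
     * messages. */
    int main(void)
    {
        const int nsteps = 100, G = 10;
        int exchanges = 0;

        for (int step = 0; step < nsteps; step++) {
            if (step % G == 0) {
                exchanges++;   /* wide-area halo exchange (+ compression) */
            }
            /* interior update: the valid region shrinks by one zone
             * per step since the last exchange, so each side computes
             * a few extra zones it will later throw away */
        }
        printf("%d steps: %d exchanges instead of %d\n",
               nsteps, exchanges, nsteps);
        return 0;
    }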

Experiments
 3 T3Es on 2 continents
 Last week: joint NCSA, SDSC test with 1500 processors

Page 24

Distributed Terascale Test

[Diagram: SDSC IBM SP, 1024 procs, decomposed as 5x12x17 = 1020, and the NCSA Origin array, 256+128+128 procs, decomposed as 5x12x(4+2+2) = 480; Gig-E at 100 MB/sec within each site, but the OC-12 line between them sustains only 2.5 MB/sec.]

Solved Einstein's equations for gravitational waves (real code)
 Tightly coupled: communication required through derivatives
 Must communicate 30 MB/step between machines
 Time step takes 1.6 sec
 Used 10 ghost zones along the direction between the machines: communicate every 10 steps
 Compression/decompression on all data passed in this direction
 Achieved 70-80% scaling, 200 GFlops (only 20% scaling without the tricks)
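Plugging the quoted figures into a crude model shows why the tricks matter (a sketch; it ignores overlap of communication with computation, which is why it comes out below the slide's own 20% "without tricks" figure):

    #include <stdio.h>

    /* 30 MB of inter-site traffic per step, an effective 2.5 MB/sec
     * link, and a 1.6 sec compute step. */
    int main(void)
    {
        double mb_per_step = 30.0, link_mbs = 2.5, step_sec = 1.6;

        double naive = mb_per_step / link_mbs;   /* 12 sec of comm/step */
        printf("naive: %.0f s comm per %.1f s step -> ~%.0f%% efficiency\n",
               naive, step_sec, 100.0 * step_sec / (step_sec + naive));

        /* 10-wide ghost zones amortize the exchange over 10 steps,
         * and compression shrinks the payload further. */
        double amortized = naive / 10.0;         /* 1.2 sec per step */
        printf("ghosts: ~%.1f s comm per step, before compression\n",
               amortized);
        return 0;
    }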

Page 25

Remote Visualization: must be able to watch any simulation live…

Isosurfaces and geodesics
 Computed inline with the simulation
 Only geometry sent across the network
 Raster images to a web browser
 Works NOW!!

Arbitrary grid functions
 Streaming HDF5
 Clients: Amira, LCA Vision, OpenDX
 Any app plugged into Cactus

Page 26

Remote Visualization - Issues

Parallel streaming
 Cactus can do this, but readers are not yet available on the client side

Handling of port numbers
 Clients currently have no method for finding the port number that Cactus is using for streaming
 Development of an external meta-data server needed (ASC/TIKSL)

Generic protocols: need to develop them, for Cactus and the Grid

Data server
 Cactus should pass data to a separate server that will handle multiple clients without interfering with the simulation
 TIKSL provides middleware (streaming HDF5) to implement this
 Output parameters for each client

Page 27

Remote Steering

[Diagram: remote viz data streams from the simulation to Amira or any viz client over XML/HTTP and HDF5; steering commands flow back the same way.]

Changing any steerable parameter
• Parameters
• Physics, algorithms
• Performance

Page 28

Remote Steering

Stream parameters from a Cactus simulation to a remote client, which changes parameters (GUI, command line, viz tool) and streams them back to Cactus, where they change the state of the simulation.

Cactus has a special STEERABLE tag for parameters, indicating that it makes sense to change them during a simulation and that support exists for changing them.

Examples: IO parameters, frequency, fields, timestep, debugging flags

Current protocols:
 XML (HDF5) to a standalone GUI
 HDF5 to viz tools (Amira)
 HTTP to a web browser (HTML forms)
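A sketch of how such a parameter is declared in a thorn's param.ccl (syntax written from memory and the parameter itself is illustrative; details may differ):

    # param.ccl sketch: a parameter that may be changed while running
    INT out_every "How often (in iterations) to write output" STEERABLE = ALWAYS
    {
      1:* :: "Any positive number of iterations"
    } 10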

Page 29

Thorn HTTPD
 Thorn which allows any simulation to act as its own web server
 Connect to the simulation from any browser, anywhere
 Monitor the run: parameters, basic visualization, ...
 Change steerable parameters
 See a running example at www.CactusCode.org
 Wireless remote viz, monitoring and steering
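Turning it on is, plausibly, just a parameter-file entry like the following (thorn and parameter names approximate, from memory):

    # Cactus parameter-file sketch (names approximate)
    ActiveThorns = "httpd ..."     # web-server thorn + application thorns
    httpd::port  = 5555            # then browse to http://<host>:5555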

Page 30

Remote Offline Visualization

[Diagram: a viz client (Amira) in Berlin reads through an HDF5 VFD layer that talks, via DataGrid (Globus) and DPSS/FTP/HTTP, to a DPSS server, an FTP server and a web server fronting the remote data: 4 TB distributed across NCSA/ANL/Garching, of which only what is needed is fetched.]

Accessing remote data for local visualization
 Should allow downsampling, hyperslabbing, etc.
 Grid world: file pieces left all over the world, but logically one file…
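Hyperslabbing and downsampling are standard HDF5 operations; a minimal C example reading every second point of a region (file and dataset names are illustrative, assuming a dataset of at least 128^3; the 2-argument H5Dopen is the 1.6-era API of the time):

    #include <stdio.h>
    #include "hdf5.h"

    int main(void)
    {
        /* every 2nd point of a 128^3 region -> 64^3 values (8x smaller) */
        hsize_t start[3]  = {0, 0, 0};
        hsize_t stride[3] = {2, 2, 2};
        hsize_t count[3]  = {64, 64, 64};
        static double buf[64][64][64];

        hid_t file   = H5Fopen("gw.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
        hid_t dset   = H5Dopen(file, "phi");
        hid_t fspace = H5Dget_space(dset);

        /* select the hyperslab in the file, a matching space in memory */
        H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, stride, count, NULL);
        hid_t mspace = H5Screate_simple(3, count, NULL);

        H5Dread(dset, H5T_NATIVE_DOUBLE, mspace, fspace, H5P_DEFAULT, buf);
        printf("read %d downsampled points\n", 64*64*64);

        H5Sclose(mspace); H5Sclose(fspace); H5Dclose(dset); H5Fclose(file);
        return 0;
    }

With a Virtual File Driver (as in the setup above), the same selection calls read over the network instead of from a local file.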

Page 31

Dynamic Distributed Computing
The static grid model works only in special cases; apps must be able to respond to a changing Grid environment...

Many new ideas
 Consider: the Grid IS your computer
 – Networks, machines, devices come and go
 – Dynamic codes, aware of their environment, seek out resources
 – Rethink algorithms of all types
 – Distributed and Grid-based thread parallelism
 Scientists and engineers will change the way they think about their problems: think global, solve much bigger problems

Many old ideas
 The 1960s all over again: how to deal with dynamic processes, processor management, memory hierarchies, etc.

Page 32

New Paradigms for Dynamic Grids
A lot of work to be done to make this happen

The code should be aware of its environment
 What resources are out there NOW, and what is their current state?
 What is my allocation?
 What is the bandwidth/latency between sites?

The code should be able to make decisions on its own
 A slow part of my simulation can run asynchronously…spawn it off!
 New, more powerful resources just became available…migrate there!
 Machine went down…reconfigure and recover!
 Need more memory…get it by adding more machines!

The code should be able to publish this information to a central server for tracking, monitoring, steering…
 Unexpected event…notify users!
 Collaborators from around the world all connect, examine the simulation.

Page 33

Grid Scenario

[Diagram: sites around the world (WashU, Potsdam, Thessaloniki, RZG, NCSA, Hong Kong, LANL) linked at up to 1 Tbit/sec, conversing:]
 "We see something, but it is too weak. Please simulate to enhance the signal!"
 "OK! The resource estimator says we need 5 TB, 2 TF. Where can I do this?"
 Resource broker: "LANL is the best match…"
 "OK, but I need 10 Gbit/sec…"
 Resource broker: "Then NCSA + Garching."

Page 34

New Grid Applications

Dynamic staging: move to a faster/cheaper/bigger machine
 "Cactus Worm"
Multiple universe
 Create a clone to investigate a steered parameter ("Cactus Virus")
Automatic convergence testing
 From initial data, or initiated during the simulation
Look ahead
 Spawn off and run a coarser resolution to predict the likely future
Spawn independent/asynchronous tasks
 Send to a cheaper machine; the main simulation carries on
Thorn profiling
 Best machine/queue; choose resolution parameters based on the queue
….

Page 35

New Grid Applications (2)

Dynamic load balancing
 Inhomogeneous loads
 Multiple grids
Portal
 Resource choosing, simulation launching, management
Intelligent parameter surveys
 Farm out to different machines
Make use of
 Running with management tools such as Condor, Entropia, etc.
 Scripting thorns (management, launching new jobs, etc.)
 Dynamic use of e.g. MDS for finding available resources

Page 36

Dynamic Grid Computing

[Diagram: one run hopping between SDSC, NCSA, RZG and LRZ:]
 Go! Find the best resources
 Free CPUs!! Add more resources
 Queue time over: find a new machine
 Clone the job with a steered parameter
 Look for a horizon; found a horizon, try out excision
 Calculate/output gravitational waves
 Calculate/output invariants
 Archive data

Page 37

User’s View ... simple!

Page 38

Cactus Worm: illustration of the basic scenario

 Cactus simulation (could be anything) starts, launched from a portal
 Queries a Grid Information Server, finds available resources
 Migrates itself to the next site, according to some criterion
 Registers its new location with the GIS, terminates the old simulation
 User tracks/steers, using http, streaming data, etc...
 Continues around Europe…

If we can do this, much of what we want can be done!
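The same scenario as a control loop, in a minimal runnable sketch (every helper here is a hypothetical stand-in: the query would go to a GIIS, the transfer would be e.g. gsi-ftp, and the restart a job submission at the new site):

    #include <stdio.h>

    typedef struct { const char *name; double speed; } site_t;

    static site_t query_information_server(void) {
        site_t s = {"next.site.example", 2.0};  /* pretend: 2x faster */
        return s;
    }
    static void run_some_steps(void)            { puts("evolving...");         }
    static void write_checkpoint(const char *f) { printf("checkpoint %s\n", f); }
    static void transfer_and_restart(site_t s, const char *f) {
        printf("moving %s to %s, restarting there\n", f, s.name);
    }

    int main(void)
    {
        double current_speed = 1.0;

        for (int cycle = 0; cycle < 3; cycle++) {
            run_some_steps();
            site_t next = query_information_server();
            if (next.speed <= current_speed)
                continue;                        /* no better resource   */
            write_checkpoint("worm.chkpt");      /* full state to disk   */
            transfer_and_restart(next, "worm.chkpt");
            /* register the new location with the GIS, then terminate here */
            return 0;
        }
        return 0;
    }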

Page 39

Grid Application Development Toolkit

The application developer should be able to build simulations with tools that easily enable dynamic grid capabilities

Want to build a programming API to easily allow:
 Query information server (e.g. GIIS)
 – What's available for me? What software? How many processors?
 Network monitoring
 Decision thorns
 – How to decide? Cost? Reliability? Size?
 Spawning thorns
 – Now start this up over here, and that up over there
 Authentication server
 – Issues commands, moves files on your behalf (can't pass on a Globus proxy)
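What such an API could look like as a C header (purely hypothetical signatures, drawn from the bullets above; not an existing library):

    /* gadt.h -- hypothetical shape of the proposed toolkit API */

    typedef struct gadt_resource gadt_resource_t;

    /* Query an information server (e.g. a GIIS): what is out there? */
    int gadt_find_resources(const char *requirements,
                            gadt_resource_t **out, int *count);

    /* Decision support: rank candidates by cost, reliability, size. */
    gadt_resource_t *gadt_choose(gadt_resource_t *candidates, int count,
                                 const char *policy);

    /* Spawning: start a routine or task on the chosen resource. */
    int gadt_spawn(const gadt_resource_t *where, const char *task);

    /* Ask a trusted server to act for the user (since a Globus proxy
     * cannot simply be passed on). */
    int gadt_request_action(const char *auth_server, const char *command);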

Page 40

Grid Application Development Toolkit (2)

Information server
 – What is running where? Where to connect for viz/steering? What and where are other people in the group running?
 – Spawn hierarchies
 – Distribute/load-balance

Data transfer
 – Use whatever method is desired
 – Gsi-ssh, Gsi-ftp, streamed HDF5, scp, GASS, etc…

LDAP routines for simulation codes (see the sketch below)
 – Write simulation information in LDAP format
 – Publish to an LDAP server

Stage executables
 – CVS checkout of new codes that become connected, etc…

Etc…
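A sketch of the "publish to an LDAP server" step using the classic OpenLDAP C API (the server name, DN, attribute names and schema are all illustrative):

    #include <stdio.h>
    #include <ldap.h>

    int main(void)
    {
        LDAP *ld = ldap_init("gis.example.org", LDAP_PORT);
        if (!ld || ldap_simple_bind_s(ld, NULL, NULL) != LDAP_SUCCESS) {
            fprintf(stderr, "cannot reach information server\n");
            return 1;
        }

        char *oc_vals[]   = { "cactusSimulation", NULL };  /* made-up schema */
        char *host_vals[] = { "origin.aei-potsdam.mpg.de", NULL };
        char *port_vals[] = { "5555", NULL };              /* steering port  */

        LDAPMod oc   = { LDAP_MOD_ADD, "objectClass", { oc_vals } };
        LDAPMod host = { LDAP_MOD_ADD, "simHost",     { host_vals } };
        LDAPMod port = { LDAP_MOD_ADD, "simPort",     { port_vals } };
        LDAPMod *mods[] = { &oc, &host, &port, NULL };

        /* one entry per running simulation; clients search for it */
        ldap_add_s(ld, "simName=bh_run42,o=grid", mods);
        ldap_unbind(ld);
        return 0;
    }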

If we build this, we can get developers and users!

Page 41

Example Toolkit Call: Routine Spawning

[Diagram: the schedule runs ID → EV → IO → AN, with the analysis (AN) routines spawned off to separate resources.]

    Schedule AHFinder at Analysis
    {
      EXTERNAL=yes
      LANG=C
    } "Finding Horizons"
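Here AHFinder (an apparent-horizon finder) is scheduled at the Analysis step; on this reading of the slide, EXTERNAL=yes is what marks the routine as one the toolkit may spawn to a different resource, while LANG=C names its implementation language.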

Page 42

Many groups trying to make this happen

EU Network Proposal
 AEI, Lecce, Poznan, Brno, Amsterdam, ZIB-Berlin, Paderborn, Compaq, Sun, Chicago, ISI, Wisconsin
 Developing this technology…

Page 43

Grid-Related Projects

ASC: Astrophysics Simulation Collaboratory
 NSF funded (WashU, Rutgers, Argonne, U. Chicago, NCSA)
 Collaboratory tools, Cactus Portal
 Starting to use the Portal for production runs

E-Grid: European Grid Forum (GGF: Global Grid Forum)
 Working Group for Testbeds and Applications (chair: Ed Seidel)
 Test application: Cactus+Globus
 Demos at Dallas SC2000

GrADS: Grid Application Development Software
 NSF funded (Rice, NCSA, U. Illinois, UCSD, U. Chicago, U. Indiana...)
 Application driver for grid software

Page 44

Grid-Related Projects (2)

Distributed runs
 AEI (Thomas Dramlitsch), Argonne, U. Chicago
 Working towards running on several computers, 1000s of processors (different processors, memories, OSs, resource management, varied networks, bandwidths and latencies)

TIKSL/GriKSL
 German DFN funded: AEI, ZIB, Garching
 Remote online and offline visualization, remote steering/monitoring

Cactus Team
 Dynamic distributed computing…
 Grid Application Development Toolkit

Page 45

Summary

Science/engineering drive and demand Grid development
 Problems very large, but practical, and fundamentally connected to industry

Grids will fundamentally change research
 Enable problem scales far beyond present capabilities
 Enable larger communities to work together (they'll need to)
 Change the way researchers/engineers think about their work

The dynamic nature of the Grid makes the problem much more interesting
 Harder
 Matches the dynamic nature of the problems being studied

More info
 www.CactusCode.org
 www.gridforum.org
 www.ascportal.org
 www.zib.de/Visual/projects/TIKSL/