

Page 1: Computational Chemistry and the March to Petascale computing

© 2008 Pittsburgh Supercomputing Center

Computational Chemistry and the March to

Petascale Computing.

Shawn T. Brown
Senior Scientific Specialist

Pittsburgh Supercomputing Center

Q-Chem Workshop 2009

Page 2: Computational Chemistry and the March to Petascale computing


Overview

Pittsburgh Supercomputing Center

• was established in 1986 through the collaborative efforts of Carnegie Mellon University and the University of Pittsburgh together with Westinghouse Electric Company.

• receives support from several federal agencies, including NSF, NIH and DOE, the Commonwealth of Pennsylvania and private industry.

• has particular strengths in user support and optimization, file systems, networking and biomedical applications.

• has been a TeraGrid resource provider since 2002, when NSF expanded the initial Distributed Terascale Facility to integrate PSC's LeMieux, the TeraGrid's first terascale system.

• works with other TeraGrid partners to harness the full range of information technologies to enable discovery in U.S. science and engineering.

Page 3: Computational Chemistry and the March to Petascale computing


PSC Research Highlights

• Forecasting severe thunderstorms
  PSC collaboration with NOAA and the Center for Analysis and Prediction of Storms (CAPS) in Norman, Oklahoma, in spring forecast experiments has proven the feasibility of numerical methods to forecast storms, such as supercells that spawn tornadoes, with significantly more detail and advance notice than current NWS operational technology.

• Soil vibration during earthquakes
  High-resolution simulations of major earthquakes, led by Jacobo Bielak and David O'Halloran of Carnegie Mellon University, help to account for how ground motion varies with subsurface geology and provide the basis for building codes that yield the safest possible structures at reasonable cost.

• Black holes in cosmic evolution
  Simulations led by Tiziana DiMatteo of Carnegie Mellon University included black holes for the first time in large-scale simulations of cosmic evolution. This work, featured on NOVA and the National Geographic Channel, provided new understanding of how black holes regulate the growth of galaxies.

Page 4: Computational Chemistry and the March to Petascale computing


Other PSC Research Highlights

• A national leader in computational approaches in the life sciences

Through PSC’s National Resource for Biomedical Supercomputing (NRBSC), supported by NIH, PSC carries out an extensive program of biomedical training, dissemination and research, including structural biology, realistic cellular modeling and volumetric visualization.

• Addressing the challenges of clean energy
  Through partnership with the National Energy Technology Laboratory in Pittsburgh and Morgantown, West Virginia, PSC advances research on clean energy, including development of commercial-scale coal gasification to be implemented in a Florida power plant anticipated to be the cleanest, most efficient coal-fired plant in the world by 2010.

Page 5: Computational Chemistry and the March to Petascale computing


TeraGrid Systems at PSC

• BigBen
  Cray XT3 comprising 2,068 compute nodes (4,136 cores) linked by a custom-designed interconnect. Twenty-two dedicated I/O processors are also connected. Each compute node has two 2.6 GHz AMD Opteron processors sharing two gigabytes of memory and the network connection.

• Pople
  Silicon Graphics Altix 4700: 768 processors, 1.5 terabytes of shared memory, 5.0 peak teraflops. Pople became a TeraGrid resource in July 2008.

Page 6: Computational Chemistry and the March to Petascale computing


What is the TeraGrid?
A unique combination of fundamental cyberinfrastructure (CI) components.

Page 7: Computational Chemistry and the March to Petascale computing


TeraGrid Big Iron

• Abe (NCSA) / QueenBee (LONI): 9,600 2.33 GHz quad-core Intel processors, 8–16 GB RAM per node, 137 Tflops

• Ranger (TACC): 62,976 2.3 GHz quad-core Opteron cores, 32 GB RAM per node (16-way SMP), 580 Tflops

• Kraken (NICS): 66,048 2.3 GHz quad-core Opteron cores, 2 GB RAM per core, powerful SeaStar interconnect, 608 Tflops (to grow to 1 Pflop)

Page 8: Computational Chemistry and the March to Petascale computing


You want accelerators… we got accelerators.

Lincoln (NCSA)
• 192 compute nodes (Dell PowerEdge 1950 III dual-socket nodes with quad-core 2.33 GHz Intel Harpertown processors)
• 96 NVIDIA Tesla S1070 accelerator units; each Tesla unit provides 345.6 gigaflops of double-precision performance

NAMD performance on Lincoln, presented at SC08 by James Phillips (UIUC), Nov. 2008.

Track 2D Experimental Architecture
• Announcement to come in '09

Page 9: Computational Chemistry and the March to Petascale computing


TeraGrid resources include . . .

• Computing – almost 2 Pflops today and growing

– U Tennessee (NICS) system will grow to nearly 1 Pflop peak performance

– PSC to get Track2C system

– Track2D and DataNet Awards

– Centralized help desk for all resource providers

• Visualization - Remote visualization servers and software

• Data - 20+ Petabytes of Storage

– Allocation of data storage facilities

– Over 100 Scientific Data Collections

• Access – Dedicated Cross-country Network

– Shibboleth testbed to facilitate campus access

– Central allocations mechanism

• Human Support

– Advanced Support for TeraGrid Applications (ASTA)

– Education and training events and resources

– Over 20 Science Gateways

Page 10: Computational Chemistry and the March to Petascale computing


TeraGrid Resources Available for All Domain Scientists
At no cost to them!

• Integrated, persistent, pioneering resources

• Significantly improve the ability and capacity to gain new insights into the most challenging research questions and societal problems

• Peer-reviewed, proposal-based access
  – Targeted support available as well

• Dedicated staff investment to really make a difference on complex problems
  – Transformational science
  – Must have PI commitment
  – Make lessons learned available for all

Page 11: Computational Chemistry and the March to Petascale computing


GridChem
Cyber-environment for Molecular Sciences

Page 12: Computational Chemistry and the March to Petascale computing


Industry Darlings – A TG Chemistry Success

• Zeolites are extensively used in the refining of gasoline.

• High-throughput Condor pools available through the TeraGrid were used to screen over 2 million structures with DFT computations.

• Designers of industrial applications can use this to explore new thermodynamically accessible zeolites.

Michael Deem, Rice University
David Earl, Univ. of Pittsburgh

Crystal structure of zeolite MFI used in catalytic cracking of crude oil
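Condor specifics aside, the screening workflow above is embarrassingly parallel: each structure is scored independently, so candidates can fan out across as many workers as are available. A minimal sketch using Python's standard library, where `score_structure` is a placeholder (a real task would run a DFT calculation, not this toy formula):

```python
from concurrent.futures import ThreadPoolExecutor

def score_structure(structure_id):
    # Placeholder score standing in for a per-structure DFT energy
    # evaluation; in the real workflow each task would be an independent
    # job submitted to a Condor pool.
    return structure_id, -0.5 * (structure_id % 10)

def screen(structure_ids, threshold=-3.0):
    """Return IDs of candidate structures scoring at or below threshold."""
    with ThreadPoolExecutor() as pool:
        return [sid for sid, score in pool.map(score_structure, structure_ids)
                if score <= threshold]
```

Because no task depends on any other, throughput grows almost linearly with the number of workers, which is what made a 2-million-structure DFT screen feasible on shared TeraGrid cycles.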

Page 13: Computational Chemistry and the March to Petascale computing


QM Region

B3LYP/3-21G//CHARMM: QM region contained large section of active site (~260 atoms, ~1400 basis functions) in doublet state.

B3LYP/LANL2DZ//CHARMM: QM region includes the entire heme, proximal and distal histidines, distal phenylalanines, and hydrogen peroxide (~146 atoms, 850 basis functions) in both doublet and quartet states. A sphere of water covers the monomeric hemoglobin.

Performed using Q-Chem 3.1 and CHARMM.

Hybrid Density Functional Theory/MM Calculations of Classical Molecular Dynamics Snapshots of Hemoglobins

E_total = E_QM + E_MM + E_QM/MM
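The additive energy expression can be sketched in code. The three energy functions below are placeholders (assumptions), standing in for calls to a QM package such as Q-Chem and an MM package such as CHARMM, not their actual APIs:

```python
# Additive QM/MM energy sketch. Each term is a placeholder for a call
# into a real electronic-structure or force-field code.

def e_qm(qm_atoms):
    # Placeholder: ab initio energy of the QM region (e.g. B3LYP via Q-Chem).
    return -260.0 * len(qm_atoms)

def e_mm(mm_atoms):
    # Placeholder: force-field energy of the MM region (e.g. CHARMM).
    return -1.5 * len(mm_atoms)

def e_qm_mm(qm_atoms, mm_atoms):
    # Placeholder: QM/MM coupling term (electrostatic embedding,
    # van der Waals, and boundary corrections between the regions).
    return -0.01 * len(qm_atoms) * len(mm_atoms)

def total_energy(qm_atoms, mm_atoms):
    # E_total = E_QM + E_MM + E_QM/MM (additive QM/MM scheme)
    return e_qm(qm_atoms) + e_mm(mm_atoms) + e_qm_mm(qm_atoms, mm_atoms)
```

The point of the additive scheme is that the expensive QM calculation covers only the small active-site region, while the surrounding protein and solvent are handled at force-field cost.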

Page 14: Computational Chemistry and the March to Petascale computing


Observations from the cheap seats…

• Scaling Ab-initio Quantum Chemistry is hard…

– Diagonalization

– Memory/Network Bandwidth

– Complicated, very complicated

• We used to be in front, now we are in the rear…

– Molecular Dynamics, Weather Modeling, QCD now dominate HPC.

– New methods and algorithms need to be developed if we are to take advantage of petascale computing:
  • Coarse-grain parallelism exploited.

• New numerical techniques.

• Accelerators are where the action is!

• N^7 is still N^7 in parallel.

• It is not how fast your code is… it is what science can be done with it.
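The N^7 point can be made concrete with a one-line cost model, assuming a method whose work grows as N^7 (e.g. the triples step of canonical CCSD(T)):

```python
# Back-of-the-envelope look at why O(N^7) scaling resists brute-force
# parallelism: doubling the system size multiplies the work by 2^7 = 128,
# so a 128x larger machine only buys a molecule twice the size.

def relative_cost(n_small, n_large, order=7):
    """Ratio of work between two system sizes for a method scaling as N**order."""
    return (n_large / n_small) ** order

print(relative_cost(100, 200))  # -> 128.0
```

This is why the slide argues for new methods and coarse-grain parallelism rather than simply larger machines: raw parallelism does not change the exponent.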