
Climate Change Modeling: Computational Opportunities and Challenges



Petascale Computing

Climate Impact Analysis

High-fidelity climate models are the workhorses of modern climate change science. In this article, the authors focus on several computational issues associated with climate change modeling, covering simulation methodologies, temporal and spatial modeling restrictions, the role of high-end computing, and the importance of data-driven regional climate impact modeling.

Dali Wang, Wilfred M. Post, and Bruce E. Wilson, Oak Ridge National Laboratory

Over the past several decades, researchers have made significant progress in developing high-fidelity climate models to advance our understanding of climate systems and improve our ability to project future climate scenarios.1 Models range from relatively simple radiant-heat transfer models to fully coupled general circulation models of the global climate, which discretize and solve the full 3D equations for mass and energy transfer and radiant exchange. Here, we focus on the spatially explicit climate models used to simulate the interactions of Earth's systems: atmosphere, oceans, land surface, and ice.

Researchers have used spatially explicit climate models for many purposes, from the study of climate system dynamics to the temperature change caused by emissions of greenhouse gases (such as carbon dioxide). The complex interactions among some or all major Earth system components result in a component-based model design, in which several different mathematical/numerical approaches are used to model each Earth system component.

High-performance computing has become essential to address the spatially explicit requirements of Earth system models.

In keeping with the general interests of CiSE readers, we focus here on several computing issues associated with climate change modeling. We first review two model systems to demonstrate key design components, and then we explain the underlying restrictions associated with temporal and spatial scales. After that, we discuss the role of high-end computing facilities in climate change sciences and the importance of fostering regional, integrated climate impact analysis.

Two Simulation Methodologies

Although some models are customized to study climate change in specific regions of interest, climate models are, in general, global models. The natural interactions among the various Earth climate system components result in tightly coupled modeling systems. Here, we briefly describe two modeling efforts: fully coupled global climate modeling and coupled, nested regional climate modeling.

Fully Coupled Global Modeling

An excellent example of fully coupled global modeling is the National Center for Atmospheric Research's Community Climate System Model (CCSM; www.ccsm.ucar.edu). CCSM is a 3D (plus time) computer-based general-circulation model that uses mathematical formulas to simulate the chemical and physical processes that drive Earth's climate. Extraordinarily sophisticated, CCSM can be used to study phenomena ranging from the effect of ocean surface temperature on tropical cyclone patterns to the impact of land use on carbon dioxide concentration in the atmosphere. Currently, CCSM contains four major community model components to simulate Earth systems: atmosphere, ocean, land, and sea ice. The data exchange (including mass and energy) between these system components is enabled by a coupler, a software component that simplifies and expedites system integration. Other CCSM software components include an application driver, a timing utility, and parallel I/O. CCSM also provides a sophisticated system script tool that lets science-oriented modelers automatically reconfigure, compile, build, and submit jobs; this is an extremely valuable feature for helping CCSM users conduct computational experiments on various high-end computers. Figure 1 shows CCSM's major components.

Figure 1. Major components of the Community Climate System Model (CCSM). The fully coupled modeling framework provides an integrated computational environment for climate change research. PIO stands for parallel input/output.

A fully coupled model system provides an integrated environment for computational experiments, where scientists from different fields (such as atmospheric sciences, ocean science, and terrestrial ecosystem sciences) can incorporate new knowledge from their own science domain and explore the potential impact of those discoveries on global climate change. Also, the component-based software design provides a flexible and straightforward approach to enable multidisciplinary development based on solid scientific fundamentals, which is key to sustaining the collaborations needed to better understand long-term global climate change (over time scales ranging from a decade to a century).

Nested Regional Climate Change Modeling

Researchers used to predict long-term climate independently of short-term weather conditions. Now they're developing models to investigate the profound impact of weather on climate and vice versa. An excellent example of such an effort is the Nested Regional Climate Model (NRCM; www.nrcm.ucar.edu), in which the Advanced Weather Research and Forecasting model (WRF; www.wrf-model.org) is nested within the CCSM.

WRF is an advanced numerical weather-prediction system designed to serve both operational forecasting and atmospheric research needs. WRF can run at fine spatial resolution (10-km grid cells) and provides more detailed information on extreme weather events such as hurricanes, heat waves, and thunderstorms. These events are of great importance to local policymakers, but they are generally poorly simulated by global climate models, such as version 3.5 of CCSM, because of their coarser resolutions (100-km grid cells). However, nested regional climate change models add a layer of complexity to fully coupled global models and must explicitly address multiscale upscaling and downscaling issues within each component of the Earth system.

A nested regional model’s objectives are twofold:

• to provide a natural link between long-term global climate simulations and regional observation and experiment data, which is an essential approach to improving the fidelity of climate change simulations; and

• to provide a natural tool to evaluate the consequences of climate change for regions of concern.

Figure 2 shows a nested regional climate change model's integration scheme.

Figure 2. An integration procedure between the Advanced Weather Research and Forecasting model (WRF) and the Community Climate System Model (CCSM). The nested modeling framework provides a straightforward approach to integrating global climate models with regional models in a common computing environment. PIO stands for parallel input/output.

Temporal and Spatial Scale Restrictions

One of climate change modeling's challenges is the need to link dynamic models that operate across different spatial scales and at different rates.


This need entails several computational implications, stemming from data sources, system design, model uncertainty, and computational capacity; we hope our discussion of these issues here can also benefit other modeling research involving multiscale system dynamics.

Data Availability and Locality

Among the limitations on spatial resolution are data availability and consistency. Although there are many data resources, including multiple sources of global satellite data, many challenges exist in collating data from multiple different sources, adapting the data to the desired temporal and spatial resolution, and ensuring consistency between measurements of ostensibly the same physical parameter taken with different instruments and methods. This is a particular challenge when assembling data from multiple sources and measurements to provide the temporal and spatial coverage needed for climate model validation and analysis projects, especially in terms of preventing the injection of reprojection and assimilation biases into the analysis results.

Generally, satellite observations provide data for a given location at a temporal frequency of a few times per day to a few times per month, depending on the satellite instrument's spatial resolution. A fine-spatial-resolution (30-m) instrument can observe a given location only once every two weeks, while a moderate-to-coarse-resolution (1-km) instrument might provide two or three observations per day for regions closer to the poles and once every day or two for regions near the equator.

By contrast, at a specific climate change observation or field experiment site, data can be collected at hundreds of data points per day, and with a spatial resolution that might be inconsistent with the regular grids of satellite measurements and climate models.
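One hedged illustration of this mismatch (our own sketch; the coordinates, grid, and values are hypothetical) is binning irregularly located site observations onto a model's regular grid by averaging all observations that fall within each grid cell:

```python
import numpy as np

def bin_to_grid(lats, lons, values, lat_edges, lon_edges):
    """Average irregularly located observations onto a regular lat/lon grid.
    Cells with no observations are left as NaN."""
    total = np.zeros((len(lat_edges) - 1, len(lon_edges) - 1))
    count = np.zeros_like(total)
    i = np.digitize(lats, lat_edges) - 1   # row index of each observation
    j = np.digitize(lons, lon_edges) - 1   # column index of each observation
    for k in range(len(values)):
        total[i[k], j[k]] += values[k]
        count[i[k], j[k]] += 1
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)

# Hypothetical site observations scattered inside a 1-degree grid.
lat_edges = np.arange(35.0, 38.0)        # 35-37 N, 1-degree cells
lon_edges = np.arange(-85.0, -82.0)      # 85-83 W, 1-degree cells
obs_lat = np.array([35.2, 35.7, 36.4])
obs_lon = np.array([-84.9, -84.1, -83.5])
obs_val = np.array([14.0, 15.0, 13.0])   # e.g., air temperature (C)
print(bin_to_grid(obs_lat, obs_lon, obs_val, lat_edges, lon_edges))
```

Even this toy exposes the core issue: most grid cells end up empty or represented by a handful of points, so site data alone cannot constrain a spatially explicit model without careful scaling assumptions.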

Further work is needed to develop robust methods and data products to bridge these data-comparison gaps and to advance our understanding of the Earth system in general and of climate change in particular. Beyond that, the implications of data locality should be emphasized through the parameterization process. For example, researchers have been developing process-based conceptual models to evaluate ecosystem responses to climate change (such as higher temperatures and higher CO2 concentrations) at specific sites for decades. Incorporating such site-specific observation and experimental data into a spatially explicit climate change model requires further scientific study.2

Spatial Heterogeneity and Model Uncertainty

The conservation of mass and energy is a fundamental condition in climate change modeling. A consequence of this conservation is that the coarser the spatial and temporal resolution, the broader the domain over which physical and chemical properties are averaged to meet this condition.

Climate is a chaotic system, with a substantial amount of fine-grained detail and localized variation, and climate simulations are sensitive to initial and boundary conditions. As a result, spatial resolution not only directly relates to the cost (time) of model calculations but also affects the degree of model uncertainty, especially for short-term (less than decadal) simulations. The reason is that short-term climate simulations depend more strongly on initial conditions than long-term climate simulations, which are generally more sensitive to boundary conditions. Given this, the choice of spatial resolution is a balance between model execution time, model predictive capability (including an understanding of regional and local phenomena, such as hurricanes and precipitation distribution), and the uncertainty inherent in the model results.
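To see why resolution dominates cost, consider a back-of-envelope scaling argument (our own sketch; the CFL-limited time step is the usual assumption, not a number from this article): refining the horizontal grid spacing by a factor r multiplies the number of columns by r squared, and a CFL-limited time step must also shrink by r, so total work grows roughly as r cubed.

```python
def relative_cost(refinement):
    """Rough cost multiplier for refining horizontal grid spacing by `refinement`:
    refinement**2 more grid columns times refinement more (CFL-limited) time steps."""
    return refinement ** 3

# Going from 100-km to 10-km grid cells is a 10x refinement:
print(relative_cost(10))  # ~1000x more work, before any communication overhead
```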

One current effort is testing the hypothesis that higher-resolution models are needed to accomplish the related scientific objectives of improving explicit simulation accuracy for local- to regional-scale phenomena (http://climatechangescience.ornl.gov/content/ultra-high-resolution-global-climate-simulation). Compared to conventional simulations at 1- or 2-degree grid resolution, this ultra-high-resolution global climate simulation system contains an atmosphere model at one-third-degree grid resolution (around 30 km) and an ocean model at one-tenth-degree grid resolution (around 10 km). These global simulations require realistic initial conditions and new representations of physical processes at the fine grid resolution. They also impose great software-engineering challenges related to parameterization and performance tuning, as well as parallel I/O (www.cesm.ucar.edu/events/ws.2010/Agendas/sewg10.pdf).

Numerical Characteristics of the Simulation System

The simulation system's numerical characteristics are also important. For a fixed-period simulation, a smaller time step requires more calculations and also induces more numerical noise in the system, because small errors propagate and accumulate over the larger number of steps.
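A toy experiment makes this visible (our own sketch; single precision and the step counts are assumptions chosen to expose the effect): integrating dy/dt = 1 over a fixed interval with more, smaller steps drifts further from the exact answer purely through round-off accumulation.

```python
import numpy as np

def integrate(n_steps, dtype=np.float32):
    """Integrate dy/dt = 1 over [0, 1] with forward Euler in the given precision.
    The exact answer is 1.0; any deviation here is pure round-off accumulation."""
    dt = dtype(1.0) / dtype(n_steps)
    y = dtype(0.0)
    for _ in range(n_steps):
        y += dt
    return float(y)

for n in (10, 10_000, 1_000_000):
    print(n, abs(integrate(n) - 1.0))  # error grows with the number of steps
```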

Beyond the inherent numerical artifacts of the schemes used within each Earth climate system component (atmosphere, ocean, land, and so on), the commonly used component-based coupling technology requires further examination. On one hand, model coupling reduces the degrees of freedom and provides a way to validate each Earth climate model component.3 On the other hand, it injects extra numerical errors into the simulation system, similar to how the operator-splitting technique does in a partial differential equation (PDE) solver.4 Also, at small temporal and spatial scales, we must investigate the possible numerical impacts induced by the order of data exchange (mass, momentum, chemical concentrations, and so on) between each of the coupled Earth climate systems.
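The operator-splitting analogy, and the sensitivity to exchange order, can be made concrete with a small linear system (our own sketch; the matrices are arbitrary stand-ins, and SciPy's expm is an assumed dependency): the exact solution of dy/dt = (A + B)y over one step generally differs from applying A's and B's solution operators in sequence, and reversing the order changes the answer whenever A and B do not commute.

```python
import numpy as np
from scipy.linalg import expm

# Two non-commuting operators standing in for, say, transport and chemistry.
A = np.array([[0.0, -1.0], [1.0, 0.0]])
B = np.array([[-0.5, 0.0], [0.0, -0.1]])
y0 = np.array([1.0, 0.0])
dt = 0.1

exact = expm((A + B) * dt) @ y0            # unsplit solution over one step
lie_ab = expm(B * dt) @ expm(A * dt) @ y0  # Lie splitting: A first, then B
lie_ba = expm(A * dt) @ expm(B * dt) @ y0  # reversed exchange order

print("splitting error (A then B):", np.linalg.norm(lie_ab - exact))  # O(dt^2) per step
print("order sensitivity:", np.linalg.norm(lie_ab - lie_ba))          # nonzero: AB != BA
```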

The climate change modeling community is a well-coordinated, science-driven community. Based on the component-based software architecture, Earth system model systems, such as CCSM and the new Community Earth System Model (CESM; www.cesm.ucar.edu), are designed to sustain innovation in existing components and to be flexible enough to incorporate new components (such as a glacier component in CESM). However, the component-based methodology introduces many software engineering challenges: constantly updated features within new and existing components might change the computing patterns dramatically and create new difficulties for system-wide performance tuning, such as dynamic load balancing related to computing intensity, I/O, and network traffic. One promising approach is to use performance-profiling tools, such as the Performance Application Programming Interface (PAPI; http://icl.cs.utk.edu/papi), the I/O Tuning and Analysis Tool (IOTA; www.nccs.gov/user-support/center-projects/iota), and Vampir (www.vampir.eu), to track the basic machine-level computing patterns of each component. A software emulator can then be developed for each component, and the emulators can be assembled for a system-wide performance study.

Computational Capacity

Another well-known restriction on the temporal and spatial scale of climate change modeling is computational capacity. Presently, even on a high-end computer, a typical fully coupled ultra-high-resolution CCSM global simulation takes two weeks for a 100-plus-year run (www.ncar.ucar.edu/2009/CISL/2sci/2.1.1.uhrccsm.php).

Besides the complexity of the CCSM itself, many other factors must be taken into account. For example, it's very challenging to achieve more than a fraction of a machine's theoretical peak capacity; the Linpack benchmark (www.top500.org/project/linpack) is one of the few exceptions. Most scientific applications can sustain only a fraction (less than 20 percent) of theoretical peak performance because of various system-related factors such as CPU architecture and compiler limitations.5 In addition to these system-related performance losses, further software-related performance losses are possible due to algorithm parallelization and inherent scalability limitations related to the climate system's complexity, at both the individual component level and the overall Earth system modeling level. Another limiting factor is the computational resource allocation policy, which is designed to support mission-critical projects or to enable high-end computing resource sharing among international science communities. A good example of such an allocation policy is available at http://science.energy.gov/~/media/ascr/pdf/incite/docs/Allocation_process.pdf.

The Role of High-End Computing Facilities

Advances in global climate change modeling are closely tied to high-end computing facilities (HECFs). Today, HECFs such as those at Oak Ridge National Laboratory (www.nccs.gov) and the University of Tennessee (www.nics.tennessee.edu) operate in the range of 10^15 floating-point operations per second (petaflops). Such facilities play a foundational role in enabling global climate change modeling at previously unachievable spatial and temporal resolutions.

For example, HECFs have enabled major model advancements derived from increasing CCSM models' spatial resolutions, so that we can more accurately simulate small-scale atmospheric and oceanic phenomena, such as tropical cyclones and mesoscale convective complexes. However, even more computationally powerful systems at the exaflops scale (10^18 flops) could bring new hope for better understanding the predictability limits of global climate models via advanced mathematical and statistical techniques. Still, climate change modeling has a long journey toward exascale computing (most likely based on many-core and acceleration technologies). For example, the spectral-element-based atmospheric dynamical core within the High Order Method Modeling Environment (HOMME) was rated as one of the most difficult applications to adapt to acceleration technology. At the algorithm level, climate models must

• be more self-contained, so that the compiler can identify and extract suitable kernels for acceleration; and

• achieve sufficient concurrency to amortize overhead (http://computing.ornl.gov/HMC/documents/HMC_ORNL_JT.pdf), as the sketch below illustrates.
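The following toy measurement illustrates the amortization point by analogy (our own sketch; NumPy is an assumed dependency and the array sizes are arbitrary). Many small operations pay a fixed per-call overhead repeatedly, while one large batched operation pays it once; the same logic governs kernel-launch overhead on accelerators.

```python
import time
import numpy as np

data = np.random.rand(1_000_000)

# Many small calls: per-call overhead dominates.
start = time.perf_counter()
chunks = [np.sum(data[i:i + 100]) for i in range(0, len(data), 100)]
small_total = sum(chunks)
t_small = time.perf_counter() - start

# One large batched call: overhead is amortized over all elements.
start = time.perf_counter()
big_total = np.sum(data)
t_big = time.perf_counter() - start

print(f"10,000 small calls: {t_small:.4f} s; 1 batched call: {t_big:.4f} s")
assert np.isclose(small_total, big_total)  # same answer, very different cost
```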

Nowadays, petaflops systems do open good opportunities for comprehensive uncertainty analysis of climate change at the regional level. It's a consensus within the climate change community that the uncertainty of regional climate change projections is higher than that of global projections. Relying on those petaflops computing facilities, the community is advancing research on model-prediction uncertainty, data assimilation, and model-sensitivity analysis at both the site and regional levels.
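A common building block of such uncertainty analysis is the perturbed-ensemble spread. Here is a minimal sketch (our own illustration with synthetic data; the ensemble size, grid, and anomaly values are hypothetical) that summarizes an ensemble of regional runs by its mean and standard deviation at each grid cell:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 20 regional runs with perturbed initial conditions:
# shape (n_members, ny, nx) of, say, decadal-mean temperature anomalies (C).
ensemble = 1.5 + 0.4 * rng.standard_normal((20, 4, 5))

mean = ensemble.mean(axis=0)           # best estimate at each grid cell
spread = ensemble.std(axis=0, ddof=1)  # ensemble spread: a simple uncertainty measure

print("domain-average anomaly:", float(mean.mean().round(2)), "C")
print("typical ensemble spread:", float(spread.mean().round(2)), "C")
```

Each ensemble member is an independent full simulation, which is precisely why this kind of analysis only became practical with petaflops-class machines.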

Over time, researchers have accumulated significant quantities of climate simulation results from high-end computing facilities and established distributed data repositories to support systematic model analysis and validation. Those efforts create new opportunities for ultra-scale climate data management, visualization, and infrastructure and software development; a good example is the Program for Climate Model Diagnosis and Intercomparison (www-pcmdi.llnl.gov). Figure 3 shows three important roles of high-end computing facilities.

Figure 3. The role of high-end computing in climate change modeling: high-resolution global simulations; regional models (ensemble runs, uncertainty analysis, and so on); and model diagnostics, comparison, validation, and impact analysis.

Conversely, climate change science has been, and still is, one of the sciences driving high-end computing advancements in hardware architecture, system software, networking, visualization, data storage, and parallel algorithm development. It's also notable that climate change scientists and high-performance computing (HPC) software engineers have been working closely for decades, but the collaborations and interactions have been carried out in a sequential mode. Recently, the dramatic changes in system software and hardware within high-end computing facilities (that is, many-core architectures and acceleration technologies) have produced a rapid consensus in the two communities: state-of-the-art climate system models must be developed to simultaneously expedite both scientific discoveries and implementations in high-end computing facilities.

Modeling Regional Impacts of Climate Change

Climate change, while global in nature, will have significant implications at the local level.


Modeling efforts in this category are helping us understand the possible effects of climate change in high-interest locations, such as coastal communities or important hydrological regions. In contrast to tightly coupled climate change models, many regional models receive a one-way data flow from global climate models or observation data and focus on climate impact assessment. This is an essential computing model for leveraging broader modeling communities, such as hydrology and coastal research.

One of the advantages of data-driven modeling is its interactivity and its flexibility to be customized for specific purposes in high-interest regions. Many regional models are customized for specific management purposes and have a direct impact on economic and social activities. Those potential economic, political, and even legal implications frequently require regional models to use a heterogeneous, closed computing environment. Data-driven models are thus the method of choice to provide a clear modeling boundary between the regional models' internal activities and external forcings. Furthermore, data-driven methods let researchers use a wide variety of commercial software (such as Matlab and ArcGIS) and open-source packages (such as the R project for statistical computing and the Geographic Resources Analysis Support System, GRASS)6 to present customized information interactively in near real time, a key feature of any practical decision-support system.

To foster those data-driven climate impact analyses, we might need to redesign climate simulation scenarios to target practical and management requirements. For example, to provide valuable information for decision making on national security planning, climate models must provide statistically realistic output (such as temperature and precipitation) at a fine spatial resolution (roughly 10 km) across a decade of time (see esipfed.org/sites/default/files/IC_challenges_Jan2010.ppt).

Because of the relatively automated nature of regional modeling, we see a need for more flexibility in the current data infrastructure and user-oriented informatics services; regional modelers especially must be able to make their own data-usage decisions based on the temporal and spatial scales needed for specific regional decision-support requirements. Technically, those efforts might involve online climate data federation and distribution via content management and ontology.7 From a data center perspective, we look forward to Web-based functionality (including interactive data subsetting over specific temporal and spatial ranges, statistical analysis, and high-fidelity data upscaling and downscaling) enabled by state-of-the-art data center technologies, such as object-based file systems, parallel database systems, and hardware and software optimizations. A good example here is the evolving Earth System Grid (www.earthsystemgrid.org), which is designed to federate tremendous climate simulation results, large-scale data and analysis servers, and observation data centers such as the US Department of Energy's Atmospheric Radiation Measurement Program (www.arm.gov), to create a powerful environment for next-generation climate change research.

As our understanding of climate change processes increases, we'll need more complex modeling efforts and richer data streams, which will continue to challenge our current understanding and development. From our experience, the hardware development of high-end computing facilities continuously outpaces software and application development; efficiently using those computing resources remains a key challenge for the climate change modeling community. Compared to computing efficiency and capacity, climate data problems are far more fragmented and less coordinated, and they have become a real bottleneck to fostering an engaged, productive climate change community across a variety of disciplinary sciences.

Collaborations between disciplinary scientists and computational scientists are critical to enabling advances in climate change modeling. From the computational sciences' perspective, there are several transformations involved in advancing climate change modeling. First, we must not only provide technical support services to sustain a sufficient computational environment, but also develop methodologies to enable new kinds of science-driven research beyond current scientific computing paradigms. Second, we must embrace a more open attitude to foster a broader range of new research opportunities under the umbrella of climate change sciences. Finally, we must ensure that the climate change simulations performed on high-end computing facilities constitute the next generation of advances for practical management and decision making, not just proof-of-concept computations.

Acknowledgments

Portions of this work were funded by the US Department of Energy's Office of Science and NASA's Science Mission Directorate.

References

1. W.M. Washington and C.L. Parkinson, Introduction to Three-Dimensional Climate Modeling, Univ. Science Books, 2005.
2. S.D. Wullschleger and M. Strahl, "Climate Change: A Controlled Experiment," Scientific Am., Mar. 2010; www.scientificamerican.com/article.cfm?id=climate-change-a-controlled-experiment.
3. K. Li et al., "Development of a Framework for Quantifying the Environmental Impacts of Urban Development and Construction Practices," Environmental Science & Technology, vol. 41, no. 14, 2007, pp. 5130–5136.
4. G.I. Marchuk, "Some Applications of Splitting-Up Methods to the Solution of Problems in Mathematical Physics," Aplikace Matematiky, vol. 1, 1968, pp. 103–132.
5. D. Wang et al., "On Parallelization of a Spatially-Explicit Structured Ecological Model," Int'l J. High Performance Computing Applications, vol. 20, no. 4, 2006, pp. 571–581.
6. A.R. Ganguly et al., "Higher Trends but Larger Uncertainty and Geographic Variability in 21st Century Temperature and Heat Waves," Proc. US Nat'l Academy of Sciences, vol. 106, no. 37, 2009, pp. 15555–15559.
7. P. Fox et al., "Ontology-Supported Scientific Data Frameworks: The Virtual Solar-Terrestrial Observatory Experience," Computers & Geosciences, vol. 35, no. 4, 2009, pp. 724–738.

Dali Wang is a research scientist in the Environmental Sciences Division of Oak Ridge National Laboratory, a staff member of ORNL's Climate Change Science Institute, and an adjunct professor of geography at the University of Tennessee, Knoxville. His research interests include environmental and climate modeling, environmental data sciences and systems, high-performance scientific computation and optimization, geographic information systems, and large-scale system integration and simulation. Wang has a PhD in environmental engineering (with a scientific computation focus) from Rensselaer Polytechnic Institute. Contact him at [email protected].

Wilfred M. Post is a senior scientist in Oak Ridge National Laboratory's Environmental Sciences Division, a staff member of ORNL's Climate Change Science Institute, and an adjunct professor of ecology and evolutionary biology at the University of Tennessee, Knoxville. His research interests are in soil carbon dynamics, nutrient relationships between soil and vegetation, and the impact of species composition on ecosystem processes. He has developed approaches to represent the impact of land-use change and climate change in terrestrial biogeochemistry models, and he has created global data sets for the evaluation of global terrestrial biogeochemistry models. Post has a PhD in ecology from the University of Tennessee, Knoxville. Contact him at [email protected].

Bruce E. Wilson is the group leader for Oak Ridge National Laboratory's Client and Collaboration Technologies Group, a staff member of ORNL's Climate Change Science Institute, and an adjunct professor of information sciences at the University of Tennessee, Knoxville. His research interests are in scientific informatics, particularly enabling data-intensive science in ecology and climate science through advancing data storage, curation, distribution, analysis, and visualization technologies and practices. Wilson has a PhD in analytical chemistry from the University of Washington. Contact him at [email protected].


