PolarGrid
CReSIS, Lawrence, Kansas, February 12, 2009

Geoffrey Fox (PI)
Computer Science, Informatics, Physics
Chair, Informatics Department
Director, Digital Science Center and Community Grids Laboratory
Indiana University, Bloomington, IN 47404
[email protected]

Linda Hayden (co-PI), ECSU
Support CReSIS with Cyberinfrastructure
- Base and field camps for Arctic and Antarctic expeditions
- Training and education resources at ECSU
- Collaboration technology at ECSU
- Lower-48 system at Indiana University and ECSU to support offline data analysis and large-scale simulations (next stage); initially a modest system at IU/ECSU for data analysis
CYBERINFRASTRUCTURE CENTER FOR POLAR SCIENCE (CICPS)
PolarGrid Greenland 2008
Base System (Ilulissat, Airborne Radar)
- 8U, 64-core cluster with 48TB external fibre-channel array
- Laptops (one-off processing and image manipulation)
- 2TB MyBook tertiary storage
- Total data acquisition 12TB (plus 2 backup copies)
- Satellite transceiver available if needed, but the wired network at the airport was used for sending data back to IU
Base System (NEEM, Surface Radar, Remote Deployment)
- 2U, 8-core system using internal hot-swap hard drives for data backup
- 4.5TB total data acquisition (plus 2 backup copies)
- Satellite transceiver used for sending data back to IU
- Laptops (one-off processing and image manipulation)
PolarGrid goes to Greenland
NEEM 2008 Base Station
PolarGrid Antarctic 2008/2009
Base System (Thwaites Glacier, Surface Radar)
- 2U, 8-core system using internal hot-swap hard drives for data backup
- 11TB total data acquisition (plus 2 backup copies)
- Satellite transceiver used for sending data back to IU
- Laptops (one-off processing and image manipulation)
IU-Funded Sys-Admins
- 1 admin, Greenland NEEM 2008
- 1 admin, Greenland 2009 (March 2009)
- 1 admin, Antarctica 2009/2010 (Nov 2009 - Feb 2010)
Note that the IU effort is a collaboration between the research group and the University Information Technology support groups.
ECSU and PolarGrid
Initially:
- A base camp 64-core cluster, allowing near real-time analysis of radar data by the polar field teams
- An educational videoconferencing Grid to support educational activities
- A PolarGrid laboratory for students
ECSU supports PolarGrid cyberinfrastructure in the field.
Assistant Professor Eric Akers and graduate student Je'aime Powell from ECSU travel to Greenland.
PolarGrid Lab
- Mac OS X, Ubuntu Linux, Windows XP
- Public IP, accessible through the ECSU firewall
Additional Software
- Desktop publishing
- Word processing
- Web design
- Programming
- Mathematical applications
- Geographic Information Systems (GIS)
Experience from Supporting Expeditions I
- Base processing (NEEM 2008): 600GB - 1TB on 8 cores takes roughly 8-12 hours
- Expeditions are data-collection intensive, with a goal of pre-processing and validating each day's data within 24 hours
- Laptops are used for one-off pre-processing and image manipulation/visualization
- Heavy use of MATLAB for all processing (both pre-processing and full processing)
- CReSIS is using the PolarGrid base cluster for full processing of all data collected so far
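The throughput figures above imply a simple feasibility check for the 24-hour validation goal. The sketch below uses the NEEM 2008 numbers from this slide (600GB - 1TB in 8-12 hours on 8 cores); the daily acquisition volume is a hypothetical planning input, not a figure from the presentation.

```python
# Can a day's radar acquisition be pre-processed within the 24-hour goal,
# given the throughput observed at NEEM 2008 (600 GB - 1 TB in 8-12 h on 8 cores)?

def hours_to_process(daily_gb: float, gb_per_hour: float) -> float:
    """Estimated wall-clock hours to pre-process one day's data."""
    return daily_gb / gb_per_hour

# Worst case from the slide: 600 GB in 12 hours -> 50 GB/hour.
worst_rate = 600 / 12
# Best case from the slide: 1000 GB in 8 hours -> 125 GB/hour.
best_rate = 1000 / 8

daily_gb = 800  # hypothetical day's acquisition
print(hours_to_process(daily_gb, worst_rate))  # 16.0 hours: within the 24 h goal
print(hours_to_process(daily_gb, best_rate))   # 6.4 hours
```

Even at the worst observed rate, a sizeable daily acquisition fits comfortably inside the 24-hour window, which is consistent with the slide's stated goal.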
- Lessons from field use in expeditions include the necessity of smaller computing engines, due to size, weight, and power limitations
- Greenland 2008 successes demonstrated the importance of PolarGrid equipment; CReSIS is now using PolarGrid gear to store and process data from 2 additional radar systems
- A smaller system footprint and simpler data management have driven the cost per system down
- Complex storage environments are not practical in a mobile data-processing environment
- Pre-processing data in the field has allowed validation of data acquisition during collection phases
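The slides repeatedly mention keeping two backup copies of each acquisition and validating data during collection. A minimal sketch of that kind of field validation, checking that a backup copy matches the primary by SHA-256 digest, might look as follows; the directory layout and function names are assumptions, since the presentation does not describe the actual procedure.

```python
# Sketch of field-style data validation: confirm that a backup copy of a day's
# acquisition matches the primary copy file-by-file via SHA-256 digests.
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk: int = 1 << 20) -> str:
    """Stream a file through SHA-256 so large radar files never sit in RAM."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def verify_backup(primary: Path, backup: Path) -> list:
    """Return relative paths of files that are missing or differ in the backup."""
    bad = []
    for src in primary.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(primary)
        dst = backup / rel
        if not dst.is_file() or sha256_of(src) != sha256_of(dst):
            bad.append(str(rel))
    return bad
```

Streaming the hash in 1 MB chunks matters in this setting: a day's acquisition can run to hundreds of gigabytes, far more than a field laptop's memory.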
Experience from Supporting Expeditions II
Field Results, 2008/09:
"Without on-site processing enabled by POLARGRID, we would not have identified aircraft inverter-generated RFI. This capability allowed us to replace these 'noisy' components with better quality inverters, incorporating CReSIS-developed shielding, to solve the problem mid-way through the field experiment."
Campaigns: Jakobshavn 2008, NEEM 2008, GAMBIT 2008/09
TeraGrid High Performance Computing Systems 2007-2008
Computational Resources (size approximate - not to scale)
Slide Courtesy Tommy Minyard, TACC
[Map of TeraGrid resource-provider sites: SDSC, TACC, NCSA, ORNL, PU, IU, PSC, NCAR, Tennessee, LONI/LSU, UC/ANL; aggregate capacity 504TF, growing to ~1PF in 2008]
Future Features of PolarGrid
- PolarGrid will give all of CReSIS access to TeraGrid to support large-scale computing
- PolarGrid will support the Cyberinfrastructure Center for Polar Science (CICPS) concept, i.e. the national distributed collaboration to understand ice sheet science
- Cyberinfrastructure levels the playing field in research and learning: students and faculty can contribute based on interest and ability, not on affiliation
- PolarGrid will be configured as a cloud for ease of use; virtual machine technology is especially helpful for education
- The PolarGrid portal will use Web 2.0-style tools to support collaboration
- ECSU can use PolarGrid to enhance local facilities and Internet2 connectivity