
Grid activities in the Czech Republic

Jiří Kosina, Miloš Lokajíček, Jan Švec

Institute of Physics of the Academy of Sciences of the Czech Republic

http://www.fzu.cz/

HEP Experiments in the Czech Republic (1)

● D0, STAR, ATLAS and ALICE (main experiments only) computing activities:
– detector simulations, reconstruction
– data analysis in running projects

● Institutions (main participants)
– Institute of Physics AS CR
– Institute of Nuclear Physics AS CR
– Charles University in Prague
– Czech Technical University in Prague

● Facilities
– EDG test-bed, now being converted to LCG

Institutions involved in the Grid activities

● CESNET (http://www.cesnet.cz/)
● Charles University in Prague (http://www.cuni.cz/)

● Czech Technical University in Prague (http://www.cvut.cz/)

● Institute of Physics AS CR (http://www.fzu.cz/)

CESNET (1)

CESNET z.s.p.o.

● an association of legal entities founded by Czech universities and the Academy of Sciences of the CR

● Czech scientific network provider

– shareholder of DANTE

– member of TERENA

– Internet2 international partner

● Computing and networking strategic research projects

– Optical networks and their development

– IPv6 implementation in the CESNET2 network

– Multimedia transfers

– MetaCenter

– Voice services in the CESNET2 network

– QoS in high-speed networks

CESNET (2)

● International projects
– GEANT
– DataGrid -> EGEE
– SCAMPI
– 6NET

● Other projects
– Infrastructure and technology for on-line education, Distributed contact center, Smart NetFlow analyser, Storage over IP, Presentation, RT system in CESNET, Securing CESNET2 local networks, Time synchronization and NTP servers, Platforms for video transmission and production, MEDIMED

CESNET (3)

● 1.2 Gbps connection to GEANT (over 10 Gbps hardware)
● 1+1 Gbps connection to NIX (peering with Czech ISPs)
● 800 Mbps connection to the USA (Telia)
● 2.5 Gbps CzechLight connection (now used for other tests)
– Reserved 1 Gbps optical connection, Institute of Physics <-> CERN
– Connected to our Linux router; switching to the "backup" line through the Internet is managed by the BGP protocol (see the sketch below)
– Possibility to connect to StarLight in the future
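For illustration, BGP fail-over of this kind on a Linux router could be set up with Zebra/Quagga bgpd roughly as follows; the AS numbers, addresses and prefix are hypothetical, not the actual FZU configuration:

! bgpd.conf – prefer the dedicated optical line, fall back to the Internet path
router bgp 65001
 network 198.51.100.0/24
 ! primary session over the dedicated line
 neighbor 192.0.2.1 remote-as 65100
 ! backup session over the ordinary Internet path,
 ! made less attractive by a lower local-preference
 neighbor 203.0.113.1 remote-as 65200
 neighbor 203.0.113.1 route-map BACKUP in
!
route-map BACKUP permit 10
 set local-preference 50

If the session on the dedicated line drops, the routes learned over the backup session take over automatically.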

CESNET (4)

● European DataGrid project
– WP1 workload management
– WP7 network monitoring
– Certification authority established for EDG

● issuing certificates to Czech academic entities
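For illustration, with the Globus tools used on the EDG testbed a user generates a key pair plus a certificate request and sends the request to the CA for signing; the exact procedure depends on the CA's policy:

golias:~$ grid-cert-request
# writes ~/.globus/userkey.pem and ~/.globus/usercert_request.pem;
# the certificate signed by the CA is then saved as ~/.globus/usercert.pem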

Institute of Physics AS CR - FZU

● D0 experiment participation
● ATLAS – main contribution to the hadronic calorimeter TILECAL and the Inner Detector (pixel sensor production, testing of strip and pixel detectors), power supply design and production
● Computing
– Mass simulation for D0, reconstruction, data analysis
– ATLAS – participation in Data Challenges, ATLAS-LCG

Institute of Physics AS CR – FZU (2)

● Computing projects
– EDG from 1 Jan 2001
● WP6 – testbed
● CESNET network support important
● Plan to continue in EGEE
– LCG – GDB (LHC Computing Grid – Grid Deployment Board)
● LCG deployed in October 2003 – CERN press release

Institute of Physics AS CR – FZU (3)

● Dedicated server farm for HEP and Grid computing
– 34x dual 1.13 GHz PIII
– 1 TB disk array; currently experimenting with a new 10 TB disk array

Institute of Physics AS CR – FZU (4)

● Current status
– Construction of a new computing room in the institute
● Designed for 150 kW of electric power
● UPS, cooling, engine-generator
● Construction finished by the end of 2003 with 50% capacity
● Full capacity next year

● Application for a triple-capacity upgrade of the current farm (30 dual-Xeon units, 20 TB of disk space) for 2004; we hope for a positive result

Job Management

● PBS Pro 5.2
● LCG1 -> OpenPBS (will soon merge with PBS Pro)
● queues

– shortq
– normalq
– longq

– d0

– atlas

– alice

– hightest

golias:~$ qsub -q atlas run.sh
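A minimal run.sh for the qsub command above might look like this (the resource limit and the payload binary are illustrative):

#!/bin/bash
#PBS -N atlas-sim
#PBS -l cput=02:00:00
# PBS starts the job in $HOME; change to the directory qsub was run from
cd $PBS_O_WORKDIR
# run the actual payload (hypothetical binary)
./atlas-simulation

The target queue is taken from the -q option on the command line above.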

D0 MC simulations

● SAM station (Sequential Access to data via Metadata)
● UPS, UPD – management of software products on local systems (UPS), downloading of products from product distribution servers (UPD); see the example below

● D0-RunII software rel. p14.02.00, p14.05.01
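Typical UPS/UPD usage looks roughly as follows; the product name and version are illustrative:

golias:~$ ups list -aK+ sam        # list locally installed versions of a product
golias:~$ upd install sam v7_1_0   # download a product from a distribution server
golias:~$ setup sam                # make the product available in the current shell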

EDG, LCG

● We have installed the LCG1 software and are currently running version LCG1-1_1_1
● LCG – middleware providing an environment for distributed computing for the LHC
● Member of the WP6 work package – the farm runs EDG 1.4 software for testbed purposes

LCG (1)

● LCG – LHC (Large Hadron Collider) Computing Grid

● The accelerator will start operation in 2007
● 12-14 petabytes of data will be generated every year (about 20 million CDs at ~700 MB each). Analyzing this is expected to require approximately 70,000 CPUs

LCG (2)

● Based on the EDG software – an EU-funded project finishing by the end of this year
● The Grid infrastructure allows every member to submit a job (along with a JDL file) to the Grid and, once the computation finishes (no matter where), retrieve the output
● In JDL you can specify task-specific requirements (SW versions, # CPUs, etc.) – see the sketch after this list

● Distributed storage of data.
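A small JDL sketch of the kind just described; the file names and the requirement value are illustrative, the attribute names follow EDG/LCG JDL conventions:

Executable    = "run.sh";
StdOutput     = "std.out";
StdError      = "std.err";
InputSandbox  = {"run.sh"};
OutputSandbox = {"std.out", "std.err"};
Requirements  = other.GlueCEPolicyMaxCPUTime > 720;

The job is submitted with edg-job-submit job.jdl; once it has finished, edg-job-get-output retrieves the sandbox with the results.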

LCG (3)

● Installation of the LCG software is done through the LCFGng toolkit – useful for managing a wide variety of configurations across different hosts
● Joining sites are provided with a pregenerated skeleton of configuration files to simplify the installation procedure (the files still need to be adapted locally; see the sketch below)

● Installation is done simply by enabling PXE boot and rebooting the nodes
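Conceptually, the per-node profiles built from that skeleton are C-preprocessor style source files; a worker-node profile might look roughly like this (the header names are illustrative, the real ones ship with the LCG skeleton):

/* wn01 – profile for one LCG worker node */
#include "site-cfg.h"        /* site-wide settings: network, NTP, accounts */
#include "WorkerNode-cfg.h"  /* resources specific to the worker-node role */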

10 TB disk array and Linux (1)

● We have a 10 TB disk array. Problem with Linux 2.4 – block devices are limited to 2 TB (32-bit sector numbers × 512-byte sectors = 2 TiB)
● An LBD (Large Block Device) patch exists for the 2.4 kernel, but it collides with another patch needed for LVM.

10 TB disk array and Linux (2)

● Kernel 2.6 (not yet stable) supports both LBD and the device mapper (needed for LVM). After some hacking on the NFS code (our patches are incorporated in the 2.6.0-test9 kernel) we successfully created a 6 TB partition (XFS, ext2)
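On such a kernel the array can be glued into one big volume with the usual LVM steps, roughly as follows; the device names, mount point and size are illustrative:

golias:~# pvcreate /dev/sdb1 /dev/sdc1          # register the devices with LVM
golias:~# vgcreate vg_data /dev/sdb1 /dev/sdc1  # group them into a volume group
golias:~# lvcreate -L 6T -n lv_data vg_data     # carve out one 6 TB logical volume
golias:~# mkfs.xfs /dev/vg_data/lv_data         # create the XFS file system
golias:~# mount /dev/vg_data/lv_data /mnt/data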

Institutions involved – contacts

● Jan Svec <svecj@fzu.cz> - network and system management, D0 experiment (grid)

● Jiri Kosina <kosina@fzu.cz> - installation of grid software at FZU, ALICE experiment, network and system management

● Jiri Chudoba <Jiri.Chudoba@cern.ch> - ATLAS experiment

● <pcclust@cesnet.cz> - CESNET's PC clusters

Summary – Grid infrastructure

● Basic Grid infrastructure has been established during the last year, and further expansion driven by research or commercial needs should not be problematic

● CESNET provides good international network connectivity; tests of the CzechLight optical network are planned

● Preparing for Data Challenges of LHC experiments in 2004