ATLAS Progress Report: Collaboration and management
CERN-RRB-2006-109, 23rd October 2006



Outline

- Collaboration and management
- Construction status of the detector systems (Common Projects and installation: see Marzio Nessi's presentation)
- Milestones and schedule
- Brief account of other activities
- Computing and physics preparation
- Status of Completion Planning
- Conclusions

Collaboration composition

Since the last RRB in April 2006, seven Expressions of Interest to join the ATLAS Collaboration have been concluded, with unanimous admission votes at the Collaboration Boards of 14th July and 6th October. In each case the discussions and negotiations on their contributions have been constructive and mutually beneficial; in particular, necessary technical service tasks and contributions have been identified, besides involvement in physics. A number of other groups have been encouraged to join forces at this stage with existing ATLAS Institutions (in addition, some other contacts have not been pursued). There are no pending Expressions of Interest on the time scale of the April 2007 RRB.

The Collaboration also took note of the withdrawal of Naruto University of Education, Tokushima, Japan, which has completed its initially expected contribution to ATLAS (GEANT4 development work).

New Institutions unanimously admitted by the ATLAS Collaboration

- Fachhochschule Wiener Neustadt (FHWN), Wiener Neustadt, Austria (technical expertise in system integration, Grid computing)
- University of Regina, Physics Department, Regina, Canada (software tools, LAr calibrations and commissioning)
- DESY (Hamburg and Zeuthen), Germany (HLT, Grid computing, shower simulations)
- Humboldt University Berlin, Institute of Physics, Berlin, Germany (HLT, commissioning, computing; working very closely with DESY)
- Nagoya University, Department of Physics, Nagoya, Japan (TGC trigger and DAQ)
- New York University, Department of Physics, New York, U.S.A. (HLT algorithms for level-2 and EF, commissioning, power systems for upgrades)
- SLAC, Stanford, U.S.A. (Pixels hardware and software, HLT, simulations, Grid computing)

The RRB is kindly requested to endorse the admission of these seven new Institutions into the ATLAS Collaboration.

ATLAS Collaboration (as of October 2006)

35 countries, 164 institutions, 1800 scientific authors in total (1470 with a PhD, for the M&O share):

Albany, Alberta, NIKHEF Amsterdam, Ankara, LAPP Annecy, Argonne NL, Arizona, UT Arlington, Athens, NTU Athens, Baku, IFAE Barcelona, Belgrade, Bergen, Berkeley LBL and UC, HU Berlin, Bern, Birmingham, Bologna, Bonn, Boston, Brandeis, Bratislava/SAS Kosice, Brookhaven NL, Buenos Aires, Bucharest, Cambridge, Carleton, Casablanca/Rabat, CERN, Chinese Cluster, Chicago, Clermont-Ferrand, Columbia, NBI Copenhagen, Cosenza, AGH UST Cracow, IFJ PAN Cracow, DESY, Dortmund, TU Dresden, JINR Dubna, Duke, Frascati, Freiburg, Geneva, Genoa, Giessen, Glasgow, LPSC Grenoble, Technion Haifa, Hampton, Harvard, Heidelberg, Hiroshima, Hiroshima IT, Indiana, Innsbruck, Iowa SU, Irvine UC, Istanbul Bogazici, KEK, Kobe, Kyoto, Kyoto UE, Lancaster, UN La Plata, Lecce, Lisbon LIP, Liverpool, Ljubljana, QMW London, RHBNC London, UC London, Lund, UA Madrid, Mainz, Manchester, Mannheim, CPPM Marseille, Massachusetts, MIT, Melbourne, Michigan, Michigan SU, Milano, Minsk NAS, Minsk NCPHEP, Montreal, McGill Montreal, FIAN Moscow, ITEP Moscow, MEPhI Moscow, MSU Moscow, Munich LMU, MPI Munich, Nagasaki IAS, Nagoya, Naples, New Mexico, New York, Nijmegen, BINP Novosibirsk, Ohio SU, Okayama, Oklahoma, Oklahoma SU, Oregon, LAL Orsay, Osaka, Oslo, Oxford, Paris VI and VII, Pavia, Pennsylvania, Pisa, Pittsburgh, CAS Prague, CU Prague, TU Prague, IHEP Protvino, Regina, Ritsumeikan, UFRJ Rio de Janeiro, Rochester, Rome I, Rome II, Rome III, Rutherford Appleton Laboratory, DAPNIA Saclay, Santa Cruz UC, Sheffield, Shinshu, Siegen, Simon Fraser Burnaby, SLAC, Southern Methodist Dallas, NPI Petersburg, Stockholm, KTH Stockholm, Stony Brook, Sydney, AS Taipei, Tbilisi, Tel Aviv, Thessaloniki, Tokyo ICEPP, Tokyo MU, Toronto, TRIUMF, Tsukuba, Tufts, Udine, Uppsala, Urbana UI, Valencia, UBC Vancouver, Victoria, Washington, Weizmann Rehovot, FH Wiener Neustadt, Wisconsin, Wuppertal, Yale, Yerevan

Management and Collaboration Board

Following the standard procedures and schedule, the Collaboration Board has elected a new Deputy Collaboration Board Chairperson, who will subsequently become CB Chair: Kerstin Jon-And (Stockholm University), Deputy CB Chair in 2007 (and 2010), CB Chair 2008-2009. She will replace Siegfried Bethke (MPI Munich), whose term of office finishes at the end of this year.

The Collaboration Board has also endorsed re-appointments for the term of office March 2007 to February 2009: Marzio Nessi, Technical Coordinator, and Markus Nordberg, Resources Coordinator. The CERN Management has formally approved these appointments. Further appointments in managerial positions are included in the following organization chart.

ATLAS Organization (October 2006)

- ATLAS Plenary Meeting
- Collaboration Board (Chair: C. Oram; Deputy: S. Bethke)
- Resources Review Board
- Spokesperson (P. Jenni; Deputies: F. Gianotti and S. Stapnes)
- Technical Coordinator (M. Nessi); Resources Coordinator (M. Nordberg)
- Executive Board; CB Chair Advisory Group
- Inner Detector (L. Rossi, K. Einsweiler, P. Wells, F. Dittus)
- Tile Calorimeter (B. Stanek)
- Magnet System (H. ten Kate)
- Computing Coordination (D. Barberis, D. Quarrie)
- Data Preparation Coordination (C. Guyot)
- LAr Calorimeter (H. Oberlack, D. Fournier, J. Parsons)
- Muon Instrumentation (G. Mikenberg, F. Taylor, S. Palestini)
- Trigger/DAQ (N. Ellis, L. Mapelli)
- Electronics Coordination (P. Farthouat)
- Physics Coordination (I. Hinchliffe)
- Additional Members (H. Gordon, A. Zaitsev)

Construction progress of the detector systems
(The Common Projects and installation will be covered by M. Nessi)

Overall dimensions: diameter 25 m, barrel toroid length 26 m, end-cap end-wall chamber span 46 m, overall weight 7000 tons. (Figure: ATLAS superimposed on the 5 floors of building 40.)

The underground cavern at Pit-1 for the ATLAS detector: length 55 m, width 32 m, height 35 m (sides A and C).

Inner Detector (ID)

The Inner Detector is organized into four sub-systems: the Pixels, the Silicon Tracker (SCT), the Transition Radiation Tracker (TRT), and common ID items.

Inner Detector progress summary

- Pixels: Barrel layer-2 has been integrated. The low-mass Al cables (from the modules to the first patch panel) had a low yield (broken insulator); this was solved with a new production. The integration schedule is tight, but the integration speed is now higher than planned.
- Barrel: The SCT and TRT barrels have been integrated in SR1, tested with cosmics (no cross-talk observed), and installed in the pit. Weighing demonstrates a good understanding of the material.
- End-caps: SCT EC-C has very recently been integrated with TRT EC-C, after all tests were done on the sub-assemblies. SCT EC-A is being dressed with its thermal enclosures and will be ready for integration with the TRT by mid November. The schedule is driven by SCT EC-A.

The schedule for the Inner Detector remains very tight, without any float left (critical path: installation and sign-off in the pit). (Figure: the completed TRT + SCT barrel in SR1.)

ID TRT + SCT barrel tested in SR1

One eighth of the TRT and one quarter of the SCT were equipped with complete readout chains. Dead channels: 0.2% (SCT) and 1.5% (TRT). Noise levels are as measured for the individual parts and below specifications (e.g. for the SCT random-noise probability). No cross-talk was measured (many trials done), and cosmics triggers were taken. (Figures: TRT noise occupancy before and after insertion; side view of a cosmic track through the TRT and SCT, with small noise.)

The ID barrel travelled to the pit on 24th August 2006: through the parking area, a tight fit between the barrel toroid and the end-cap calorimeter, and from the trolley onto the support rails inside the cryostat.

ID End-Caps

The TRT + SCT integration of EC-C was done at the end of September; the A side will follow in November. (Figures: SCT EC-C in front of its outer thermal enclosure; the EC-C TRT + SCT integration.)

Pixels

All modules have been delivered with good yield. Both end-caps have been integrated, delivered to CERN and acceptance-tested; one end-cap will now go through cosmics tests. Barrel stave production finished in mid September (including corrosion leak repairs). Layer-2 has been fully integrated, the two Layer-1 half-shells are finished, and about 1/3 of the B-layer bi-staves are assembled. The best staves (fewest dead channels, best thermal performance) are reserved for the B-layer. A new potential issue under investigation is failing opto-boards (integrated in the service panels). (Figures: Pixel EC-C at CERN, with 3 disks visible; a Pixel Layer-2 half-shell; Pixel Layer-2, once clamped, outside and inside.)

The "ready for installation" date is 1st April 2007.

LAr and Tile Calorimeters

(Components: Tile barrel, Tile extended barrel, LAr forward calorimeter (FCAL), LAr hadronic end-cap (HEC), LAr EM end-cap (EMEC), LAr EM barrel.)

Barrel LAr and Tile Calorimeters

The barrel calorimeters have been in their final position at the centre of the detector since their move on 4th November 2005. The final cool-down of the LAr cryostat took place over April and May 2006.

LAr barrel history over the past months

- June: Barrel filled with LAr. Burning off a few shorts in some modules of the barrel calorimeter was tried; the results were positive on the Presampler, with essentially no difference on the Calorimeter.
- July: Decided to empty and refill by condensation; the refilling operation took 20 days.
- HV status (at an early stage of commissioning), with 1600 V on the Calorimeter and 2000 V on the Presampler, leaving out known problematic channels: on the Calorimeter, 2 sectors with HV shorts and 10 sectors working with half of the signal, out of 448 independent sectors; on the Presampler, 7 problems, which will be addressed by burning. Problematic channels will be dealt with separately.
- Plans: Leave the calorimeter off when not needed; put 1600 V on the 6-8 modules needed for cosmics running.
- Stable pressure in the expansion vessel; impurity level (O2 contamination) measured with four purity cells at the ppm level. Temperature stability: Tmin = 88.2 K, Tmax = 88.6 K; the detector is sub-cooled by between 5.8 K and 8.6 K.

End-Cap LAr and Tile Calorimeters

The end-cap calorimeters on side C were assembled in the cavern by the end of January 2006; the end-cap on side A followed in May 2006. Main LAr activities and plans for the end-caps:

- EC-A: installation of FE electronics since August (no LVPS yet); start of cool-down in November 2006; start of cold operation in February 2007.
- EC-C: installation of FE electronics since April, after which the effort switched to EC-A; start of cool-down in February 2007; start of cold operation in April 2007.

(Figure: the completed end-cap calorimeter side C, just before insertion into the detector.)

Calorimeter electronics

The installation of the LAr front-end (FE) electronics on the detector, as well as of the back-end (BE) read-out electronics in the control room, is proceeding according to plan (all production is very close to being finished). A major concern remains the in-time availability and the reliability of the low-voltage and (to a lesser extent) high-voltage LAr power supplies. For the Tile Calorimeter, a control problem with the low-voltage supplies has been understood, and a corrective action is being implemented (but it impacts commissioning). Both issues have been addressed in detail with the LHCC referees. (Figures: the LAr barrel ROD system in USA15; a LAr FE crate.)

Detailed commissioning work has started; some examples:

- Muons in cosmics runs, with a first high-energy ionization signal at S/B = 11
- Noise studies: day-to-day work to track coherent noise; no increase of coherent noise when the solenoid field is on
- Calibration-pulse studies: amplitude stability over one month

(Figures: correlation between the LAr middle and front layers; event display from the first LAr + Tile Calorimeter barrel cosmics run.)

Muon Spectrometer Instrumentation

The Muon Spectrometer is instrumented with precision chambers and fast trigger chambers:

- Precision chambers: MDTs in the barrel and end-caps; CSCs at large rapidity for the innermost end-cap stations
- Trigger chambers: RPCs in the barrel; TGCs in the end-caps

A crucial component for reaching the required accuracy is the sophisticated alignment measurement and monitoring system. At the end of February 2006 the huge and long effort of series chamber production at many sites was completed for all chamber types.

The installation extrapolation assumes 3.7 chambers per day. The main problem has been access and crane availability, more than chamber availability or the actual installation of the chambers on the rails. Many detailed small problems need to be solved inside the detector when moving the chambers to their final positions: services out of envelope, poor access, scaffolding in front of the chambers, etc.
Barrel muon chamber installation: almost 80% installed today.

August 2006 saw the first combined MDT + RPC + Tile Calorimeter cosmic-ray muon run, with the RPC trigger on sector 13.

Assembly of End-Cap Big Wheel sectors in Hall 180

Assembly progress in 2006:

- Sectors for TGC-1-C: completed by April 7 (~10 days/sector in 2006)
- Sectors for MDT-C: completed by May 23 (~12 days/sector in 2006)
- Sectors for TGC-2-C: completed between May 1 and August 29 (7 days/sector over most of the assembly period)
- Sectors for MDT-A: finished within a few weeks; TGC-3-C well advanced

In total, for both sides, there are 6 TGC and 2 MDT Big Wheels, requiring 72 TGC and 32 MDT sectors. The first TGC Big Wheel was assembled in the cavern in early September 2006.

ATLAS Trigger/DAQ data flow

(Summary of the data-flow diagram, spanning UX15, USA15 and the surface room SDX1.) Data from the ATLAS detector pass through the Read-Out Drivers (RODs) and over 1600 read-out links into the Read-Out Subsystems (ROSs, ~150 PCs), which store the data of events accepted by the first-level trigger: 100 kHz, with 1600 fragments of ~1 kByte each. The Timing, Trigger and Control (TTC) system distributes the trigger over dedicated links, and the RoI Builder passes the Regions of Interest to the LVL2 supervisors. The second-level trigger farm pulls partial event data at up to 100 kHz; under the DataFlow Manager, the Event Builder (SubFarm Inputs, SFIs) assembles full events at ~3 kHz, and the Event Filter (EF) farm reduces this to an event rate of ~200 Hz, written through the SubFarm Outputs (SFOs) to local storage and on to the CERN computer centre. The farms (~500 and ~1600 dual-CPU nodes) are interconnected by Gigabit Ethernet network switches.

Level-1

The level-1 system (calorimeter, muon and central trigger logics) is in the production and installation phases for both hardware and software. The muon trigger sub-system faces a very tight schedule for the on-chamber components, as reported before, but is proceeding satisfactorily.

LVL1 calorimeter trigger

Installation in the underground counting room is in progress: cabling, patch panels, tests with test-pulse signals from the calorimeters, etc., as well as integration with DAQ, HLT and the LVL1 CTP. Full-crate tests of pre-production modules are almost completed. The Preprocessor and ROD modules are the most schedule-critical items; most modules are now in production. (Figures: a Tile Calorimeter test-pulse signal recorded through the LVL1 Pre-processor; a Pre-Processor crate (1/8) and a Cluster Processor crate (1/4); analogue signal cables in USA15.)

LVL1 muon trigger

Barrel trigger sector 13: the extrapolation of RPC cosmic-ray tracks to ground level reproduces the ATLAS shafts. TGC detectors with on-detector trigger electronics are in the cavern. The trigger rate of ~60 Hz is consistent with a simulation of cosmic rays in the corresponding configuration.

HLT/DAQ/DCS

The High Level Trigger (HLT), Data Acquisition (DAQ) and Detector Control System (DCS) activities have continued to proceed according to plan. Large-scale system tests, involving up to 800 nodes, have further demonstrated the required system performance and scalability; scalability is particularly important for staging needs during the initial running of ATLAS. A major emphasis was put on all aspects of the HLT and DAQ software developments. Components of the DCS are in fabrication or already finished (ELMB) and are already widely used, and the software components are available; the DCS is one of the first systems already in operation at Pit-1. (Figure: an example of performance optimization.)

Installation and commissioning: Read-Out System

All 153 ROSs are installed and commissioned standalone. Each ROS PC is equipped with the final number of ROBIN cards (700 in total, including spares). 44 ROSs are connected to RODs and fully commissioned: these cover the full LAr barrel, half of the Tile Calorimeter and the CTP. Data are taken regularly with the final DAQ, with event building at the ROS level using the control network. Commissioning of the other detector read-outs is driven by the ROD installation; most of it is expected to be completed by the end of 2006. (Figure: a ROBIN card.)

DAQ/HLT pre-series system

The pre-series system at Point 1 continues to be used extensively for measurements, assessment and validation. HLT algorithms have started to be used as well, thanks to substantial progress in the complex software-integration process, using physics data-sets pre-loaded into the ROSs: e/gamma, muon, tau and jet algorithms have been integrated online for the first time. 24-hour DAQ/HLT runs are regularly organised, using the full chain as if it were an ATLAS run; they force a focus on operational issues, increase expertise, and reveal problems not seen in sub-system testing. Extremely valuable!

Installation and commissioning: SDX1 (surface HLT/DAQ room)

The final system totals ~100 racks with ~2500 highest-performance multi-core PCs. The first 50 machines of the Event Builder and HLT infrastructure are being installed; the first 4 HLT racks (~120 computing nodes) will follow in early 2007.

Construction issues and risks (Top-Watch List)

A list of these issues is monitored monthly by the TMB and EB, and it is publicly visible on the Web, including a description of the corrective actions undertaken. (LHCC milestones evolution.)

ATLAS Installation Activities (Working Schedule)

- Beam pipe in place at the end of August 2007
- Restricted access to complete the end-wall muon chambers, and global commissioning, until November 2007
- Ready for collisions from November 2007

Commissioning plans (overview): integration of the experiment

Global aim: ATLAS operational in summer 2007. First milestone: an initial ATLAS core operational in fall 2006. Participants: the barrel calorimeters (with at least a minimal geometry), DAQ, central DCS, online databases, the control room, and a common trigger using TTC, LTP and CTP. Additional ingredients: the monitoring system (including combined monitoring), a cosmics trigger for real particles in the detector, and offline analysis.

ATLAS forward detectors

Under development since the encouragement that followed the LoI to the LHCC:

- Roman Pots: absolute luminosity measurement
- LUCID: Cherenkov-light luminosity monitor

An LoI for a Zero Degree Calorimeter (ZDC), instrumenting the TAN for heavy-ion physics and beam tuning (with working contacts to LHCf), is to be submitted to the LHCC after the internal review is concluded (aim: February 2007). Future evolutions, to pass through ATLAS first and then the LHCC: integration of the so-called FP420 effort (its ATLAS participants) into the ATLAS forward-detector and physics programme. Note: the ATLAS forward-detector and physics efforts are treated as an integral part of ATLAS in all respects.

ATLAS organization to steer R&D for upgrades

ATLAS has put in place a structure to steer its planning for future upgrades, in particular for the R&D activities needed for possible luminosity upgrades of the LHC (SLHC). The main goals are to:

- Develop a realistic and coherent upgrade plan addressing the physics potential
- Retain detector experts in ATLAS with challenging developments besides detector commissioning and running
- Cover less attractive (but essential) aspects right from the beginning

The organization has two major coordination bodies:

- The Upgrade Steering Group (USG), existing since June 2004, with representatives from the systems, software, physics, and the relevant Technical Coordination areas
- The Upgrade Project Office (UPO), a new body fully embedded within the Technical Coordination

Areas to be addressed by the Upgrade Project Office: overall mechanical design, drawings and layout control; reviews and R&D follow-up; planning of services; electronics coordination; installation scenarios and scheduling; radiation, shielding and activation; and the interface to the machine. Engineers and technicians in the project office are expected to be part-time active in ATLAS operations. Work packages will be defined to be taken up by groups outside of CERN (under project-office coordination).

ATLAS SLHC R&D projects: a reviewing and approval procedure is in place; the first proposals have been internally approved, and others are in the pipeline. There is good communication with the CMS upgrade studies, to benefit from common approaches. However, there is no ambiguity: the ATLAS priority is to complete, commission and exploit the TDR detector!

ATLAS Computing Timeline

- 2003: POOL/SEAL release (done)
- ATLAS release 7 (with POOL persistency) (done)
- LCG-1 deployment (done)
- ATLAS completes Geant4 validation (done)
- ATLAS release 8 (done)
- DC2 Phase 1: simulation production (done)
- DC2 Phase 2: intensive reconstruction (done)
- Combined test beam (barrel wedge) (done)
- Computing Model paper (done)
- Computing Memorandum of Understanding (done)
- ATLAS Computing TDR and LCG TDR (done)
- Start of Computing System Commissioning (in progress)
- Physics Readiness Documents (re-scheduled: early 2007)
- Start of the cosmic ray run. GO!
Computing and software

The computing and software suite has progressed on a very broad front, with a particular emphasis on making it as accessible as possible to the user community. Examples: GRID production tools; software infrastructure; detector description and graphics; the framework and Event Data Model; simulation; tracking (ID and Muons) and calorimeters (LAr and Tiles); databases and data management; reconstruction and physics-analysis tools; distributed analysis.

Computing System Commissioning (CSC) proceeds along sub-system tests with well-defined goals, preconditions, clients and quantifiable acceptance tests. Examples: the full software chain, from generators to physics analysis; Tier-0 scaling; calibration and alignment; the trigger chain and monitoring; distributed data management; distributed production (simulation and re-processing); (distributed) physics analysis; and a general rehearsal of the TDAQ/Offline data flow and analysis.

ATLAS computing is fully embedded in, and committed to, the WLCG framework. Special issues have been addressed in task forces; examples are the luminosity-block structure and the data-streaming model.

Example 1: daily production jobs over the past couple of months

Production for software validation and CSC physics samples. Some statistics from June until now: over 50 million events produced, with shares of 59% on the EGEE grid, 13% on NorduGrid and 28% on OSG.

Example 2: data-flow tests over the past few months

DDM operations, Tier-0 to Tier-1s: data flow to 9 Tier-1s; no direct data flow from the Tier-0 to Tier-2s (ATLAS Computing Model); NorduGrid is still to be integrated into the Distributed Data Management (DDM) system. Total data copied so far: 1.6 PB (1 PB = 10^15 bytes). DDM is critical, and needs full functionality urgently.

Operation Model (organization for LHC exploitation)

(Details can be found at http://uimon.cern.ch/twiki//bin/view/Main/OperationModel)

- Executive Board: the ATLAS management (SP, Deputy SPs, RC, TC); collaboration management, experiment execution, strategy, publications, resources, upgrades, etc.; Publication Committee, Speaker Committee, CB
- Detector Operation (Run Coordinator): detector operation during data taking, online data quality, ...
- Trigger (Trigger Coordinator): trigger data quality, performance, menu tables, new triggers, ...
- Data Preparation (Data Preparation Coordinator): offline data quality, first reconstruction of physics objects, calibration and alignment (e.g. with Z → ll data)
- Computing (Computing Coordinator): core software, operation of offline computing, ...
- Physics (Physics Coordinator): optimization of algorithms for physics objects, physics channels
- (Sub-)systems: responsible for the operation and calibration of their sub-detector, and for sub-system-specific software

The Data Preparation activity is now starting within the context of the Operation Model.

Example of preparations towards the physics exploitation: the Calibration Data Challenge (CDC)

- Obtain the final set of corrections, alignment and calibration constants
- Compare the performance of the as-installed, mis-aligned detector after calibration and alignment with the nominal (TDR) performance
- Exercise the (distributed) infrastructure: conditions database, bookkeeping, etc.
- A blind test: learn how to do analysis without a-priori information
- A 24-hour latency test: calibration constants for the first-pass data reconstruction at the Tier-0

The chain uses the geometry of the as-installed, mis-aligned detector and a G4 simulation of calibration samples (O(10M) events, e.g. Z → ll), followed by reconstruction pass N (Release 13, February 2007) and analysis; the calibration and alignment constants from pass N go into the conditions database, alongside the constants from pass N-1. Pass 1 assumes perfect calibration and alignment and nominal material, in the current release.

Looking further ahead: the Dress Rehearsal

A complete exercise of the full chain, from the trigger to (distributed) analysis, to be performed in 2007, a few months before data taking starts. The goals are ambitious, and the exercise needs to be planned carefully, both in terms of the effort needed and of the technical issues and implications. Some details for experts:

- Generate O(10^7) events: a few days of data taking, ~1 pb-1
- Filter events at the MC-generator level to obtain the physics spectrum expected at the HLT output
- Pass the events through the G4 simulation (realistic as-installed detector geometry)
- Mix events from the various physics channels to reproduce the HLT physics output
- Run the LVL1 simulation (flag mode)
- Produce byte streams, i.e. emulate the raw data
- Send the raw data to Point 1, pass them through the HLT nodes (flag mode) and the SFOs, and write out the events by streams, closing files at the boundaries of luminosity blocks
- Send the events from Point 1 to the Tier-0
- Perform calibration and alignment at the Tier-0 (also outside?)
- Run reconstruction at the Tier-0 (and maybe the Tier-1s?) and produce ESDs, AODs and TAGs
- Distribute the ESDs, AODs and TAGs to the Tier-1s and Tier-2s
- Perform distributed analysis (possibly at the Tier-2s) using TAGs
- MC truth is propagated down to the ESDs only (no truth in the AODs or TAGs)

The 2007 run

Physics Coordination has started to address the goals of the 2007 run, assuming a 30% data-taking efficiency (machine plus detector) and including the trigger and analysis efficiencies: start commissioning the triggers and detectors with LHC collision data (minimum bias, jets, ...); maybe first physics measurements (minimum bias, underlying event, QCD jets, ...)? Observe a few W → lν, Z → ll, J/ψ?
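The event-streaming step of the Dress Rehearsal described above (write events out by stream, closing files at luminosity-block boundaries) can be sketched in a few lines. This is purely illustrative and not the ATLAS SFO implementation; all names (`Event`, `write_by_stream`, the file-naming scheme) are invented for the sketch:

```python
# Illustrative sketch (not ATLAS code): route events to one "file" per stream,
# closing the current file whenever the luminosity block changes.
from collections import namedtuple

Event = namedtuple("Event", ["stream", "lumi_block", "payload"])

def write_by_stream(events):
    """Return a list of (filename, payloads) closed at lumi-block boundaries."""
    open_files = {}   # stream -> (current lumi block, buffered payloads)
    closed = []       # finished (filename, payloads) pairs
    for ev in events:
        lb, buf = open_files.get(ev.stream, (ev.lumi_block, []))
        if lb != ev.lumi_block:               # lumi-block boundary: close file
            closed.append((f"{ev.stream}_lb{lb}", buf))
            lb, buf = ev.lumi_block, []
        buf.append(ev.payload)
        open_files[ev.stream] = (lb, buf)
    for stream, (lb, buf) in open_files.items():  # flush still-open files
        closed.append((f"{stream}_lb{lb}", buf))
    return closed

evts = [Event("egamma", 1, "e1"), Event("muon", 1, "m1"),
        Event("egamma", 2, "e2"), Event("muon", 2, "m2")]
for name, payloads in write_by_stream(evts):
    print(name, payloads)
```

The key design point, as in the report, is that file boundaries are aligned with luminosity-block boundaries, so each output file corresponds to a well-defined integrated luminosity.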
At √s = 900 GeV the interaction rate is ~10 kHz. The expected samples include jets with pT > 15, 50 and 70 GeV; W → eν, Z → ee and J/ψ events; and millions of minimum-bias events per day (b-jets: ~1.5%).

Cost to Completion, and the initial staged detector configuration

As a reminder from previous RRB meetings: the Cost to Completion (CtC) is defined as the sum of the Commissioning and Integration (C&I) pre-operation costs plus the Construction Completion (CC) costs, in addition to the deliverables. The following framework was accepted at the October 2002 RRB (ATLAS Completion Plan):

- CtC: 68.2 MCHF (the sum of CC = 47.3 MCHF and C&I = 20.9 MCHF)
- Commitments from Funding Agencies for fresh resources (category 1): 46.5 MCHF
- Further prospects, but without commitments at this stage (category 2): 13.6 MCHF

The missing resources, 21.7 MCHF, have to be covered by redirecting resources through staging and deferrals. The funding situation will be reviewed regularly at each RRB, and is expected to evolve as further resource commitments become available. The physics impact of the staging and deferrals was previously discussed in detail with the LHCC. It has to be clearly understood that the full potential of the ATLAS detector will need to be restored for the high-luminosity running, which is expected to start only a very few years after the turn-on of the LHC and to last for at least a decade.

Updated Cost to Completion estimates

The RRB was informed at the April 2006 meeting that the ATLAS management was re-evaluating the financial situation and its evolution since the CtC estimates accepted in October 2002. The situation is that new over-costs are projected at the level of 4.4 MCHF for the completion, over the 68.2 MCHF estimated in 2002. Further delays in installation work beyond August 2007 would require additional resources for manpower (of order 200-400 kCHF per month). Some corrections to the initial CtC estimates are required in the areas of the magnet system, the LAr cryogenics, and the infrastructure and installation activities (manpower to meet the schedule). (The per-system table of over-runs is largely lost in extraction; the recoverable entries are: the Magnet system; Technical Coordination, including the Muon Big Wheels at 1.39 MCHF, not initially part of Technical Coordination, and the installation manpower effort at Point 1, where the workforce was not available from CERN and the Institutes; and the LAr CC project, largely due to the engineering contracts.)

Main funding issues today

- There are outstanding contributions to the baseline and Common Fund at risk: 9 MCHF
- Furthermore, not all of the calculated 2002 CtC (CC and C&I) shares have been pledged; in fact the situation only looks quite good because CERN has committed 5 MCHF more than its calculated share: 11 MCHF

The strategy proposed to the RRB to cover the remaining funding gap, including the new CtC:

1) Expect all outstanding baseline and Common Fund contributions according to the Construction MoU
2) Urge all Funding Agencies to pledge their full CtC share as determined in October 2002; as CERN has committed 5 MCHF above its calculated share, this would cover the new 4.4 MCHF of additional CtC costs
3) As a fallback, extend the annual member fee for one or two more years (2007 and 2008); the present budget request for 2007 includes this as an option, to be decided by the RRB at its April 2007 meeting, should it become necessary

Clearly, a strong solidarity from all funding partners is needed to overcome this last financial hurdle!
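The Completion Plan figures quoted above are easy to cross-check; a minimal arithmetic sketch (all values in MCHF, taken directly from the text):

```python
# Cost-to-Completion framework accepted at the October 2002 RRB (MCHF).
cc  = 47.3   # Construction Completion
ci  = 20.9   # Commissioning & Integration pre-operation costs
ctc = cc + ci
assert abs(ctc - 68.2) < 1e-6       # CtC = 68.2 MCHF, as stated

committed = 46.5                    # category 1: committed fresh resources
prospects = 13.6                    # category 2: prospects, no commitment yet
missing   = ctc - committed         # gap covered by staging and deferrals
assert abs(missing - 21.7) < 1e-6   # 21.7 MCHF, matching the report

print(f"CtC = {ctc:.1f} MCHF, not covered by commitments = {missing:.1f} MCHF")
```

Note that the 21.7 MCHF gap is computed against the committed category-1 resources only; the report deliberately excludes the 13.6 MCHF of category-2 prospects, since they carry no commitment.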
Status of the Cost to Completion funding: see the table in the accompanying CERN-RRB status document.

Financial Overview

Financial framework:

- Initial Construction MoU: MCHF
- Updated construction baseline: MCHF
- Additional Cost to Completion (accepted at the October 2002 RRB): 68.2 MCHF, based on the Completion Plan
- Additional CtC identified (mentioned at the last RRB, and now announced): 4.4 MCHF
- Total costs for the initial detector: 541.1 MCHF

Not included in the above:

- The total assumes beam-pipe closure at the end of August 2007; later dates would imply additional manpower costs of 200-400 kCHF per month
- No provision for future force-majeure cost overruns
- Restoration of the design-luminosity detector: estimated material costs of parts not included in the present initial detector, 20 MCHF
- Forward-detector parts (luminosity), not funded yet: 1 MCHF

Missing funding at this stage:

- Baseline Construction MoU, mainly Common Fund: 9 MCHF
- 2002 Cost to Completion (CC and C&I) calculated shares: 11 MCHF
- No funding mechanism established yet for the new CtC (4.4 MCHF); it is proposed at this RRB that this be covered by the +5 MCHF CERN CtC pledge of 2002, or by extending the ATLAS member fee by two more years

Conclusions

The ATLAS project is proceeding within the framework of the accepted 2002 Completion Plan, and all the resources requested in that framework are needed now to complete the initial detector. Many important milestones have been passed in the construction, pre-assembly, integration and installation of the ATLAS detector components. The most critical construction issue is the delay in the ECT integration (as will be presented by Marzio Nessi), which has an impact on the overall installation completion; other issues remain the schedules for the ID and Muon end-cap chamber installations, and the calorimeter power supplies.

Very major software, computing and physics-preparation activities are under way as well, using the Worldwide LHC Computing Grid (WLCG) for distributed computing resources. Commissioning, and planning for the early physics phases, have started strongly.

ATLAS is highly motivated, and on track, for first collisions in 2007 and LHC physics in 2008. ATLAS expects to remain at the energy frontier of HEP for the next 10-15 years, and the Collaboration has already put in place a coherent organization to evaluate and plan for future upgrades, in order to exploit future high-luminosity upgrades of the LHC machine. (Informal news on ATLAS is available in the ATLAS eNews letter.)