
BABAR Commissioning

Gerard Bonneaud, Ecole Polytechnique

BABAR Collaboration Week

July 13th - 17th, 1998

Outline

1. BABAR Key dates (Bell Chart)

2. Commissioning Workshop

3. Integration

4. Cosmic Period and Post-Cosmic Activities

5. Conclusions

BABAR Key Dates

Installation and Initial Check-Out
Individual system pre-commissioning after installation (Bell Chart for subdetectors - Software Chart for Online). All subdetectors (except SVT) installed and initial check-out completed at the end of October '98.

Commissioning with Magnet-On - Cosmic Period: Nov. - Dec. '98.

Preparation for Beam Running - Roll-On Period: from beg. of January '99 until end of March '99 (installation of SVT into BABAR).

Beam Data: from beg. of April '99.

[Bell Chart: per-system installation and check-out milestones (DIRC, SVT, calibration systems), with dates running from 21.Jul.98 through 03.Aug.98 and SVT ready on 07.Jan.99.]

Commissioning Workshop, SLAC, July 8th

1. Session #1: System Tests and Initial Check-Out - Vera Luth

The Initial Check-Out follows the installation in IR-2. The aim of the session was to reassess the readiness of the individual systems for data taking at the completion of the Initial Check-Out phase.

* Online - Neil Geddes
* SVT - David Kirkby
* DCH - Dave Coupal
* DRC - Dave Brown
* EMC - Jordan Nash
* IFR - Nicola Cavallo
* TRG - Usha Mallik
* Discussions

2. Session #2: Integration - Andy Lankford

Transition from System Tests and Initial Check-Out to operation of the detector / experiment as a whole.

* Hardware - Walt Innes
* Software
  - Intro + Overview - Neil Geddes
  - Detector Controls - Gerry Abrams
  - Dataflow - Mike Huffer
  - OEP - Gregory Dubois-Felsmann
* Discussions


3. Session #3: Cosmics - Harvey Lynch

Operation of BABAR as a full detector of multiple systems working together.

* RECO - Bob Jacobsen
* SVT - Brad Abbott
* DCH - Gerhard Raven
* DRC - Guy Wormser
* EMC - Helmut Marsiske
* IFR - Nani Crosetti
* TRG - Usha Mallik
* Discussions

4. Session #4: Post-Cosmic Activities - Tom Glanzman

During Roll-On but before Beam data taking.

* Roll-On - Bob Bell
* Installation of the SVT - Leroy Kerth
* Discussions

Follow-up of the Online Workshop of May 28th, with emphasis on

* improve comprehension of system readiness for data taking at completion of initial check-out (improve flow of information between systems on specific developments, e.g. GUIs, and share experience);

* detail Online planning;

* address the crucial issue of integration of the detector systems with the Online system;

* inputs from all systems (Online included) for the Oct. - Dec. period: help to define reasonable, achievable goals for the cosmic period;

* start to understand how to use the Roll-On period for system activities.

(System = SVT, DCH, DRC, EMC, IFR, TRG, ONLINE)

Integration (A. Lankford)

Transition from system tests and initial checkout to operation of the detector / experiment as a whole.

"...detector systems will integrate core online software components with their detector-specific online code in the context of their single-subsystem detector and electronics teststands..."

1. Installation of central / common electronics components (see Vera's talk)

* General comment: infrastructure is being installed in IR2, albeit slowly; however, generally in time for needs.

* Action item: electronics subsystem leaders should examine the tables of data acquisition and fast control hardware allocation for consistency with their subsystem plans. On the Web: Electronics -> Common Electronics: "hardware needs and availability".

2. Online Integration

As each detector system is installed in IR2, its data acquisition hardware and online software will be integrated serially into the Online System as a whole.

* this integration into the Online System will demand some dedicated time: around one week per system;

* this integration should take place promptly after operation of the system's electronics has been reestablished, in order to identify integration problems early;

* after integration, the system can operate on its own by using partitioning;

* some of this integration may occur in parallel with other activities;

* however, integration of subsequent systems may temporarily disrupt systems that are already integrated.


Action items:

1. The installation and initial checkout schedule, which has largely been set by mechanical and electrical installation, needs to explicitly incorporate time for integration.

On one side, the online system contacts must ascertain that their plans to integrate core online software components with their detector-specific code provide for complete integration before installation of their detector system in IR2.

On the management side, activities that cover integration of detector systems into the Online System have to be incorporated into the installation and initial checkout schedule.

2. The "initial checkout", or further developments, of Online Software must be incorporated into the schedule; human resources to perform integration of detector systems into the Online System must also be considered in redrafting the schedule.

3. From October onwards, systems are all parts of the Online System, and resource conflicts will need to be settled by the "run coordinator".

4. The Online System itself will need high priority during this period in order to complete its development.

The October Period

Completion of the initial checkout of the barrel calorimeter

Initial checkout of the DRC

Initial checkout of the Forward calorimeter

Cosmic data taking of the DCH without magnet

Continuation of the initial checkout of the IFR

Needs from the Online Core Software ...

* Miniplenary Session Thursday July 16th, 9:00 am

Integration - Detector Control Perspectives (G. Abrams)

Core system has been designed to facilitate integration of detector-specific subsystems.

+ significant preparation required of each subsystem before integration can proceed.

Miniplenary session Monday July 13th, 4:30 pm (DRC detector control system)

Tiered Structure of ODC - The Homogeneous Online Architecture

Common GUI vision.

* The (Receding) Dream - ODC in the Integrated Online: fully functional.

* The Deliverable - ODC, Integrator's View: a grab bag of tools, packages, misc. code, GUIs, ... stitched together; basic monitor/control, standalone systems, hardware development, checkout.

* The Foundation - ODC, Developer's View: common hardware, code management, EPICS utilities, docs, etc., Generic Component Proxy.

Infrastructure

* PEP-II
  - Data from PEP-II EPICS IOCs delivered to BaBar (beam intensity, energy, lifetime) - done, April '98
  - Data from PEP database delivered to BaBar (magnets, valves, etc.) - done, July '98
  - PEP FSM delivered to BaBar (injecting, beam aborted, etc.) - done, July '98
  - Noise monitoring from BaBar to PEP (for beam tuning) - due Sept. '98
  - PEP - BaBar joint FSM (injection permissive - Injectable, etc.) - due Sept. '98

* Magnet (this includes the BaBar solenoid, bucking coil, liquifier, cryogenic plant, magnet doors, power supplies, hall probes, ...)
  - Conversion of the Ansaldo controls to BaBar standard
  - Access to the Allen-Bradley data via EPICS for monitoring
  - Incorporation of magnet alarms and warnings into BaBar

This work is in progress now, and will be completed this summer.

Status of the Subsystems

* The spectrum runs

From:
  - Basic design still in progress
To:
  - Final display features in last stages of development

Examples:
  - Still developing engineering screens for hardware checkout - production monitoring and control screens not yet on the drawing board
  - System design now complete; first system prototype now under construction for July test
  - Installation has commenced: EPICS database, screens, sequencers under development
  - System test has started; production monitoring & control being exercised

* EPICS readiness
  - 3.13 migration (still!) imminent (hidden from application developers)
  - Hardware drivers (I/O) all operational
  - Component Proxy, Archiver ready for system users
  - Proxy ready for amalgamation with Run Control

Tasks for the Subsystems

* Complete their system hardware specification
  - Still do not have all crates, modules specified
  - Not all electrical connections for monitoring (or safety!) are specified
  - Have all safety-related hardware identified, and readied for earliest installation

* Complete their EPICS applications
  - EPICS databases (control blocks)
  - Sequencers (finite state machines)
  - Screens (data display)
  - Prepare for the integration with Run Control (ORC) (IOCs should prepare a Runnable flag if appropriate)
  - Prepare data set(s) for downloading to fulfill the configure transition

* Specify their Components - each component is a UNIX process which owns a collection of EPICS channels joined together for control via ORC and for archiving; a toy sketch follows below
  - Write the cdev ddl to connect EPICS channels to the Proxies
  - Determine data transforms (for the Archiver)
  - Write the class to describe the transient data object
  - Determine readout rates and archiving rates (for all channels)
  - Determine an overall Runnable flag for transmittal to ORC
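To make the Component notion concrete, here is a minimal toy sketch assuming nothing beyond the description above: a process owning a set of channels, reading them out, and deriving one overall Runnable flag for Run Control. All class and channel names are hypothetical, and plain Python stands in for the real stack (EPICS channels, cdev ddl bindings, the Component Proxy and ORC).

```python
# Toy sketch of an ODC "Component" -- hypothetical names throughout.
# The real system read EPICS channels through cdev and the Component
# Proxy and reported to ORC; none of that machinery appears here.

class Channel:
    """Stand-in for one monitored channel: a name plus alarm limits."""
    def __init__(self, name, low, high):
        self.name, self.low, self.high = name, low, high
        self.value = None

    def read(self):
        # A real component would read the IOC; this toy fabricates a
        # mid-range value so the example runs.
        self.value = 0.5 * (self.low + self.high)
        return self.value

    def ok(self):
        return self.value is not None and self.low <= self.value <= self.high


class Component:
    """A process that owns a collection of channels joined together for
    control and archiving, and reports a single overall Runnable flag."""
    def __init__(self, name, channels, archive_period_s=10.0):
        self.name = name
        self.channels = channels
        self.archive_period_s = archive_period_s  # per-component archiving rate

    def readout(self):
        return {ch.name: ch.read() for ch in self.channels}

    def runnable(self):
        # Overall flag: every owned channel must be within its limits.
        return all(ch.ok() for ch in self.channels)


if __name__ == "__main__":
    comp = Component("SVT:cooling", [Channel("SVT:T1", 15.0, 25.0),
                                     Channel("SVT:T2", 15.0, 25.0)])
    print(comp.readout())
    print("Runnable:", comp.runnable())
```

The point of the aggregation is that Run Control sees one flag per component, not one per channel.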

What’s Left Out - Anything ?

e Ambient data handlingArchive- Data browser- Framework examples to study event and ambient data

correlations

Real-time processing (l/2 hour latency ?)- Component snapshot display-. Analysis processes (variability studies)- Quality control processes

l Data validationl Data comparison with standardsl Data library with standard distributions

l Setpoint data handling- Editing facility for construction of new data sets- Management facility for keying datasets

BaBar Controls

The Controls Group consists of a core development team -

LBNL - G. Abrams, C. Lionberger, S. Lewis
SLAC - S. Allison, P. Anthony

working closely with the principal developers from the detector systems -

SVT - D. Kirkby
DCH - M. Morandin
DRC - G. Vasileiadis
EMC - T. Meyer
IFR - P. Paolucci

(plus many, many others).

The BaBar Controls System is responsible for the monitoring and control of systems and devices installed in IR-2. Monitoring data is permanently recorded so that a detailed history of data-taking conditions is available to check against the quality of the physics data.

Infrastructure support is provided to the BaBar Safety System, with early notification of off-normal conditions via alarms and warnings, and hardware interlocks to support fail-safe response to hazardous conditions.

Much of the Controls hardware and software is contributed directly by the detector systems. The core group provides common code and source distribution, as well as the identification of standards, for both hardware and software, to be obeyed by the collaboration.


Global View of BaBar Controls

[Diagram - Env. data path: systems on this side will operate even if the rest of the Online is not active.]


Controls Hardware

Monitoring Hardware -
  Sensors and actuators
  Signal conditioning
  Signal digitization
  Signal transport (busses, networks)
  Signal readout in VME

Standard boards -
  SIAM for interlocks
  VSAM for digitization
  GM6 for remote readout and transport via CAN bus

Computers -
  Single Board Computers in VME (real-time OS VxWorks)
  Control consoles (UNIX workstations)
  Boot server

Mass storage for controls data

Note: this hardware is provided by a variety of sources, notably from the detector systems and from SCS. The key task for the Controls Group is to mold this hardware into a cohesive, unified system.


Controls Software

EPICS - Experimental Physics & Industrial Control System
  Device drivers and other device support
  Control blocks (the EPICS "database") to monitor and control devices
  Screen building for control panels and data visualization
  Sequencers to implement finite state machines
  Alarm handler to issue audible and visual alarms and warnings

Component Proxy
  Interface to the VME SBC for data readout, configuration data download, and response to Run Control. The proxies provide the gateway within the UNIX environment to monitor and control detector systems and devices.

Data Storage and Retrieval
  Transient data objects - created from EPICS reads (normal flow) and from persistent data stored in OBJY (download, as for reconfiguring)
  Data display - browsers
  Data analysis - histogrammer, etc.


Additional Software (mostly)

Hardware database -
  ORACLE database for BaBar hardware supports maintenance of the detector:
    web browser for inventory control
    repair log

Occurrence handler -
  incorporation of the message handler cmlog from CEBAF; maintenance of the server

Operational Safety -
  Displays and operator notification of the status of the safety system
  Hardware Summary Alarm Panel (a part of the safety system itself)
  Summary display for hardware interlocks (including defeated channels) (EPICS)
  EPICS Alarm Handler - audible and visible alarms and warnings, alarm tree


External Dependencies for BaBar Controls

Detector Systems Provide
  Sensors and signal conditioning
  Readout into VME
  Application development in EPICS (screens, etc.)
  Connection to EPICS Alarm Handler
  Data stream to/from Objectivity database for environmental data storage, configuration data

Database Developers Provide
  OBJY - classes for storage and retrieval
    management of stored data in modes efficient for retrieval
    management of configuration data (incl. keys and browsers to select and save configs)
  ORACLE - setup, forms, browsers

Online System Provides
  Access to environmental and configuration data: browsers, APIs, transport mechanisms for distributed objects, GUI to facilitate changes in configuration data (forms interface to the config db)
  Data display and analysis tools
  Run Control commands for coordination with the Online finite state machine


Integration - Odf Perspectives (M. Huffer)

The IR2 platform will be built incrementally as systems arrive with their crates and electronics.

Systems should integrate with the IR2 platform as early as possible in order to start gaining experience living in the final environment.

Miniplenary Session Monday July 13th, 4:30 pm (DCH Integration in LAB)

A separate Odf core development platform will exist in the central lab in order to minimize the disruption of system access to subsystems of the IR2 platform.


Integration - Oep Perspectives (G. Dubois-F.)

Integration involves establishing software compatibility... Consequently, integration of systems with Oep is already underway.

Integration with Oep will evolve from basic single-subsystem teststands to single-subsystem teststands with full Oep support, and then to integration into "quasi-production" large-scale executables.

Multinode farm operation will be transparent to systems.

Miniplenary Session Tuesday July 14th, 4:00 pm.

Integration - Conclusions

* Online System has been designed for easy integration;

* however, preparation of the detector subsystems via their teststands is essential;

* availability of core online software was a problem in the past, but is not a problem now. Nonetheless, the core online code is cumbersome at present.

Summary of Cosmic Ray Run Session of the Commissioning Workshop

8 July 1998, Harvey Lynch

Introduction

* Get WHOLE detector (hardware + software) working in preparation for running with e+e- collisions

* Initial Checkout and Cosmic Ray Run are 2 parts of the continuous Commissioning Process

* Optimal Model = no technical constraints
  - How many cosmic rays do systems want to log?
  - Why?
  - Running conditions (trigger, environment)?

* Minimal Model = agree to move onto beamline
  - Same sub-questions as Optimal Model
  - How is minimum determined?


Reconstruction Bob Jacobsen

* Prep for Cosmic Run using MDC II sample
  - Reconstruct cosmic ray data to the level needed to understand the detector during the run
  - Demonstrate integration with databases, online event processing, and Prompt Reconstruction

* List of tasks, e.g.
  - Full reco storage
  - Confirm calibrations with data
  - Simulated digis for special commissioning hardware
  - Improve CPU and memory performance

* Asked for feedback from systems
  - Big item is Graphics
  - Need more input
  - Will offer help

* Need planning
  - Try to anticipate reasonable needs
  - Be able to respond to change
  - Do not try for every possibility


* Run to find and fix problems
  - Schedule run time as if real
  - Keep record of problems, but don't stop unless necessary
  - Fix problems from list in a different period

* Queried systems on data taking needs
  - Consensus = 5 days
  - Law of Least Amazement says 2 - 10 days


Preparation for 1998 Cosmic Commissioning

This phase includes further use of the MDC II sample, particularly to gain experience with large event samples and the event store. We also have to prepare for the fall cosmic running, and for cosmics taken earlier by some systems. Chronologically, this is from June 1998 through November 1998.

Priorities are:

* Prepare to reconstruct cosmic data to the level needed to understand the detector during the run. It's important that we get as much progress as possible during the run, and that we record good data for study during the roll-in period.

* Demonstrate integration with:
  - Event and conditions databases, particularly for processing real data and using online calibrations & conditions
  - Online event processing
  - Prompt Reconstruction

These demonstrations should take place before cosmic running, so that the same operations can be tested in real time with the data. The cosmic run is our only chance to do this before real data arrives.


Start of cosmic preparation task list

Much of this is based on earlier task lists for MDC2, etc. See the web & individual system contacts for details.

* Full reconstruction storage to the event store, including demonstrating reprocessing (full and partial)

* Confirming as many calibrations as we can with cosmic data (What are these? We need to make a list by system)

* Demonstrating whatever PID distributions we can
* Develop selection modules for cosmic subsamples
* Selecting good samples for various studies
* T0 processing and checking
* Have processing and reprocessing jobs ready for cosmic data
* Prove out the prompt reco connection, including logging. This includes having each system implement a "prompt reco calibration" using standard classes, etc.


Start of cosmic preparation task list continued

* Have simulated digis for any special commissioning hardware, including scintillators, etc.
* Create selection filters for "good" cosmics of various forms
* "Detector OK" modules using conditions data
* Standard jobs for "IFR straight tracks", "DCH straight tracks", "field on tracking", "full tracking with trigger inputs"
* Support and update analysis tools that will be needed during the cosmic run, including getting documentation up to date
* Improve CPU and memory performance to the level needed for the expected cosmic data sample size
* Add event-store compression to the digi (and perhaps GHits) so that we are storing "final" format

Many of these are open-ended projects:

* Will need to improve for real data
* But we need complete coverage for cosmics before we move on


Needed now - Feedback

Graphics
* What do people want?
* Not many digis visible (instead of hits)
* Will help people draw what they want to see


The Role of Planning

Planning is preparation for coping with risk in reaching goals

* You make sure you have all the pieces in place
  "What do we want to have completed?"
  "Is it there yet?"

* And you think about how to handle likely problems
  "Let's plan to run X and Y independently"
  "What do we need to live without Z?"
  "We don't need a wooden stake & hammer in the control room"

But the problems still arise, so expect a phase change in your plans

[Dilbert cartoon]


Test/debug is a different phase

You have to run it to find the problems
* Running is usually the best way to find integration problems
* Particularly if you're running "for real"

Running is almost never the best way to fix problems
* People need time to think, experiment, negotiate, fix
* It's a mistake to skip that time

Need to keep the purpose clear at each point
* Run to find problems
* Bypass those problems temporarily to find other problems
* Work separately to fix problems in parallel

"Whiteboard model" of commissioning
* At a scheduled time, start the system & try to run for real
* Every encountered problem gets written on a whiteboard, then bypassed
* When you just can't get any further, turn the list over to the experts and schedule the next attempt
* Problems stay on the whiteboard until running demonstrates they are gone

Only attempts are scheduled in advance, not activities


Why the systems want events from the cosmic run

DCH wants lots of cosmics for dE/dx
* Justification: needs field on

IFR wants lots of cosmics for studies of multiplicity, tracking
* Justification: needs field on

EMC wants lots of cosmics for studies of energy resolution
* Yes, you guessed it
* Plus needs good tracking

Bottom line: want to have data to study during roll-on
* The less we have, the longer the startup takes
* Being more quantitative is hard

Getting 5 days' worth seems to be the consensus
* Probably worth delaying the roll-on if it's "just a couple more days"


[Dilbert cartoons on schedule slip and bug-fix estimates]

SVT Brad Abbott

* Work will be at LBL until Jan 99

* Minimal Goals
  - Can do physics (noise, gain, dead channels, etc.)
  - Safety interlocks and monitoring
  - Electronics can read out detector
  - Online can write to disk
  - 100 µm placement of Si modules

* Optimal Goals
  - Full calibration
  - Find tracks in all 5 layers
  - Measure resolution
  - Detailed study of ATOM chips
  - Need 1E5 to 1E6 events = 1 - 12 days

* Issues
  - Lost person doing the internal alignment software
  - Need type C modules for all layers by beginning of August
  - As-built drawings of support by mid-July
  - Yield of modules, ATOM chips, HDIs?


Goals for SVT cosmic ray Commissioning

Brad Abbott July 8

- Engineering and Physics

- Two models: Minimal and Optimal

- What is needed to give green light for installation

- Performed in a clean room at LBNL, so independent of other detectors.

Minimal Model Goals:

Can we do physics with SVT in BaBar? Are noise, gain, offset dispersion, # dead/noisy channels, etc. acceptable?

Are the safety interlocks and monitoring systems operating properly? (Radiation and position monitoring must be tested at IR-2.)

Are the final electronics able to read out the detector? Can we see hits in the silicon from cosmic rays?

Can the Online system successfully write the necessary data to disk?

100 µm placement of silicon modules

Is the system stable?

Are all cables properly installed?

List of dead/noisy channels. Database of errors and problems.

Detailed instructions for mounting SVT at IR-2

Lists of experts for each subsystem

Checklist

Minimum model

We can use internal charge injection -> noise, gain, dead channels, offset dispersion, etc. Verify cable connections. Database of errors.

Before the detector is connected, the interlocks can be tested. (Fluid, Voltage, Temperature)

Data and event display will allow study of silicon hits

CMM to measure location of modules to within 100 µm

3-week test in December to measure stability of system

Optimal Model Goals

Full calibration of detector: threshold scans -> chip thresholds -> cosmic data -> efficiency vs occupancy -> chip thresholds -> ... Optimize system.

For those cosmic rays which pass through all 5 layers: find tracks, measure resolution.

Align detector: exercise alignment software and verify silicon module location.

Detailed studies of ATOM chips: TOT behavior, timing issues, time to calibrate SVT, effects of # channels calibrated at the same time.

Time dependency of calibration.

Optimal Model continued

To measure resolution, high-momentum tracks are needed: σ²(multiple scattering) = σ²(resolution) at 700 MeV, and σ²(MS) ∝ 1/p².
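Spelled out (a sketch; C is an illustrative constant lumping the material and lever-arm factors, not a number from the talk): the total error has a momentum-independent intrinsic-resolution term plus a multiple-scattering term falling as 1/p, and the two contributions are equal at the crossover momentum.

```latex
\sigma^2_{\mathrm{tot}}(p) = \sigma^2_{\mathrm{res}} + \left(\frac{C}{p}\right)^{2},
\qquad
\left(\frac{C}{p_0}\right)^{2} = \sigma^2_{\mathrm{res}}
\;\Longrightarrow\;
p_0 = \frac{C}{\sigma_{\mathrm{res}}} \approx 700~\mathrm{MeV}.
```

Above p0 the intrinsic resolution dominates, which is why the resolution measurement needs the high-momentum part of the cosmic spectrum.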

Tracking code operational with real detector geometry.

Histograms of occupancy, track location, TOT, time stamp, data size, data rate, etc.

Alignment software running, and numerous tracks.

# of cosmic ray events needed

~150,000 channels

Full alignment and hits in all channels: 100,000 tracks minimum, and 1,000,000 better

Cosmic ray telescope has been built:
  2 paddles above SVT (20 x 50 cm each)
  40 cm of steel -> ~500 MeV minimum momentum
  2 paddles below SVT (24 x 50 cm each)

Data rates: 1 Hz -> 100,000 tracks in 28 hours of running; 1,000,000 tracks in 12 full days of running.

Note: only ~25% of tracks pass through all 5 layers.
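Two quick consistency checks of those numbers (back-of-envelope, not from the transparencies; ~11.5 MeV/cm is the standard minimum-ionizing muon energy loss in steel):

```latex
40~\mathrm{cm} \times 11.5~\mathrm{MeV/cm} \approx 460~\mathrm{MeV} \sim 500~\mathrm{MeV}\ \text{minimum momentum};
\qquad
\frac{10^{5}\ \text{tracks}}{1~\mathrm{Hz}} = 10^{5}~\mathrm{s} \approx 28~\mathrm{h},
\quad
\frac{10^{6}\ \text{tracks}}{1~\mathrm{Hz}} = 10^{6}~\mathrm{s} \approx 11.6~\mathrm{days}.
```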

Milestones

Complete readout chain in the beginning of August: read out two modules; allow debugging of hardware and software; all necessary components available; understand system.

Beginning of October: layers 1-2 installed on support cones. 1-week test using cosmic rays. Write out data to disk. Gain experience with system. Look at occupancy, noise, etc.

Beginning of December: detector completed. 3-week test to fully test system.

Issues

Code for internal alignment of SVT

Type C modules for all layers by beginning of August.

As-built drawing of final support structure by mid-July.

Silicon Modules / ATOM chip yields / HDIs

Drift Chamber Gerhard Raven

* Needs (min.? opt.?)
  - Time-to-distance calibration
  - t0 for each cell
    . 0.4E6 tracks
  - Cell map
    . 1E6 to map drift distance, entrance angle, z
    . 0.5E6 to map HV, threshold, gas mixture
  - Do not need horizontal cosmics; only need layer info.

* Computing Resources
  - 200 CPU hours per iteration on 1E6 tracks (0.7 sec/track!)
  - Several iterations
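The 200 CPU-hour figure is just the quoted per-track time scaled up (my arithmetic, matching the slide's round number):

```latex
10^{6}\ \text{tracks} \times 0.7~\mathrm{s/track} = 7\times10^{5}\ \text{CPU-s} \approx 195\ \text{CPU-hours} \approx 200\ \text{CPU-hours per iteration}.
```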


[DCH cosmic-ray plots, 98/06/29, run 1678 (plus an event display, run 1690 event 4, 07/03/98): residual vs. doca distributions and panels of drift time vs. doca with fits.]

DIRC Guy Wormser

* Most tests will be done in lab before installation

* Primary goal = determine number of photoelectrons per track
  - Transmission along bar
  - Uniformity of ring
  - Number of photons outside ring

* Want 2.5E6 tracks
  - Rate = 5 Hz => 5 days

* Problem: probably will not have many bar boxes during cosmic ray run
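The track count and run length quoted above tie together directly (slide numbers; the day figure is rounded):

```latex
\frac{2.5\times10^{6}\ \text{tracks}}{5~\mathrm{Hz}} = 5\times10^{5}~\mathrm{s} \approx 139~\mathrm{h} \approx 5.8\ \text{days},
```

i.e. the quoted 5 days assumes essentially continuous good-track running.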


* The MAIN goal is to have an early look at the number of photons seen per track
  - Transmission along the bars
  - Uniformity along a ring
  - Number of photons seen outside the ring

* DIRC reconstruction untouched compared to normal events

* Muon spectrum above 2 GeV -> all tracks will have saturated Cerenkov angle

* Poorer angular resolution from DC (no SVT, but larger lever arm)

* Ideally ~5000 tracks per bar (10 z points at 500 each) in several (θ,φ) bins: 50k

* 50 bars easily reachable: ~2.5M events

* Assuming 5 Hz "good tracks" rate: 5 days!

* More stringent requirements
  - Alignment, bar defects
  - How to test side bars

* DIRC will measure dip angle with better precision than the drift chamber: study drift chamber systematic effects

* T0 studies with DC, IFR

* DIRC - Calorimeter correlations on showers: converted pairs, backsplash

Calorimeter Helmut Marsiske

* Minimal model not considered

* Objectives
  - Exercise all aspects of hardware and software
  - Cable map
  - Sparsification interconnect
  - Verify trigger

* Calibration
  - Source 1 - 2 x per week
  - Light pulser "continuously"

* Diagnostics 2 - 3 hours/day

* Cosmic rays
  - Energy calibration (200 MeV MIP) => 0.5E6 tracks
  - Alignment to DC


EMC cosmic Commissioning

The cosmic ray run represents the first full system test of the EMC.

* Engineering Objectives

* Exercise all aspects of hardware and software
  - Front End
  - Dataflow
  - OEP
  - (Prompt) Reco
  - Objectivity
  - Integration and Partitioning

* Cable map
  - First chance to verify the cable/xtal map is correct

* Sparsification interconnect
  - Check that the algorithm which lets ROMs share information about clusters operates

* Verify the connection to the trigger, and the operation of the trigger
  - Trigger threshold operation proper
  - Trigger energy calibration correct


Physics Objectives

Cosmic running will provide an opportunity to make some rough checks.

* Reconstruction
  - Clusters and Bumps
  - Track-Cluster Map
  - Particle ID

* Calibration
  - Single-crystal energy calibration
    . 200 MeV from MIP
    . Compare to 6 MeV source calibration

* Alignment
  - Overall alignment to DC
  - Initial attempt at xtal alignment


Running Strategy

EMC would like to take a lot of data, but reserve sufficient time to perform diagnostics.

* Source Calibration
  - Runs to establish initial calibration
  - One or two long runs per week

* Light Pulser
  - Continuous operation is possible
  - Exercise system, check stability

* Electronics/Diagnostics
  - Dedicated time each day for calibration, development
  - Request 2 - 3 hours per day out of normal data taking

* Cosmics
  - For energy calibration we need:
    . Vertical and horizontal cosmics
    . Close to IP
    . Tracked by DCH, cylindrical RPC, and IFR
    . Trigger independent of EMT
  - Gives energy calibration to < 1%
  - About 200 single-crystal cosmics per crystal
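A rough statistical reading of those last two numbers (my sketch, not from the slides): if each crystal's MIP response has a fractional event-to-event spread sigma_MIP, averaging N cosmics determines the crystal constant to about

```latex
\frac{\sigma_{\text{calib}}}{E} \approx \frac{\sigma_{\mathrm{MIP}}}{\sqrt{N}},
\qquad \sqrt{200} \approx 14,
```

so a per-event spread of order 10% on the ~200 MeV MIP signal is already enough to reach the quoted < 1% per crystal.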


IFR Nanni Crosetti

* November not special, except for integration with the rest of BaBar and the magnetic field

* July
  - ODF-OEP link
  - Feature extraction, output TC
  - 1 ROM

* August
  - Integrate Cyl. RPC
  - Integrate detector control
  - TC + Digi -> TC

* "November"
  - 2+2 ROMs
  - Measure RPC efficiency
  - Alignment
  - Data storage
  - Read/write to database
  - 1E6 events


Trigger Usha Mallik

* Continue lab tests

* Interact with multiple systems

* L1 rate 80 Hz (DCH), 500 Hz (EMC)

* Test, refine L3
  - Classify tracks
  - Check/verify online and offline


Conclusions

* Not much difference between “Optimal” and “Minimal”

* 5 (+5/-3) days of good running is agreeable to all

* BUT do not put off that running until the very end!

* Cosmic run will extend into December, but consider that as float, not for planning


Post-Cosmic Activities (January to March 1999)

Tom’s transparency

SVT transparencies

+ Some (preliminary) conclusions...

* Need to optimize the DRC bar box installation;
* DCH cables versus SVT rafts!
* Minimize the electronics house and detector downtime;
* Magnet-off on Jan. 4th
* Magnet-on not before end of March (around the 26th)

+ More will be discussed in Bob Bell's dedicated session Wednesday, July 15th.

Session 4: Post-Cosmic Activities (the period January - March 1999)

1. BABAR prepares for and rolls onto the beam line

2. The SVT appears and integrates itself (both mechanical and software) with BABAR

2.5 (illegible)

3. Recovery from the Cosmic Ray run
   a. detector subsystem mechanical & software "tuning"
   b. online software fixes discovered during Cosmic run

4. Continued software development on many fronts
   a. Core Online software (ODF, ORC, ODC, OEP, OPR)
   b. Subsystem software (CAL, ROM, L3, RECO)
   c. Install/test "factory mode" operations (automation, diagnosis, operator screens)

5. Commissioning rest of online compute farm

6. Lots of other miscellaneous activities ***GUARANTEED***

What are the issues?

1. A tight schedule with an important (unmovable) goal

2. Competition for common resources (workstations, the EH, space to work)

3. Competition for detector subsystems (calibrations vs. ODF development)

4. Competition for experts (who will be asked to consult, but have their own work)

Installation of the SVT

Commissioning Workshop

July 8, 1998

Leroy Kerth

SVT Installation Scenario

In December 1998 the SVT will be in operation at LBNL with:

- Final multiplexers (backward MUX in final racks)

- Final power supplies

- Final monitoring system

- Final cables (except for power cables and optical fiber)

- The final power cables and optical fiber from the roof of the electronics house to the platforms in place

- Mounting hardware for multiplexer racks in place in IR2

- AC power, building air, dry nitrogen in place. - SLAC

SVT Installation Scenario (cont.)

In January 1999:

The support tube is removed and disassembled.

The SVT is transported to SLAC and installed on the B1 magnets.

The support tube is reinstalled in IR2.

During this period the electronics, cooling and remaining cables are installed in IR2.

Clean room activity, after disassembly and removal of the commissioning detectors:

Install cooling lines, cables, matching card supports on the Q1 - B1 assembly. ~6 shifts - SVT

Install SVT on B1s. 2 shifts - SVT

Make cable and utilities connections. Complete electrostatic shield. ~6 shifts - SVT

Test detector. ~6 shifts - SVT

The following activities need to be scheduled in Jan. to April '99 while the support tube is out.

Activity Not at IR2

Install cables and cooling lines in rafts. SVT manpower - 2 shifts.

Need access to rafts where they are stored.

Activity at IR2

Installation of the rear multiplexer racks. Minor crane work. SLAC & SVT - 1/2 shift

Install special multiplexer racks for front and install multiplexer bins in racks. Minor crane work. SLAC & SVT - 2 shifts

Install power supplies in racks on roof of electronics house. Minor crane work. SLAC & SVT - 1 shift

Connect power cables and test. SVT - 2 shifts

Install cables from multiplexers (platforms) to forward end:
  SVT manpower - 2 shifts
  Need access to forward end with forward calorimeter in place.

Activity at IR2 (cont.)

Install cooling system racks (two). Crane lifts (nominal). SLAC & SVT - 4 shifts. Connections and check-out - SVT.

After Support Tube and Rafts in Place:

Cable and water connections, forward end. Calorimeter in place. Need exclusive access. SVT manpower - 4 shifts.

Cable and water connection, backward end. Must be scheduled before DC cables. Exclusive access. SVT manpower - 4 shifts.

Conclusions

1. The integration process is now considered a crucial piece of building the detector as a whole

* corresponding activities will take time and manpower!

2. The core online software group has made, and is making, an incredible effort these last few months - their needs in terms of time to complete the BaBar Online System have to be explicitly incorporated into our schedules.

3. Detector-specific online code developments have to get first priority in detector efforts right now!

4. Cross-system experience and developments have to be shared; some subdetector-specific functionalities have to be addressed in a coherent and common way (e.g. GUIs, histograms - core online facilities already exist and are ready to be exercised); adequate solutions (for the time being, if final ones are not yet available) have to be made known to all systems.

5. The cosmic period will be "shared" between integration activities and cosmic data taking.

6. Cosmic data taking should not be delayed until the last day of the period!

7. There are clear needs for real cosmic data before the end of Dec. '98 - and that, in any case, in addition to the crucial demonstration of our ability to run and operate (establish a "normal run procedure") BaBar as a whole detector!

Commissioning Core Group

G. Bonneaud, H. Lynch, W. Innes, G. Oxoby, T. Glanzman,
T. Schalk, B. Jacobsen, U. Nauenberg, K. Moffeit

ex-officio: D. Hitlin, J. Dorfan, V. Luth

Subsystem contacts for commissioning activities

SVT B. Abbott

DCH D. Coupal, D. MacFarlane

DRC D. Brown, G. Hamel De Monchenault, G. Wormser

EMC H. Marsiske, J. Nash

IFR N. Crosetti, Ch. Sciacca

TRG L. Gladney, F. Kral, U. Mallik

Commissioners for Central Functions

Central Electronics A. Lankford, W. Innes, G. Haller

Computing Coordinator T. Schalk

Online Technical Coordinator Ch. Day

Run Control N. Geddes

Dataflow M. Huffer

OEP G. Dubois-Felsmann

Prompt Reconstruction T. Glanzman

Environmental Control and Monitoring G. Abrams

Calibration D. Brown

Reconstruction B. Jacobsen

Databases D. Quarrie

Simulation B. Lockman