The ALICE DAQ: Current Status and Future Challenges

P. VANDE VYVRE CERN-EP/AID
P. Vande Vyvre, CERN/EP. The ALICE DAQ: Current Status and Future Challenges. 07-Feb-2000
ALICE DAQ

• The original and updated requirements
  – The original requirements
  – The updated requirements: higher multiplicity, addition of a detector
• Future challenges
  – The Region-Of-Interest readout
  – Online filtering
  – Enhanced data compression
  – The new architecture
• Current prototyping status
  – The ALICE DATE
  – Data transfer, sub-event building and event building
  – Mass Storage System and Permanent Data Storage
  – The ALICE Data Challenge
Original requirements: event size

Event sizes in MBytes.

Detector                                        Min. Event Size   Max. Event Size
Inner Tracking System (ITS) Pixel                    0.14              0.28
ITS Drift                                            1.50              1.50
ITS Strips                                           0.16              0.16
Time Projection Chamber (TPC)                       30.00             36.00
Time-Of-Flight (TOF)                                 0.18              0.18
Photon Spectrometer (PHOS)                           0.02              0.02
High Momentum Particle Identification (HMPID)        0.12              0.12
Dimuon Forward Spectrometer (MUON)                   0.15              0.15
Photon Multiplicity Detector (PMD)                   0.03              0.12
Trigger System (TRG)                                 0.12              0.12
Total                                               32.42             38.65
Updated requirements: event size

Event sizes in MBytes.

Detector                                        Min. Event Size   Max. Event Size
Inner Tracking System (ITS) Pixel                    0.14              0.28
ITS Drift                                            1.50              1.50
ITS Strips                                           0.16              0.16
Time Projection Chamber (TPC)                       56.00             75.90
Transition Radiation Detector (TRD)                  8.00              8.00
Time-Of-Flight (TOF)                                 0.18              0.18
Photon Spectrometer (PHOS)                           0.02              0.02
High Momentum Particle Identification (HMPID)        0.12              0.12
Dimuon Forward Spectrometer (MUON)                   0.15              0.15
Photon Multiplicity Detector (PMD)                   0.03              0.12
Trigger System (TRG)                                 0.12              0.12
Total                                               66.42             86.55

Higher multiplicity: increased TPC event size. Transition Radiation Detector (TRD) added to ALICE.
Original requirements: data throughput

Physics    Trigger         Detectors                     Max Event Size  Event Rate  Data Throughput
                                                         (MBytes)        (Event/s)   (MBytes/s)
Hadronic   Central         All                            87               2
           Min. Bias       All                            22               2
Charm      Central (C)     All                            39              40          1560
           Min. Bias (MB)  All                            10              40           400
Dimuon     C+Dimuon        Pixel, Muon, PMD, PHOS, TRG     0.6          1000           600
           Central         Pixel, Muon, PMD, PHOS, TRG    39              40
Total                                                                                 2560

Conservative data compression reduces the data throughput to 1.25 GBytes/s.
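Each throughput entry in the table is simply the maximum event size times the event rate; a quick check of the figures above:

```python
# Data throughput per trigger: max event size (MBytes) x event rate (Hz).
# Figures taken from the original-requirements table above.
triggers = {
    "charm_central":  (39.0, 40),    # (MBytes, events/s)
    "charm_min_bias": (10.0, 40),
    "dimuon":         (0.6, 1000),
}

throughput = {name: size * rate for name, (size, rate) in triggers.items()}
total = sum(throughput.values())   # MBytes/s before compression
# total comes out at 2560 MBytes/s, compressed to ~1250 MBytes/s for recording
```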
Updated requirements: data throughput

Physics     Trigger         Detectors                     Max Event Size  Event Rate  Data Throughput
                                                          (MBytes)        (Event/s)   (MBytes/s)
Hadronic    Central         All                            87               2
            Min. Bias       All                            22               2
Charm       Central (C)     All                            87              20          1740
            Min. Bias (MB)  All                            22              20           440
Dielectron  C+Dielectron    All                            87             200         17400
            MB+Dielectron   All                            22             200          4400
Dimuon      C+Dimuon        Pixel, Muon, PMD, PHOS, TRG     0.6          1000           600
            Central         Pixel, Muon, PMD, PHOS, TRG    87              20
Total                                                                                 24580

Conservative data compression and event rate reduction are insufficient. The TRD allows new types of online processing.
Future Challenges 1

• For dielectron events, a Region-Of-Interest is identified by the TRD. It could be used for:
  – Region-Of-Interest readout
    Electron tracks in the TPC and TRD detectors
    Target: reduce the event size from 80 to 5 MBytes
  – Online filtering
    Refine the dielectron L1 trigger with a software filter
    Target: reduce the event rate from 200 to 20 Hz
• Requires limited CPU power. Current estimate, done with STAR data: 40 kCU
• Physics simulation and DAQ prototyping are starting
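The two reductions compound; a sketch of the combined effect, where the sector granularity and the filter acceptance are illustrative assumptions, not ALICE parameters:

```python
# Illustrative model of ROI readout + online filtering for dielectron events.
# The TRD flags candidate electron tracks; only the detector regions they
# cross are read out, and a software filter then refines the L1 decision.
FULL_EVENT_MB = 80.0     # full dielectron event size (from the slide)
ROI_EVENT_MB = 5.0       # target size after ROI readout
L1_RATE_HZ = 200         # dielectron L1 trigger rate
FILTER_ACCEPT = 0.1      # assumed filter acceptance: 200 Hz -> 20 Hz

def roi_event_size(n_roi_sectors, total_sectors=36):
    """Event size when only the sectors crossed by TRD tracks are read.
    36 sectors is an assumed granularity, for illustration only."""
    return FULL_EVENT_MB * n_roi_sectors / total_sectors

before = FULL_EVENT_MB * L1_RATE_HZ                  # MBytes/s without reduction
after = ROI_EVENT_MB * L1_RATE_HZ * FILTER_ACCEPT    # MBytes/s with both cuts
```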
Future Challenges 2

• For central and min. bias events: enhanced data compression for the TPC data
• Data are compressed by applying the following conversion to the raw data:
  – Cluster finder
  – Local tracking
  – Raw data converted into:
    • Parameters of a local track model
    • Distances of the raw data clusters from the local track model
• Requires massive CPU power. Current estimate, done with STAR data: 400 kCU
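The idea is that raw cluster positions are replaced by a few parameters of a local track model plus small residuals, which code into far fewer bits. A minimal sketch with a straight-line model (the real TPC tracking is more elaborate; this only shows the principle):

```python
# Sketch: compress clusters as (track-model parameters, residuals).
# Residuals to a good local track model are small and compress well.
def fit_line(points):
    """Least-squares straight line y = a*x + b through the clusters."""
    n = len(points)
    sx = sum(x for x, _ in points); sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points); sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

def compress(points):
    model = fit_line(points)
    a, b = model
    residuals = [y - (a * x + b) for x, y in points]   # small numbers
    return model, residuals

def decompress(model, residuals, xs):
    a, b = model
    return [(x, a * x + b + r) for x, r in zip(xs, residuals)]

clusters = [(0, 1.0), (1, 3.1), (2, 4.9), (3, 7.0)]   # toy cluster positions
model, res = compress(clusters)
restored = decompress(model, res, [x for x, _ in clusters])
```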
Updated requirements: data throughput

Physics     Trigger         Detectors                     Max Event Size  Event Rate  Data Throughput
                                                          (MBytes)        (Event/s)   (MBytes/s)
Hadronic    Central         All                            87               2
            Min. Bias       All                            22               2
Charm       Central (C)     All                            87              20          1740
            Min. Bias (MB)  All                            22              20           440
Dielectron  C+Dielectron    Partial TPC/TRD                 4.6             20            92
            MB+Dielectron   Partial TPC/TRD                 1.2             20            24
Dimuon      C+Dimuon        Pixel, Muon, PMD, PHOS, TRG     0.6          1000           600
            Central         Pixel, Muon, PMD, PHOS, TRG    87              20
Total                                                                                  2896

Partial readout for dielectron triggers. Online filtering.
Architecture upgrade

[Architecture diagram, Nov-99. Legend: FEE: Front-End Electronics; DDL: Detector Data Link; RORC: Read-Out Receiver Card; FEDC: Front-End Digital Crate/Computer; EBL: Event Building Link; LDC: Local Data Concentrator; GDC: Global Data Collector; EDM: Event Destination Manager; TDL: Trigger Distribution Link; FCL: Flow Control Link; PDS: Permanent Data Storage; STL: Storage Link.]

Key figures from the diagram:
– Interaction rate: 8x10^3 Hz Pb-Pb, 10^5 Hz p-p
– Trigger rates and latencies: 1300 Hz Pb-Pb / 1200 Hz p-p (1.2 us); 1100 Hz Pb-Pb / 1000 Hz p-p (5.5 us); 20 Hz central + 20 Hz min. bias + 1000 Hz dimuon + 200 Hz dielectron Pb-Pb, 500 Hz p-p (5.5 - 100 us)
– Trigger levels: L0, L1, L2, and an L3 global trigger with L3 filter & partial readout in the GDCs
– Data rates: 2.5-5 GBytes/s (Pb-Pb run), 500 MBytes/s (p-p run) into event building; 1250 MBytes/s (Pb-Pb run), 100 MBytes/s (p-p run) to the PDS
– Trigger detectors: Micro Channel Plate, Zero-Degree Calorimeters, Muon Trigger Chambers, Transition Radiation Detector
The ALICE DATE

• DATE: Data Acquisition and Test Environment, the software framework for ALICE DAQ development & prototyping
• Covers multiple needs with one common DAQ system:
  – A system to develop the DAQ
  – A system for detector tests (lab and test beams)
  – A framework to develop readout and monitoring programs
• ALICE DATE features:
  – Data flow: multiple LDCs, multiple GDCs
  – Run control, error reporting, bookkeeping
  – Common software interfaces for readout, online monitoring with ROOT
  – Independent from physical layers: LDC I/O bus, event building network, GDC machine
• Used by ALICE test beams, NA57 and COMPASS
Data transfer

[Diagram: the data path from the front-end electronics to the Local Data Concentrator. Each Detector Data Link consists of a Source Interface Unit (DDL SIU) at the front-end electronics and a Destination Interface Unit (DDL DIU) on the Read-Out Receiver Card (RORC), hosted in a Front-End Digital Crate/Computer (FEDC) with the LDC. An optical fibre of about 200 meters spans from the P2 cavern to the P2 access shaft.]
Detector Data Link (DDL)

• One-to-one data communication link between FEE and DAQ
• Main requirements:
  – Common interface between detector front-end electronics and DAQ
    • Single hardware & software to develop and maintain
    • Define the interface soon to allow all the teams to work in parallel
  – Raw data transfer to DAQ
  – Data block download to FEE
  – Cover the distance from the detector in the cavern to the ALICE computing room in the access shaft (200 m)
• Implementation:
  – Optical link
  – Off-the-shelf Gbit/s opto-electronic components
  – Prototypes integrated with DATE
  – Tests with detectors will start this year
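A point-to-point link like this carries framed data blocks in both directions (raw data up, configuration blocks down). The real DDL protocol is implemented in the link hardware and is not specified here; as an illustration only, the sketch below frames each block with a length/type header and a CRC so the receiver can verify the transfer:

```python
# Illustration of a framed point-to-point transfer: a header carrying the
# block length and type, the payload, and a CRC so the receiver can verify
# the transfer.  The actual DDL framing differs; this only shows the idea.
import struct
import zlib

def frame(payload, block_type=1):
    header = struct.pack("<II", len(payload), block_type)
    crc = struct.pack("<I", zlib.crc32(header + payload))
    return header + payload + crc

def unframe(blob):
    length, block_type = struct.unpack_from("<II", blob)
    payload = blob[8:8 + length]
    (crc,) = struct.unpack_from("<I", blob, 8 + length)
    if zlib.crc32(blob[:8 + length]) != crc:
        raise ValueError("corrupted block")
    return block_type, payload

btype, data = unframe(frame(b"raw detector fragment"))
```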
DDL SIU prototype

DDL DIU prototype

RORC prototype
Sub-event building

• Many-to-one data collection inside a crate or a computer
  – Collect data from several data sources over the computer I/O bus
  – Assemble these data as one sub-event from a fraction of a detector
  – Can work as a standalone DAQ system
• Data sources
  – Current data sources are electronics cards in VME or CAMAC
  – Current sub-event building is done in software by a DATE program (Readout) running on the processor in a VME board
  – In the future, the data sources will be DDL links
  – First RORC prototypes done in VME form-factor (1 VME board)
  – Second RORC prototype will be in PCI form-factor (1 PC adapter)
  – Following closely the industry evolution (PCI, PCI-X, NGIO, FIO, SIO)
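The assembly step can be sketched as follows; the class and fragment format are hypothetical illustrations, not the DATE Readout program itself:

```python
# Sketch: assemble a sub-event from fragments delivered by several data
# sources (today VME readout cards, later DDL/RORC channels) for one trigger.
from collections import defaultdict

class SubEventBuilder:
    def __init__(self, n_sources):
        self.n_sources = n_sources
        self.pending = defaultdict(dict)   # event_id -> {source: fragment}

    def add_fragment(self, event_id, source, data):
        """Store one fragment; return the sub-event once all sources arrived."""
        self.pending[event_id][source] = data
        if len(self.pending[event_id]) == self.n_sources:
            frags = self.pending.pop(event_id)
            # Concatenate in a fixed source order to form the sub-event.
            return b"".join(frags[s] for s in sorted(frags))
        return None

builder = SubEventBuilder(n_sources=2)
first = builder.add_fragment(7, 0, b"A")    # still waiting for source 1
sub = builder.add_fragment(7, 1, b"B")      # complete: sub-event b"AB"
```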
Event building

• The event building network was initially an especially demanding application
• Today's dominant trends in the computing and networking industry:
  – The Internet is the strongest incentive for ever higher bandwidth
  – Commodity computing and networking is driving the industry
  – Switches replace shared media
  – Ethernet is the standard LAN media, TCP/IP is the standard protocol
  – Ethernet's successors have the advantage of the existing installed base
• The event building network is similar to the backbone of a site like CERN:
  – Ports: 15000 on Eth10, 2000 on Eth100, 30 on Eth1000
  – Switches: 100 Eth100 or Eth1000, central backbone bandwidth 60 Gbps
• Work focus: can we use standard LAN media and protocols, and how?
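The mechanics of event building over commodity TCP/IP can be explored with plain sockets. This sketch (hypothetical LDC/GDC roles on the loopback interface, not the DATE implementation) collects one length-prefixed sub-event from each source and concatenates them into a full event:

```python
# Sketch: event building over standard TCP/IP.  A "GDC" accepts one
# connection per "LDC", reads one length-prefixed sub-event from each,
# and concatenates them (in a fixed order) into the full event.
import socket
import struct
import threading

def recv_exact(sock, n):
    """Read exactly n bytes from a stream socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed early")
        buf += chunk
    return buf

def send_subevent(sock, data):
    sock.sendall(struct.pack("<I", len(data)) + data)

def recv_subevent(sock):
    (length,) = struct.unpack("<I", recv_exact(sock, 4))
    return recv_exact(sock, length)

def gdc(server, n_ldcs, events):
    parts = []
    for _ in range(n_ldcs):
        conn, _ = server.accept()
        parts.append(recv_subevent(conn))
        conn.close()
    events.append(b"".join(sorted(parts)))   # deterministic sub-event order

server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(4)
port = server.getsockname()[1]

events = []
collector = threading.Thread(target=gdc, args=(server, 2, events))
collector.start()
for fragment in (b"its-subevent|", b"tpc-subevent|"):   # two LDCs send
    ldc = socket.socket()
    ldc.connect(("127.0.0.1", port))
    send_subevent(ldc, fragment)
    ldc.close()
collector.join()
server.close()
```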
Mass Storage System

[Diagram: the ALICE DAQ (GDCs) feeds the Mass Storage System over the network. The MSS comprises a core server with MSS metadata, main data servers attached to disk and tape arrays, secondary data servers, and NFS/DFS servers serving client systems.]
Mass Storage System

• Isolate the DAQ and computing architecture from:
  – the problems of physical data recording, CDR, volume handling, etc.
  – the technology evolution in the storage area (magnetic/optical, robotics, etc.)
• Provide a logical structure to the storage infrastructure, for example a file system:
  /hpss/alice/2005/pbpb_run/run00001.raw
  /hpss/alice/2005/pbpb_run/run00002.raw
  ...
• The MSS currently used by ALICE is HPSS, but HPSS is expensive and supported on a limited set of platforms, and the MSS market is small
• Other systems used in future prototypes: CASTOR, EUROSTORE
Permanent Data Storage

• Multiple parallel streams of magnetic tapes. By LHC startup:
  – A standard drive should achieve 30-50 MBytes/s
  – Standard capacity should be 100-200 GBytes
  – 40 drives with 80 (dis)mounts/hour in total
• Current CERN installation:
  – Drive bandwidth: 10 MBytes/s
  – Tape capacity: 50 GBytes
  – 45 drives
  – 6 silos of 6000 cartridges of 50 GB: 1.8 PBytes capacity
• A feasible but expensive solution.
  – The ratio of disk storage cost to tape storage cost is decreasing rapidly
  – By LHC time, online disk storage with offline archiving could be cost effective
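The drive counts follow directly from the recording bandwidth; a quick sketch using the figures quoted above (the 35 MBytes/s projected drive speed is simply the midpoint of the 30-50 range):

```python
# Parallel tape streams needed = recording bandwidth / per-drive bandwidth,
# rounded up.  Figures taken from the slide text.
import math

RECORD_MB_S = 1250   # target recording rate, MBytes/s

def drives_needed(drive_mb_s):
    """Number of parallel tape streams needed to sustain the recording rate."""
    return math.ceil(RECORD_MB_S / drive_mb_s)

today = drives_needed(10)    # with current 10 MBytes/s drives
at_lhc = drives_needed(35)   # with projected 30-50 MBytes/s drives (midpoint)
```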
Prototyping

• Development and prototyping progressing in parallel
• Prototyping with the ALICE Data Challenge (ADC)
  – Combined activity of the ALICE DAQ, ALICE Offline and IT teams
  – ADC1: 6 days at 14 MB/s (7 TB ROOT dataset)
• ADC2: a large DATE system
  – Data sources (18 LDCs):
    • 9 Motorola VME + 2 IBM WS (test beam area, Hall 887 on the Prevessin site)
    • 7 Motorola VME + DDL prototypes (DAQ Lab, Bld 53 on the Meyrin site)
  – Network: Fast Ethernet switches, Gigabit Ethernet backbone
  – Data destinations (computing center):
    • 20 PC/Linux for event building, ROOT I/O formatting, L3 filter
    • Central data recording (target: 100 MB/s)
ALICE Data Challenge II

[Diagram. Legend: LDC: Local Data Concentrator; GDC: Global Data Collector; CDR: Central Data Recording. Data sources in the ALICE DAQ Lab (Bld 53, with DDL and RORC prototypes) and the ALICE test beam (Exp. Hall 887) feed LDCs through Fast Ethernet switches (3COM 3900, 24x100Base-T with a 1000Base-T uplink); 1 Gbit/s Gigabit Ethernet uplinks carry the data to the GDCs and CDR in the Computing Center (Bld 513), with control and monitor functions attached.]
Conclusion

• The requirements of the ALICE DAQ have evolved a lot
• New ways to reduce the huge data volume will be investigated:
  – Region-Of-Interest readout
  – Online filtering
  – Enhanced data compression scheme
• Development is progressing (almost according to schedule); the prototypes are tested during the ALICE Data Challenges
• Future milestones:
  – Integration of the DDL with detectors
  – ALICE Data Challenge II: from DDL to MSS @ 100 MB/s
• The evolution of computing and communication technology is positive
• Areas of concern: storage cost and the Mass Storage System