sPHENIX DAQ/Trigger System Overview

sPHENIX Director's Review
DAQ/Trigger
Ed Desmond (standing in for Martin Purschke)
April 9-11, 2019, BNL
The DAQ/Trigger Subsystem
L2 Manager: Martin L Purschke, BNL
DAQ System Requirements
Configure detectors for operation
Control / Monitor detector systems
Read out detectors @ 15 kHz event rate @ 90% livetime
Store data @ ~135 Gbit/s to HPSS
Timing/Trigger
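As a quick consistency check (a sketch using only the two rates quoted above, nothing sPHENIX-specific beyond them), these requirements imply an average event size of roughly 1.1 MB:

```python
# Back-of-the-envelope check of the quoted DAQ requirements.
# Uses only the event rate and aggregate data rate from this slide.

event_rate_hz = 15_000        # 15 kHz event rate
data_rate_gbit_s = 135        # ~135 Gbit/s stored to HPSS

bits_per_event = data_rate_gbit_s * 1e9 / event_rate_hz
bytes_per_event = bits_per_event / 8
print(f"implied average event size: {bytes_per_event / 1e6:.2f} MB")
# -> implied average event size: 1.12 MB
```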
DAQ Technical Overview
DAQ interfaces to all detector systems built and working
Calorimeter, TPC, MVTX detectors successfully read out
Integration of separate DAQ components into an overall DAQ architecture now in development
[Figure: DAQ architecture block diagram — FEM/FEE front ends in the Interaction Region are read out through DCM2s and FELIX (DAM/EBDC) into SEBs in the Rack Room, then through a 40+ Gigabit crossbar switch to Buffer Boxes and on to HPSS; the sPHENIX Trigger System (RHIC Clock, MTM, vGTM, GL1, LL1) and Slow Controls also connect to the readout.]
Scope
Deliverables:
Take 15 kHz event rate with 90% live time
Store and transfer ~135 Gbit/s data
Provide sufficient rejection with Local Level 1 to get to 15 kHz
Configuration, Control and Monitoring system software
Hardware and procurements
Data Collection Modules (DCM-II): 25 needed
SubEvent Buffers (SEB) (18): commodity PCs
Buffer Boxes (6): commodity PC/file server, 40 Gbit/s per Buffer Box
JSEB2 input cards (33, ~40% in hand): custom board
Control Machines (4) for control and monitoring: commodity PCs
Global Level 1 system: Xilinx FPGA-based board in hand
Timing system: Xilinx FPGA-based board in hand
Network switches (one or two): commodity switches
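The Buffer Box line above can be checked against the ~135 Gbit/s requirement; a minimal sketch using only the figures on this slide:

```python
# Sanity check: aggregate Buffer Box bandwidth vs. the required logging rate.
n_buffer_boxes = 6
gbit_s_per_box = 40           # 40 Gbit/s per Buffer Box
required_gbit_s = 135         # aggregate rate to HPSS

aggregate = n_buffer_boxes * gbit_s_per_box
print(f"aggregate: {aggregate} Gbit/s, "
      f"headroom: {aggregate / required_gbit_s:.2f}x the requirement")
# -> aggregate: 240 Gbit/s, headroom: 1.78x the requirement
```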
DAQ Technical Status
Data Collection Modules (DCM-II) reused from PHENIX with well-known functional characteristics
sPHENIX digitizer electronics already integrated into the DAQ readout system
Tested extensively and used at the Fermilab beam test, Feb – May 2018
Significant progress made with TPC readout using the ATLAS FELIX card
8-SAMPA-chip front-end card read out into FELIX cards
FELIX card integrated into our standard RCDAQ software and successfully read out
Global Level 1
New Xilinx FPGA-based boards in hand
Basic operational functions now in place and used to trigger TPC readout
SubEvent Buffers (SEB) and Buffer Box components are commercial PCs
Reuse of PHENIX control software where appropriate
Readout of detector monitor data (temperature, DAC gain, status) available and in routine use
(extensively used at the Fermilab test beam with the Calorimeter Control Board)
GL1 / Timing system
FPGA board selected (same for both GL1 and Timing system, different firmware)
Based on in-house development for the NSLS-2: design under our control, IP under our control, boards readily available
Zynq FPGA at its core provides additional CPUs running Linux for all slow controls/configuration needs
Plenty of I/O blocks to cater to early R&D-themed needs
"Granule Timing Module" (GTM) firmware in use with the TPC front end
Slow Controls
Configuration and control of EMCal, HCal, and MBD digitizer and DCM-II components implemented through standard JSEB-II components
Used extensively in the PHENIX project
Well understood and established communication protocol
Detector Component Monitoring
Readout of calorimeter temperature, DAC gains, and access to status parameters is implemented through the in-house developed Calorimeter Control Module
HV control via Wiener-ISEG and LeCroy HV (MBD)
ADAM/MODBUS low-voltage control, reused from PHENIX
A standard API for these devices is under development
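As noted above, a standard API for these power-supply devices is still under development; the sketch below is purely illustrative of what a common abstraction over Wiener-ISEG, LeCroy, and ADAM/MODBUS controllers might look like. The class and method names are hypothetical and are not taken from the sPHENIX slow-controls software.

```python
# Hypothetical sketch of a unified slow-controls API for HV/LV devices.
# All names are illustrative; this is not the actual sPHENIX interface.
from abc import ABC, abstractmethod


class PowerSupplyChannel(ABC):
    """Interface a concrete back end (Wiener-ISEG or LeCroy HV mainframe,
    ADAM/MODBUS LV module, ...) would implement."""

    @abstractmethod
    def set_voltage(self, volts: float) -> None: ...

    @abstractmethod
    def read_voltage(self) -> float: ...

    @abstractmethod
    def read_current(self) -> float: ...

    @abstractmethod
    def enable(self, on: bool) -> None: ...


class ModbusLVChannel(PowerSupplyChannel):
    """Illustrative ADAM/MODBUS low-voltage back end; `bus` is assumed to
    expose generic read_register/write_register calls."""

    def __init__(self, bus, register: int):
        self.bus = bus
        self.register = register

    def set_voltage(self, volts: float) -> None:
        self.bus.write_register(self.register, volts)

    def read_voltage(self) -> float:
        return self.bus.read_register(self.register)

    def read_current(self) -> float:
        return self.bus.read_register(self.register + 1)

    def enable(self, on: bool) -> None:
        self.bus.write_register(self.register + 2, 1 if on else 0)
```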
Subsystem Collaborators
1.6 Data Acquisition and Trigger (M. Purschke, BNL)
1.6.1 The core DAQ system (Ed Desmond, BNL)
1.6.2 Local Level-1 Triggers (Jamie Nagle, U Colorado)
1.6.3 Global Level-1 Trigger (Ed Desmond, BNL)
1.6.4 Timing system (M. Purschke)
Joe Mead, BNL/Instrumentation (Timing)
John Kuczewski, BNL/Instrumentation (Timing, GL1, Time Projection Chamber interfaces)
Cheng-Yi Chi, Columbia/Nevis (Local Level-1)
Dennis Perepelitsa, U Colorado (Local Level-1)
Schedule Drivers
Task                                    Finish date
GL1 prototype ready                     Dec 2018
DAQ prototype ready for production      Aug 2019
Timing system prototype                 Sep 2019
Local Level-1 prototype v1 ready        April 2020
Local Level-1 preproduction prototype   Oct 2020
DAQ components ready for operation      Dec 2020
Timing system ready                     Dec 2020
GL1 ready                               Mar 2021
Local Level-1 ready                     Jul 2021
Cost Drivers
DCM2 components: $120.6k
8 DCM2 cards @ $12,000 ea. + 6 JSEB2 cards @ $4,100 ea.
Commodity PCs: $48.26k
18 SEBs @ $2,280 ea.
4 JSEB2 hosts @ $2,107 ea.
Buffer Boxes: $268k
6 RAID boxes @ $44,708 ea.
Network switch: $79.8k
2 × 10 Gb switches
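Two of the subtotals above follow directly from the listed unit prices; a quick arithmetic check:

```python
# Reproduce two of the quoted cost subtotals from the unit prices on this slide.
dcm2_total = 8 * 12_000 + 6 * 4_100   # DCM2 cards + JSEB2 cards
raid_total = 6 * 44_708               # Buffer Box RAID servers

print(f"DCM2 components: ${dcm2_total:,}")  # -> $120,600 (quoted as $120.6k)
print(f"Buffer Boxes:    ${raid_total:,}")  # -> $268,248 (quoted as $268k)
```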
Cost Distribution WBS 1.6 DAQ/Trigger
WBS 1.06 Estimate of Uncertainty
Status and highlights
We can already read the main components (calorimeters, Time Projection Chamber, …) with actual sPHENIX hardware (DCM2s, TPC front-end boards, etc.)
We have a "legacy" high-end 10 Gbit/s network switch available for development
We have enough PCs in hand for all development and similar tasks
Local Level-1 technology selected
The Global Level-1 and Timing technology is selected and procured; development is underway and partially implemented
Final data format defined and software already implemented
GL1 and Timing system (GTM) Status
 
We have demonstrated the readout of the TPC front-end card under the control of the timing system (GTM, Granule Timing Module)
The GTM distributes a trigger and starts the readout
The GTM can in turn be controlled by a GL1 unit
[Figure: test setup diagram — the GTM timing system connects to the FELIX card/EBDC via a timing fiber and trigger signal.]
GL1 and Timing system TPC triggering
 
Test stand readout of the TPC front-end card, with a scope trace of the GTM-to-FEE trigger signal
[Figures: test stand for FEE/GTM; GTM/FEE trigger trace.]
1.6 DAQ/Trigger Risks
Issues and Concerns
The TPC data rate could be higher than currently estimated from simulations
This may lead to more expensive network switches and more file servers, or reduced buffering depth
Summary
DAQ interface to all sPHENIX detector sub-systems in place and functional
The control and readout of the TPC, MVTX and calorimeter detectors have been successfully implemented using the standard DAQ software
Calorimeter detectors successfully read out at the Feb 2018 Fermilab test beam
A baseline Global Level-1 has been implemented, including readout of trigger parameters
TPC and MVTX front ends have used the GL1 information transmitted through the GTM timing system
The DAQ system used for all tests implements standard sPHENIX data formats, allowing re-use of monitoring and analysis software
The sPHENIX DAQ system is ready for PD-2/3
Back Up
Buffering and average data rates
We have "built in" a factor of 0.6 for the combined sPHENIX × RHIC uptime (optimistic)
With buffering, we can take ~1.5 times the HPSS logging rate
The longer you can buffer, the higher the local rate can get
2016 PHENIX numbers:
12 Gbit/s local max from the DAQ
uptime: ~50%
12-hour average: 6 Gbit/s
2-day average: 4 Gbit/s
week: < 3 Gbit/s
entire Run: ~2 Gbit/s
You average over machine-development days, short accesses, APEX days, and other small no-beam periods
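A minimal sketch of this buffering argument, using the 1.5× local rate and the 0.6 uptime factor from the slides; the buffer disk capacity below is a placeholder assumption, not the actual sPHENIX Buffer Box size:

```python
# Illustrative model of the buffering argument on this slide.
# The 1.5x local rate and 0.6 uptime come from the slides; the 1 PB buffer
# capacity is a made-up placeholder for illustration only.

hpss_rate = 135.0                  # Gbit/s sustained to HPSS
local_peak = 1.5 * hpss_rate       # Gbit/s the DAQ can take locally
uptime = 0.6                       # assumed combined sPHENIX x RHIC uptime

# Long-term average into the buffers must stay below the HPSS drain rate.
average_input = local_peak * uptime
print(f"average input {average_input:.1f} Gbit/s vs HPSS drain {hpss_rate} Gbit/s")
# -> 121.5 Gbit/s vs 135.0 Gbit/s: sustainable on average

# During a store the buffers fill at the difference of the two rates; how long
# could a hypothetical 1 PB of buffer disk absorb that surplus?
fill_rate = local_peak - hpss_rate                # 67.5 Gbit/s
buffer_gbit = 1e15 * 8 / 1e9                      # 1 PB in Gbit (placeholder)
print(f"1 PB absorbs the surplus for ~{buffer_gbit / fill_rate / 3600:.0f} h")
# -> ~33 h
```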