CLAS12 Calibration Challenge: Preparation, Goals, Metrics, and Team Analysis


The CLAS12 Calibration Challenge involves preparing and testing the calibration suites, running the calibration code on simulated data, and comparing the calibration constants used in the simulation to the extracted ones. Its goals are to test the individual calibration suites, the overall CLAS12 calibration procedure, and the organization of the work team. The success metrics include comparing reconstruction results before and after calibration for key observables. The team comprises members responsible for the different aspects of the calibration process, such as event generation, database management, simulation, and reconstruction.


Uploaded on Oct 07, 2024



Presentation Transcript


  1. Calibration Challenge Preparation

  2. Calibration Challenge Goals & Metrics

  GOALS:
  - Test the individual calibration suites
  - Test the overall CLAS12 calibration procedure, i.e. the sequence and the interdependencies of the calibration steps
  - Test the work team organization

  HOW:
  - Run the calibration code on simulated data to extract calibration constants (calibration challenge)
  - Use the extracted constants to reconstruct the data and obtain KPP plots (KPP challenge)

  METRICS FOR SUCCESS:
  - Compare the calibration constant values used in the simulation to the extracted ones
  - Compare reconstruction results before and after calibration for key observables (e.g. timing resolution for FTOF)
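One way to quantify the first success metric, comparing the constants injected in the simulation to the extracted ones, is a per-channel relative residual with a tolerance cut. The function name, channel values, and tolerance below are illustrative sketches, not part of the actual challenge tooling:

```python
import numpy as np

def compare_constants(injected, extracted, tolerance=0.02):
    """Return per-channel relative residuals and an overall pass/fail flag.

    `tolerance` is a hypothetical 2% agreement requirement; each detector
    group would define its own acceptance criterion.
    """
    injected = np.asarray(injected, dtype=float)
    extracted = np.asarray(extracted, dtype=float)
    residuals = (extracted - injected) / injected
    return residuals, bool(np.all(np.abs(residuals) < tolerance))

# Example: four channels whose extracted values agree within ~1%
res, ok = compare_constants([10.0, 12.5, 9.8, 11.1],
                            [10.05, 12.45, 9.85, 11.2])
```

A per-channel residual distribution like this also makes a natural monitoring plot for the post-challenge comparison.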

  3. Pseudo-Data Generation

  Event generator:
  - Pythia (low Q2) at 11 GeV
  - 10 M events with an electron and hadrons within the CLAS12-FD acceptance
  - Who/When: Harut, ready by mid-November

  GEMC configuration:
  - 11 GeV beam on an LH2 target with the full detector (SVT?) and magnets at full field
  - Hadronic events + background at >= 10^34 cm^-2 s^-1 luminosity
  - Who/When: Mauri (GEMC gcards), ready by mid-November; Nathan (simulations on the farm), end of November
  - Test sample available: to be done

  4. Team

  Analysis Coordinators: Raffaella De Vita (INFN) & Dan Carman (JLab)
  DB Manager: Maurizio Ungaro (JLab)
  Chef: Nathan Harrison (JLab)
  Calibrators:
  - EC-PCAL: Cole Smith (UVA/JLab)
  - FTOF & CTOF: Louise Clark (Glasgow)
  - LTCC: Maurizio Ungaro (JLab)
  - DC: Krishna Adhikari (Mississippi)
  - HTCC: Nick Markov (UConn) and Will Phelps (FIU)
  - FT-Cal: Erica Fanchini (INFN)
  - FT-Hodo: Gary Smith (Edinburgh)
  - SVT: Yuri Gotra (JLab)
  - CND: Gavin Murdoch (Glasgow)

  5. DB, Simulation, Reconstruction, Calibration

  CCDB:
  - Realistic values (Run 11)
  - Distorted values for calibration constants and dead channels (Run 17); the distorted values are generated from the realistic values, used as the mean, with a Gaussian spread of the given sigma (the extracted constants will go in Run 18)
  - Who/When: detector groups provide mean and sigma; Mauri and Nathan will put the values into the DB by mid-November

  SQLITE:
  - Provide an SQLITE file to calibrators for final testing
  - Who/When: Harut, mid-November

  GEMC digitization:
  - Finalize the digitization routines; read constants from the DB
  - Who/When: detector groups under Mauri's supervision, by mid-November

  Reconstruction:
  - Update the reconstruction packages to match the final digitization routines and read constants from the DB
  - Who/When: detector groups under Veronique's supervision, by end of November

  Calibration suites:
  - Frozen version for challenge participation, tested on simulated data
  - Who/When: detector groups, end of November

  Status flags on the slide: in progress / to be done / in progress
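The Run 17 miscalibration described above, realistic value as mean, detector-supplied sigma as the Gaussian spread, plus dead channels, can be sketched as follows. The function and parameter names (and the 2% dead-channel fraction) are illustrative assumptions, not the actual CCDB upload workflow:

```python
import random

def distort_constants(realistic, sigma, dead_fraction=0.02, seed=None):
    """Gaussian-smear each realistic constant; mark some channels dead (None).

    `realistic` supplies the means; `sigma` is the Gaussian spread provided
    by the detector group. `dead_fraction` is a hypothetical dead-channel rate.
    """
    rng = random.Random(seed)
    distorted = []
    for value in realistic:
        if rng.random() < dead_fraction:
            distorted.append(None)              # dead channel
        else:
            distorted.append(rng.gauss(value, sigma))
    return distorted

run11 = [100.0] * 10                            # realistic gains, 10 channels
run17 = distort_constants(run11, sigma=5.0, seed=42)
```

Seeding the generator makes the distortion reproducible, which matters for the Day 7 unblinding comparison between injected and extracted constants.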

  6. Challenge Preparation Tasks

  Task columns: KPP challenge | Calibration challenge | DB tables | DB realistic constant values | DB miscalibrated constants | GEMC digitization | Reconstruction reading from CCDB

  Assignments per detector (as captured from the slide grid; "?" as on the slide):
  - EC/PCAL: (no entries captured)
  - FTOF: Dan/Nathan
  - LTCC: TBC, ?, ?, ?, ?, ?
  - DC: (no entries captured)
  - HTCC: mid-November, ?
  - FT-Cal: Erica/Raffaella, Erica/Raffaella, Erica/Raffaella
  - FT-Hodo: Gary/Raffaella, Gary/Raffaella, Gary/Raffaella, Gary/Raffaella
  - SVT: Nathan/Yuri, Nathan/Yuri, Nathan/Yuri, Nathan/Yuri
  - CTOF: Dan/Nathan
  - CND: Silvia/Raffaella, Silvia/Raffaella, Silvia/Raffaella, Silvia/Raffaella, ?

  7. Work Organization

  - Data generated by end of November and checked by running reconstruction
  - Data sample with the Run 11 constants provided to calibrators for a final test on December 5
  - Data (simulated and, if necessary, reconstructed) made available to calibrators on Day 1
  - First iteration by Day 2: detector component status (all); energy/gain calibration (FTOF & CTOF, LTCC, FT-Cal, FT-Hodo, CND)
  - Second iteration by Day 3: refinement of the energy/gain calibration; timing calibration for FTOF & CTOF
  - Third iteration by Day 4: extend the timing calibration to the other detectors; KPP plots
  - Fourth iteration by Day 5: KPP plots
  - Data unblinded on Day 7: comparison of the constants; comparison of reconstruction results with the original and the extracted constants

  8. Questions for Calibrators

  Calibration suite status:
  - Will the suite be ready by end of November?
  - Is the suite reading GEMC data? Was this tested?
  - Which steps of the calibration process will be part of the challenge? (Be specific.)

  (Pseudo)data:
  - Do you need only simulated data, or also reconstructed data? If the latter, which banks?
  - Are you planning to transfer the data to your personal workstation/laptop/...?
  - Will you need to re-cook the data between calibration passes?

  CCDB:
  - Do you have experience reading from and writing to CCDB?
  - Are you familiar with SQLITE copies of CCDB?
  - How do you expect to provide the calibration constants resulting from your work?

  Results:
  - Have you defined metrics to judge the quality of the calibration results?
  - Have you defined monitoring plots?

  Documentation:
  - What is the status of the documentation and tutorials for your calibration suite?
  - Are you ready to document the work done during the challenge, to contribute to the final report?
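For calibrators unfamiliar with SQLITE copies of the constants database, reading one reduces to standard SQL queries. The table and column layout below is a simplified stand-in for illustration, not the real CCDB schema:

```python
import os
import sqlite3
import tempfile

def read_constants(db_path, table):
    """Return {channel: value} from a constants table in an SQLite file."""
    with sqlite3.connect(db_path) as conn:
        # Table name interpolated for brevity; never do this with
        # untrusted input (SQL parameters cannot bind table names).
        rows = conn.execute(
            "SELECT channel, value FROM %s ORDER BY channel" % table
        ).fetchall()
    return dict(rows)

# Self-contained demo: write a fake gain table, then read it back.
demo = os.path.join(tempfile.gettempdir(), "demo_constants.sqlite")
with sqlite3.connect(demo) as conn:
    conn.execute("DROP TABLE IF EXISTS ftof_gain")
    conn.execute("CREATE TABLE ftof_gain (channel INTEGER, value REAL)")
    conn.executemany("INSERT INTO ftof_gain VALUES (?, ?)",
                     [(1, 10.2), (2, 9.7), (3, 10.0)])
constants = read_constants(demo, "ftof_gain")   # {1: 10.2, 2: 9.7, 3: 10.0}
```

Working from a local SQLITE snapshot avoids load on the central database during the challenge; the extracted constants would still be uploaded through the usual CCDB tools.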
