Data Processing in the Physical Experiments
- Department of Physics
This basic course in informatics for physics gives an overview of experimental data-processing methods in particle physics experiments. Students are shown data analysis at each level of the experiment.
Requires knowledge of a basic physics course, detectors, and electronics.
- Syllabus of lectures:
1. Structure of the data flow in an experiment; data-processing needs at the different levels
2. Signals at the detector level, data formats for different detectors, digitization, compression, selection of the useful signal
3. Data transformations used for data compression and analysis
4. Data transport from the detector to the data acquisition system, event building, events, sub-events, the DATE format
5. Event selection: the trigger; trigger levels
6. Selection algorithms in the L2 trigger and in the High-Level Trigger
7. Storage of experimental data on disk and tape; ALICE case study
8. Quality and consistency checks; online and offline processing
9. Computing model of the experiment for calibration and physics analysis
10. Distributed computing in data analysis - GRID
11. Usage of the GRID system, certificates, architecture of the ALIEN system
12. Interactive distributed analysis in PROOF
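As an illustration of the multi-level event selection covered in lectures 5 and 6, the cascade of trigger levels can be sketched as a chain of increasingly selective filters. This is a minimal sketch with hypothetical thresholds and event fields, not the actual ALICE trigger logic:

```python
# Illustrative multi-level trigger cascade (hypothetical thresholds and
# event fields; not the actual ALICE trigger logic).
import random

def l1_trigger(event):
    # Fast hardware-style cut: require minimal detector activity.
    return event["n_hits"] > 10

def l2_trigger(event):
    # Slower cut on partially reconstructed quantities.
    return event["total_energy"] > 5.0

def high_level_trigger(event):
    # Software selection on fully built events.
    return event["n_tracks"] >= 2

def select(events):
    # Each level only sees events accepted by the previous one,
    # progressively reducing the data rate before storage.
    for trigger in (l1_trigger, l2_trigger, high_level_trigger):
        events = [e for e in events if trigger(e)]
    return events

random.seed(1)
events = [
    {"n_hits": random.randint(0, 50),
     "total_energy": random.uniform(0.0, 20.0),
     "n_tracks": random.randint(0, 5)}
    for _ in range(1000)
]
accepted = select(events)
print(f"accepted {len(accepted)} of {len(events)} events")
```

Each stage in a real experiment trades latency for selectivity: L1 decisions are made in hardware within microseconds, while the High-Level Trigger runs full reconstruction software on built events.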
- Syllabus of tutorials:
- Study Objective:
Signal and data processing in the experiment, from the signal level to analysis on the distributed GRID.
Orientation in the construction and operation of high-energy particle physics experiments.
- Study materials:
Kenniston W. Lord: CDP Review Manual: A Data Processing Handbook, Van Nostrand Reinhold, 1986
ALICE DAQ Project: ALICE DAQ and ECS User's Guide, CERN Note, 2006
Federico Carminati: Overview of the Computing Framework, ALICE TDR, CERN Note, 2006
- Further information:
- No timetable has been prepared for this course
- The course is a part of the following study plans: