
Posters

Latest additions:
2021-11-26
14:53
CERN - FCC Exhibition

Reference: Poster-2021-1057
Keywords:  FCC  Exhibition
Created: 2021. -36 p

Following the adoption of the European Strategy for Particle Physics Update by the CERN Council in 2020, CERN has been mandated to carry out a technical and financial feasibility study for a Future Circular Collider. This exhibition presents photographs and information about this future project, which would make it possible to maintain the pre-eminence of Europe in the field of particle physics and to push back the frontiers of knowledge.

© CERN Geneva

2021-11-26
10:28
New Generation Offline Software for the LHCb Upgrade I
Reference: Poster-2021-1056
Created: 2021. -1 p
Creator(s): Ferrillo, Martina

The LHCb detector is undergoing a comprehensive upgrade for data taking in the LHC's Run 3, which is scheduled to begin in 2022. The increased data rate in Run 3 poses significant data-processing and data-handling challenges for the LHCb experiment. The offline computing and dataflow model is consequently also being upgraded to cope with the factor-30 increase in data volume and the associated demand for user-data samples of ever-increasing size. Coordinating these efforts is the charge of the newly created Data Processing and Analysis (DPA) project. The DPA project is responsible for ensuring that the LHCb experiment can efficiently exploit the Run 3 data: it deals with the data from the online system through central skimming/slimming (a process known as "Sprucing") and subsequently produces analyst-level ntuples with a centrally managed production system (known as "Analysis Productions"), using improved analysis tools and infrastructure for continuous integration and validation. It is a multi-disciplinary project involving collaboration between computing experts, trigger experts and physics-analysis experts. This talk will present the evolution of the data-processing model, followed by a review of the various activities of the DPA project. The associated computing, storage and network requirements are also discussed.
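
The skimming/slimming terminology can be illustrated with a toy sketch (the event records and function names below are purely hypothetical, not the LHCb data model): skimming removes whole events failing a selection, while slimming removes unneeded information from the events that are kept.

# Hypothetical illustration of skimming vs. slimming ("Sprucing"-style reduction).
def skim(events, selection):
    # Skimming: keep only the events that pass a selection.
    return [ev for ev in events if selection(ev)]

def slim(events, branches):
    # Slimming: keep only the requested fields (branches) of each event.
    return [{k: ev[k] for k in branches if k in ev} for ev in events]

# Toy raw output of the online system, reduced to an analyst-level "ntuple".
raw_events = [
    {"pt": 12.3, "mass": 5.28, "nhits": 40, "raw_banks": "..."},
    {"pt": 0.7,  "mass": 1.86, "nhits": 12, "raw_banks": "..."},
]
ntuple = slim(skim(raw_events, lambda ev: ev["pt"] > 2.0), ["pt", "mass"])
print(ntuple)  # -> [{'pt': 12.3, 'mass': 5.28}]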

Related links:
11th LHC students poster session
© CERN Geneva

2021-11-26
10:25
Quantum Machine Learning at LHCb
Reference: Poster-2021-1055
Created: 2021. -1 p
Creator(s): Nicotra, Davide

At the LHCb experiment, it is mandatory to identify jets produced by $b$ and $\bar{b}$ quarks (b-jet charge tagging), since it is fundamental in several Physics studies, e.g. the measurement of the $b$-$\bar{b}$ production asymmetry, which could be sensitive to New Physics channels. Being a classification problem, Machine Learning techniques, such as Deep Neural Networks, have been used to solve this problem. In this work, we present a new approach to b-jet charge tagging based on Quantum Machine Learning techniques, trained on LHCb simulated data. Performance comparisons with other classical algorithms are also presented.
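
As an illustration of the approach, the following is a minimal sketch of a variational quantum classifier for a binary jet-charge label, assuming the PennyLane library; the feature encoding, circuit and training step are illustrative assumptions, not the model presented on the poster.

import pennylane as qml
from pennylane import numpy as np

n_qubits = 4  # one qubit per (toy) jet feature
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, features):
    # Encode the jet features as single-qubit rotation angles.
    qml.AngleEmbedding(features, wires=range(n_qubits))
    # Trainable entangling layers (the variational part of the model).
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Expectation value in [-1, 1], read as the b vs. bbar charge score.
    return qml.expval(qml.PauliZ(0))

def cost(weights, X, y):
    # Mean squared error between circuit outputs and labels (+1 = b, -1 = bbar).
    preds = np.array([circuit(weights, x) for x in X])
    return np.mean((preds - y) ** 2)

shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
weights = np.random.random(shape)  # trainable parameters
opt = qml.GradientDescentOptimizer(stepsize=0.1)

score = circuit(weights, np.array([0.1, 0.2, 0.3, 0.4]))  # toy jet, untrained score
# Given feature vectors X and labels y, one optimisation step would be:
# weights = opt.step(lambda w: cost(w, X, y), weights)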

Related links:
11th LHC students poster session
© CERN Geneva

2021-09-24
14:15
Investigation of Radiation-Induced Effects in a Front-end ASIC Designed for Photon Counting Sensor Systems
Reference: Poster-2021-1054
Created: 2021. -1 p
Creator(s): Placinta, Vlad-Mihai

This work outlines the measurements performed to evaluate the second SPACIROC generation in ionizing-radiation environments, i.e. particle beams of ions and protons, and X-rays. The SPACIROCs are front-end ASICs designed for the readout requirements of photomultiplier technologies such as SiPMs and MaPMTs. Several radiation-induced effects were observed, but they proved to be benign application-wise. The threshold LET for SEUs was measured and two cross-sections for different LETs are provided. At extremely high dose rates (~100 rad/s) and TID above 50 krad, proton- and X-ray-induced TID effects were observed; however, a room-temperature annealing process was found to mitigate the harmful TID effects within 24 hours.
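
For context (standard definitions, not necessarily the exact parameterization used in this work), the per-device SEU cross-section quoted for a given LET is

$\sigma_{\mathrm{SEU}} = N_{\mathrm{SEU}} / \Phi$,

where $N_{\mathrm{SEU}}$ is the number of observed upsets and $\Phi$ the particle fluence in particles/cm$^2$, and its dependence on the LET $L$ is commonly fitted with a Weibull curve

$\sigma(L) = \sigma_{\mathrm{sat}}\left[1 - \exp\left(-\left((L - L_{\mathrm{th}})/W\right)^{s}\right)\right]$ for $L > L_{\mathrm{th}}$,

with $L_{\mathrm{th}}$ the threshold LET reported above and $\sigma_{\mathrm{sat}}$ the saturation cross-section.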

© CERN Geneva

2021-09-08
16:56
QCD physics measurements at the LHCb experiment
Reference: Poster-2021-1053
Created: 2021. -1 p
Creator(s): Zuliani, Davide

LHCb is a spectrometer that covers the forward region of proton-proton collisions, corresponding to the pseudorapidity range $2 < \eta < 5$. In this unique phase space, LHCb can perform tests of perturbative and non-perturbative QCD models by studying the production of heavy-flavor quarks, such as charm and top quarks. In this context, the production of a Z boson in association with a c-jet can be studied to measure the intrinsic charm content of the proton. Moreover, LHCb can test phenomenological models of soft QCD processes by measuring the production of forward hadrons in pp collisions.
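
For reference, the pseudorapidity is defined from the polar angle $\theta$ with respect to the beam axis as

$\eta = -\ln\left[\tan(\theta/2)\right]$,

so the quoted acceptance $2 < \eta < 5$ corresponds to polar angles between roughly 13 mrad and 270 mrad from the beam line.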

© CERN Geneva

2021-07-30
15:26
Luminosity measurement at LHCb
Reference: Poster-2021-1052
Created: 2021. -1 p
Creator(s): Van Dijk, Maarten

The LHCb detector, designed to measure the decays of heavy hadrons, is a forward-arm spectrometer. Its efficiency can be degraded by collisions with high occupancy; therefore, a technique known as "luminosity levelling" has been used since the start of LHC Run 1, allowing the instantaneous luminosity to be controlled and stabilized with a precision of 5%. During LHC Runs 1 and 2, this technique employed counters from the hardware-based trigger level to determine the instantaneous luminosity. These counters are calibrated in dedicated data-taking runs a few times per year. The combination of van der Meer scans and of beam profiles obtained from beam-gas interactions, unique to LHCb, allowed LHCb to obtain in Run 1 the most precise luminosity measurement ever achieved at a bunched hadron collider. During LHC Run 3, the upgraded LHCb detector will see a fivefold increase in luminosity. Dedicated luminosity detectors have been designed and are being commissioned for use in Run 3 and Run 4. This talk will review the methods used in Run 1 and introduce the new approach being developed for the coming LHC runs.
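
Schematically, a van der Meer scan determines the instantaneous luminosity of a single colliding bunch pair from directly measurable beam parameters,

$\mathcal{L} = \dfrac{f_{\mathrm{rev}}\, N_1 N_2}{2\pi\, \Sigma_x \Sigma_y}$,

where $f_{\mathrm{rev}}$ is the revolution frequency, $N_1$ and $N_2$ are the bunch populations, and $\Sigma_x$, $\Sigma_y$ are the effective beam-overlap widths extracted from the scan curves; this calibration fixes the absolute scale of the trigger-level counters mentioned above.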

Related links:
Conference EPS-HEP2021
© CERN Geneva

2021-07-30
15:06
Real-time alignment procedure at the LHCb experiment
Reference: Poster-2021-1051
Created: 2021. -1 p
Creator(s): Reiss, Florian

The LHCb detector at the LHC is a general-purpose detector in the forward region with a focus on studying decays of c- and b-hadrons. For Run 3 of the LHC (data taking foreseen from 2022), LHCb will take data at an instantaneous luminosity of 2 × 10^{33} cm^{-2} s^{-1}, five times higher than in Run 2 (2015-2018). To cope with the harsher data-taking conditions, LHCb will deploy a purely software-based trigger with a 30 MHz input rate. The software trigger at LHCb is composed of two stages: in the first stage the selection is based on a fast and simplified event reconstruction, while in the second stage a full event reconstruction is used. This leaves room to perform a real-time alignment and calibration after the first trigger stage, so that an offline-quality detector alignment is available in the second stage of the trigger. The detector alignment is an essential ingredient for achieving the best detector performance in the full event reconstruction. The alignment of the whole tracking system of LHCb is evaluated in real time by an automatic iterative procedure. The data collected at the start of the fill are processed in a few minutes to update the alignment before running the second stage of the trigger. This in turn allows the trigger output data to be used for physics analysis without a further offline event reconstruction. The motivation for a real-time alignment of the LHCb detector in Run 3 is discussed from both the technical and operational points of view. Specific challenges of this strategy are presented, as well as the working procedures of the framework.
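
Schematically (a generic formulation of track-based alignment, not necessarily the exact implementation described on the poster), the alignment constants $\alpha$ are those that minimize the total track-fit quality,

$\chi^2(\alpha) = \sum_{\mathrm{tracks}} r(\alpha)^{T}\, V^{-1}\, r(\alpha)$,

where $r(\alpha)$ are the track-hit residuals and $V$ their covariance; each iteration of the automatic procedure applies a Newton-like update $\Delta\alpha = -H^{-1} g$ with $g = \partial\chi^2/\partial\alpha$ and $H = \partial^2\chi^2/\partial\alpha^2$, stopping once the constants are stable.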

Related links:
Conference EPS-HEP2021
© CERN Geneva

2021-07-09
17:09
The superthin fixed target for the LHCb experiment in Run 4
Reference: Poster-2021-1050
Created: 2021. -1 p
Creator(s): Chernyshenko, Serhii

Fixed-target studies at LHC energies ($\sqrt{s_{NN}}$ of about 70-120 GeV) are considered a powerful tool for exploring the QCD phase diagram in a poorly known domain of densities and temperatures, with a variety of possible features of the equation of state (EOS) in the entrance and exit channels of high-energy heavy-ion collisions. By implementing a gas-filled cell set-up, SMOG2, with the unique feature of taking data in collider and fixed-target mode simultaneously, the LHCb Collaboration plans to contribute significantly here during Run 3 [1].
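
The quoted energy range follows from the usual fixed-target kinematics: neglecting the projectile and target masses relative to the beam energy,

$\sqrt{s_{NN}} \simeq \sqrt{2\, E_{\mathrm{beam}}\, m_N}$,

so a 7 TeV proton beam on a gas target gives $\sqrt{2 \times 7000 \times 0.938}\ \mathrm{GeV} \approx 115\ \mathrm{GeV}$, while a Pb beam at about 2.56 TeV per nucleon gives roughly 70 GeV, spanning the 70-120 GeV window mentioned above.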

© CERN Geneva

2021-06-14
17:10
Best Practice Guide on Pollutant Dispersion Simulations with the Commercial Tool "ANSYS Fluent" at CERN
Reference: Poster-2021-1048
Keywords:  Large Eddy Simulation (LES), neutral Atmospheric Boundary Layer (ABL), Synthetic Turbulence Generator (STG), Monin-Obukhov similarity theory, pollutant dispersion, discrete phase model (DPM), Computational Fluid Dynamics (CFD)
Created: 2021. -7 p
Creator(s): Kauflin, Uwe; Battistin, Michele; La Mendola, Saverio; Leitl, Bernd

Large Eddy Simulations (LES) are of growing interest for numerous engineering applications in which an accurate flow prediction is necessary. This paper searches for the optimum mesh resolution for numerical simulations that reliably predict the dispersion of pollutants in the lower part of the Atmospheric Boundary Layer (ABL). For the dispersion of pollutants, turbulent quantities have been assessed at several distances from the release point and compared to each other. Areas close to release points located at low altitudes are given particular importance, because air-pollutant concentrations there can be too high for people present at such places. To achieve a realistic prediction of the flow and of pollutant concentrations close to populated areas, LES is preferred over Reynolds-Averaged Navier-Stokes (RANS) models (Vita et al., 2020). A mesh resolution of 0.5 m is recommended at distances from the release point shorter than 40 m. Near the release point, physical effects like building downwash and horizontal plume enlargement due to the downstream wake region of buildings have a direct impact on pollutant concentrations and particle trajectories. In built-up areas at intermediate distances, where the dispersion of the plume is directly influenced by the buildings in their given configuration and where the production of turbulent energy is high, a mesh resolution of 1.5 m is suggested. In areas where the plume is already dispersed and geometrical obstacles are rare, a mesh resolution of 3 m or more is sufficient. In these areas, the dissipation of energy and the transport of particles (mean quantities) that determine the flow are less affected by the mesh size.

Related links:
20th International Conference on Harmonisation within Atmospheric Dispersion Modelling for Regulatory Purposes
© CERN Geneva

2021-05-28
12:08
A large Scintillating Fibre Tracker for LHCb
Reference: Poster-2021-1047
Created: 2021. -1 p
Creator(s): Berninghoff, Daniel Alexander

The LHCb detector is currently being upgraded to cope with higher instantaneous luminosities and to read out data at 40 MHz using a trigger-less read-out system. The new main tracker consists of 250 µm-thick scintillating fibres (SciFi) and covers an area of 340 m². The tracker provides a spatial resolution for charged particles better than 80 µm. The scintillation light is recorded with arrays of multi-channel silicon photomultipliers (SiPMs). A custom ASIC is used to digitize the SiPM signals, and subsequent digital electronics performs clustering and data compression. Single detector modules are mounted on so-called C-frames (3 m × 6 m) which will provide the mechanical support and the necessary services. The serial assembly of the 12 large frames, each comprising 50,000 SiPM channels, is progressing and the first detector elements have been commissioned. This presentation will cover the development, construction and commissioning results of the detector.
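
As a rough benchmark (a generic estimate, not a number from the poster), a detector with 250 µm segmentation and purely binary readout has an intrinsic resolution of

$\sigma \approx d/\sqrt{12} = 250\ \mu\mathrm{m}/\sqrt{12} \approx 72\ \mu\mathrm{m}$,

which sets the scale of the quoted better-than-80 µm performance; in practice the clustering of the SiPM signals has to recover this in the presence of light sharing between neighbouring channels and of noise.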

© CERN Geneva

Focus on:
Open Days 2013 Posters (58)
Open Days 2019 Posters (299)