Jonas Glombitza (Author), Andrej Filipčič (Author), J. P. Lundquist (Author), Samo Stanič (Author), Serguei Vorobiov (Author), Danilo Zavrtanik (Author), Marko Zavrtanik (Author), Lukas Zehrer (Author)

Abstract

The measurement of the mass composition of ultra-high-energy cosmic rays is a prime challenge in astroparticle physics. The most detailed information on the composition is obtained from measurements of the depth of the air-shower maximum, Xmax, with fluorescence telescopes, which can be operated only during clear and moonless nights. Using deep neural networks, it is now possible for the first time to perform an event-by-event reconstruction of Xmax with the Surface Detector (SD) of the Pierre Auger Observatory. Previously recorded data can therefore be analyzed for information on Xmax and thus on the cosmic-ray composition. Since the SD operates with a duty cycle of almost 100% and its event selection is less strict than that of the Fluorescence Detector (FD), the gain in statistics with respect to the FD is almost a factor of 15 for energies above 10^19.5 eV. In this contribution, we introduce the neural network designed specifically for the SD of the Pierre Auger Observatory. We evaluate its performance using three different hadronic interaction models, verify its functionality using Auger hybrid measurements, and find that the method can extract mass information at the event level.
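As an illustration of the kind of model the abstract refers to, the sketch below sets up a minimal event-level Xmax regressor on SD-like inputs (per-station signal time traces and arrival times). It is not the network presented in the paper: the 9x9 station grid, the 120-bin trace length, and every layer choice and variable name are assumptions made purely for illustration, using standard Keras layers.

from tensorflow.keras import layers, Model

N_GRID = 9       # assumed: stations arranged on a simple 9x9 grid around the shower core
TRACE_LEN = 120  # assumed: number of time bins per station signal trace

def build_xmax_regressor():
    # One signal time trace per surface-detector station on the grid.
    traces = layers.Input(shape=(N_GRID, N_GRID, TRACE_LEN), name="station_traces")
    # One trigger (arrival) time per station.
    times = layers.Input(shape=(N_GRID, N_GRID, 1), name="arrival_times")

    # Summarize each trace independently with shared 1D convolutions over time.
    x = layers.Reshape((N_GRID * N_GRID, TRACE_LEN, 1))(traces)
    x = layers.TimeDistributed(layers.Conv1D(16, 7, strides=2, activation="relu"))(x)
    x = layers.TimeDistributed(layers.Conv1D(32, 7, strides=2, activation="relu"))(x)
    x = layers.TimeDistributed(layers.GlobalAveragePooling1D())(x)  # -> (stations, 32)
    x = layers.Reshape((N_GRID, N_GRID, 32))(x)

    # Combine per-station summaries with the arrival-time map and let 2D
    # convolutions capture the shower footprint across the array.
    x = layers.Concatenate()([x, times])
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)

    # Single regression output: the depth of shower maximum Xmax (in g/cm^2).
    xmax = layers.Dense(1, name="xmax")(x)
    return Model(inputs=[traces, times], outputs=xmax)

model = build_xmax_regressor()
model.compile(optimizer="adam", loss="mse")
# Training would use air-shower simulations with known Xmax as labels, e.g.:
# model.fit([trace_array, time_array], xmax_labels, validation_split=0.1)

The actual analysis is trained on simulations from several hadronic interaction models and cross-checked against Auger hybrid (FD) measurements, as stated in the abstract; the sketch above only conveys the general input-to-Xmax structure.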

Keywords

Pierre Auger Observatory; indirect detection; surface detection; ground array; ultra-high energy; cosmic rays; composition; neural network; machine learning

Details

Language: English
Year of publication: 2022
Typology: 1.08 - Published scientific conference contribution
Organization: UNG - University of Nova Gorica
UDC: 539.1
COBISS: 166307331
ISSN: 1824-8039

Other information

Type of work (COBISS): Not categorized
Pages: pp. 1-12
Date of publication: 2022
DOI: 10.22323/1.395.0359
ID: 20010305