Computing for the DUNE Long-Baseline Neutrino Oscillation Experiment

Bibliographic Details
Main Author: Schellman, Heidi
Format: Article
Language: English
Published: EDP Sciences, 2020-01-01
Series: EPJ Web of Conferences, Vol. 245 (2020), Art. 11002
DOI: 10.1051/epjconf/202024511002
ISSN: 2100-014X
Online Access: https://www.epj-conferences.org/articles/epjconf/pdf/2020/21/epjconf_chep2020_11002.pdf

Description
This paper is based on a talk given at the Computing in High Energy Physics (CHEP) conference in Adelaide, Australia, in November 2019. It is partly intended to explain the context of DUNE computing to computing specialists. The Deep Underground Neutrino Experiment (DUNE) collaboration consists of over 180 institutions from 33 countries. The experiment is now in preparation, with commissioning of the first 10 kt fiducial-volume liquid argon TPC expected over the period 2025-2028 and a long data-taking run with four modules expected from 2029 onward. An active prototyping program is already in place, including a short test-beam run of a 700 t, 15,360-channel single-phase prototype at the CERN Neutrino Platform in late 2018, with tests of a similarly sized dual-phase detector scheduled for mid-2019. The 2018 test-beam run was a valuable live test of our computing model. The detector produced raw data at rates of up to 2 GB/s; these data were stored at full rate on tape at CERN and Fermilab and replicated at sites in the UK and the Czech Republic. In total, 1.2 PB of raw data from beam and cosmic triggers were recorded and reconstructed during the six-week test-beam run. Baseline predictions for the full DUNE detector, which begins taking data in the late 2020s, are 30-60 PB of raw data per year. In contrast to traditional HEP computational problems, DUNE's liquid argon TPC data consist of simple but very large (many GB) 2D data objects that share many characteristics with astrophysical images. This presents opportunities to apply advances in machine learning and pattern recognition and makes DUNE a frontier user of high-performance computing facilities capable of massively parallel processing.
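
As a rough check on the numbers quoted above, the sketch below treats a single prototype readout as a 2D (channel × time-tick) array and works through the rates. The 15,360 channels, 2 GB/s peak rate, 1.2 PB total, and six-week run come from the abstract; the 6,000-tick readout window and 2-byte ADC word are illustrative assumptions, not figures from the paper.

    # Back-of-the-envelope data volumes for a single-phase prototype readout.
    CHANNELS = 15_360         # readout channels (from the abstract)
    TICKS = 6_000             # time ticks per readout window (assumed)
    BYTES_PER_SAMPLE = 2      # 12-bit ADC stored in a 16-bit word (assumed)

    event_bytes = CHANNELS * TICKS * BYTES_PER_SAMPLE
    print(f"Uncompressed 2D event size: {event_bytes / 1e6:.0f} MB")       # ~184 MB

    peak_rate = 2e9                  # bytes/s, peak raw-data rate from the abstract
    print(f"Events/s sustainable at peak: {peak_rate / event_bytes:.1f}")  # ~10.9

    run_seconds = 6 * 7 * 24 * 3600  # six-week test-beam run
    total_bytes = 1.2e15             # 1.2 PB recorded in total
    print(f"Average rate over the run: {total_bytes / run_seconds / 1e6:.0f} MB/s")  # ~331 MB/s

Under these assumptions a single prototype readout is a few hundred MB; the full 10 kt far-detector modules have roughly an order of magnitude more channels and longer readout windows, which is how individual data objects reach the many-GB scale the abstract quotes and why image-style, massively parallel processing is attractive.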