[arXiv] BigDataFr recommends: Big Data in HEP: A comprehensive use case study

[…] Experimental Particle Physics has been at the forefront of analyzing the world's largest datasets for decades. The HEP community was the first to develop suitable software and computing tools for this task. In recent times, new toolkits and systems, collectively called Big Data technologies, have emerged to support the analysis of Petabyte- and Exabyte-scale datasets in industry. While the principles of data analysis in HEP have not changed (filtering and transforming experiment-specific data formats), these new technologies use different approaches and promise a fresh look at the analysis of very large datasets, potentially reducing the time-to-physics with increased interactivity. In this talk, we present an active LHC Run 2 analysis, searching for dark matter with the CMS detector, as a testbed for Big Data technologies. We directly compare the traditional NTuple-based analysis with an equivalent analysis using Apache Spark on the Hadoop ecosystem and beyond. In both cases, we start the analysis with the official experiment data formats and produce publication-quality physics plots. We will discuss the advantages and disadvantages of each approach and give an outlook on further studies needed. […]
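To make the "filter and transform" pattern concrete, here is a minimal PySpark sketch of the kind of workflow the paper compares against the traditional NTuple-based approach. It is not the authors' code: the input path, the column names (met, muon_pt), the cut values, and the binning are illustrative assumptions, and it presumes the experiment data have already been converted to a Spark-readable columnar format such as Parquet.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("hep-filter-transform-sketch").getOrCreate()

# Hypothetical path to events already converted to Parquet on HDFS.
events = spark.read.parquet("hdfs:///user/analysis/events.parquet")

# Filter: keep events passing simple, illustrative kinematic cuts.
selected = events.filter((F.col("met") > 200.0) & (F.col("muon_pt") > 30.0))

# Transform: bin the missing transverse energy and count events per bin;
# the resulting small table can be plotted downstream.
binned = (
    selected
    .withColumn("met_bin", (F.col("met") / 25.0).cast("int") * 25)
    .groupBy("met_bin")
    .count()
    .orderBy("met_bin")
)

binned.show()
spark.stop()

The point of the comparison in the paper is that this declarative, cluster-parallel style replaces the explicit event loop over NTuples while performing the same selection and aggregation steps.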

Read more
Oliver Gutsche (1), Matteo Cremonesi (1), Peter Elmer (2), Bo Jayatilaka (1), Jim Kowalkowski (1), Jim Pivarski (2), Saba Sehrish (1), Cristina Mantilla Suárez (3), Alexey Svyatkovskiy (2), Nhan Tran (1)
Source: arXiv.org
(1) Fermi National Accelerator Laboratory
(2) Princeton University
(3) Fermi National Accelerator Laboratory, now Johns Hopkins University
