Interdisciplinary Workshop on Statistical and Analysis Methods in Nuclear, Particle and Astrophysics

Europe/Berlin
ECT*, European Centre for Theoretical Studies in Nuclear Physics and Related Areas, Villazzano, Italy

Strada delle Tabarelle 286, I-38123 Villazzano (TN) Italy
Andreas Müller (Excellence Cluster Universe)
Description
The goal of the meeting on "Statistical and Analysis Methods in Nuclear, Particle and Astrophysics" is to bring together experts on these methods from different communities. In this way, we would like to foster new collaborations and projects.

The three-day workshop starts with an introductory session for students, followed by three topical presentation sessions on astronomy & cosmology, particle physics, and astroparticle physics. Each topical session is followed by a students' session and a discussion session.

The venue is ECT*, the European Centre for Theoretical Studies in Nuclear Physics and Related Areas in Villazzano, Italy.

The participants belong to groups from the partner institutions of the Excellence Cluster Universe in Garching/Munich, i.e. the Technische Universität München, the Ludwig-Maximilians-Universität, the four Max Planck Institutes for Physics, Extraterrestrial Physics, Astrophysics and Plasma Physics, the European Southern Observatory, and the Leibniz-Rechenzentrum computing centre. Groups from ECT* are welcome to join, as is a contingent of participants from all over the world.

Registration is closed! 
http://www.ectstar.eu/node/781

Confirmed Speakers:
Chiara Arina (Institut d'Astrophysique de Paris, IAP)
Frederik Beaujean (LMU)
Michael Betancourt (University of Warwick)
Steve Biller (U Oxford)
Kyle Cranmer (New York U)
Fabrizia Guglielmetti (MPE)
Jens Jasche (TUM)
Balazs Kegl (LAL Orsay)
Franz Pröbst (MPP)
Benjamin Wandelt (Institut Lagrange de Paris)

Scientific Organising Committee:
Frederik Beaujean (LMU)
Hans Böhringer (MPE)
Allen Caldwell (MPP)
Torsten Enßlin (MPA)
Boris Grube (TUM)
Fabrizia Guglielmetti (MPE)
Andreas Müller (TUM), Chair

Schedule and abstract booklet: please see the links to the PDFs below.

This event is organised and funded by the Excellence Cluster Universe and ECT* in Trento.
Abstract booklet
Poster
Schedule (final version)
    • Registration
    • 12:15
      Lunch break
    • Introductory Lectures
    • 1
      Motivation
      Which way a coin tossed in the air will fall may be completely determined by the laws of physics. Predicting the trajectory, in order to know which face the coin shows when it lands, depends on many parameters (e.g. the angular momentum of the rotation, the force of the toss, the wind pressure at various instants during the rotation of the coin). Modelling it requires incorporating a random structure, and its outcome is uncertain. Going beyond gambling, we often come across events whose outcome is uncertain, and the evaluation of experimental data is quite complex. The uncertainty could be due to our inability to observe accurately all the inputs required to compute the outcome; it may be too expensive, or even counterproductive, to observe all the inputs. The uncertainty could also be due to the current level of understanding of the phenomenon. Quantification of uncertainty is one of the most pressing problems in the evaluation of experimental data. After an introduction to probability theory, a summary of Bayesian and frequentist principles is given. The material is supported with examples; a toy coin-toss sketch is included after this entry.
      Speaker: Fabrizia Guglielmetti
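      As a hedged illustration of the Bayesian and frequentist principles mentioned in the abstract (not part of the lecture material; the counts are invented), the following Python sketch contrasts a Beta-Binomial credible interval with a normal-approximation confidence interval for the bias of a coin.

      # Toy coin-bias example (hypothetical data): 7 heads in 10 tosses.
      from scipy import stats

      n, k = 10, 7                             # tosses, heads (made-up numbers)

      # Bayesian: flat Beta(1, 1) prior -> Beta(k + 1, n - k + 1) posterior.
      posterior = stats.beta(k + 1, n - k + 1)
      lo, hi = posterior.ppf([0.025, 0.975])   # central 95% credible interval
      print(f"posterior mean = {posterior.mean():.3f}, "
            f"95% credible interval = [{lo:.3f}, {hi:.3f}]")

      # Frequentist: maximum-likelihood estimate with a normal-approximation
      # (Wald) 95% confidence interval.
      p_hat = k / n
      se = (p_hat * (1 - p_hat) / n) ** 0.5
      z = stats.norm.ppf(0.975)
      print(f"MLE = {p_hat:.3f}, 95% confidence interval = "
            f"[{p_hat - z * se:.3f}, {p_hat + z * se:.3f}]")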
    • 2
      Probability theory
      (Abstract shared with the Motivation lecture above.)
      Speaker: Fabrizia Guglielmetti
    • 15:15
      Coffee break
    • 3
      Bayesian methods
      (Abstract shared with the Motivation lecture above.)
      Speaker: Fabrizia Guglielmetti
    • 4
      Frequentist methods
      (Abstract shared with the Motivation lecture above.)
      Speaker: Fabrizia Guglielmetti
    • 5
      Discussion
    • 19:30
      Dinner
    • Astronomy & Cosmology
      • 6
        Introduction to methods in cosmology (tbc)
        Speaker: Benjamin Wandelt
      • 7
        Galaxy catalogues (tbd)
        Speaker: Jens Jasche
      • 8
        D3PO code
        Speaker: Fabrizia Guglielmetti
      • 10:45
        Coffee break
      • 9
        Students' talks
    • 10
      Bayesian methods, not only for astronomical images
      Astronomical images are frequently difficult to analyse because they consist of a diffuse background with superposed celestial objects and are corrupted by effects due to instrumental complexity. Conventional methods, e.g. sliding windows and wavelet-based techniques, struggle with large variations in the background, with the detection of faint and extended sources, and with sources of complex morphology; they can introduce large systematic errors in object photometry and lose faint sources. In this talk, two forward-modelling methods (BSS and D3PO) are described that automatically identify point sources and diffuse emission, even when they lie along the same line of sight. The BSS technique (Guglielmetti, F. et al. 2009) is based on Bayesian mixture models, while D3PO (Selig, M. & Ensslin, T. 2013) is based on Information Field Theory (http://www.mpa-garching.mpg.de/ift/). Both techniques are applied to images at the high-frequency end of the electromagnetic spectrum (energies > 0.1 keV), where noise dominates the signal. A toy mixture-model sketch is included after this entry.
      Speaker: Fabrizia Guglielmetti
      Slides
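      A minimal sketch (this is not the BSS or D3PO code; rates, prior and threshold are invented) of the mixture-model idea behind BSS: each pixel's photon count is modelled as Poisson, either from background alone or from background plus a source, and Bayes' theorem gives the per-pixel probability of hosting a source.

      # Toy per-pixel source/background separation with a two-component
      # Poisson mixture (all numbers are hypothetical).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      b, s, p_src = 3.0, 12.0, 0.05   # background rate, source rate, prior source fraction

      # Simulate a small "image": most pixels background-only, a few with a source.
      has_source = rng.random(1000) < p_src
      counts = rng.poisson(b + s * has_source)

      # Posterior probability that a pixel hosts a source, given its count.
      like_bg = stats.poisson.pmf(counts, b)
      like_src = stats.poisson.pmf(counts, b + s)
      post_src = p_src * like_src / (p_src * like_src + (1 - p_src) * like_bg)

      detected = post_src > 0.9
      print(f"flagged {detected.sum()} pixels; true sources among them: "
            f"{(detected & has_source).sum()}")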
    • 11
      Bayesian methods, not only for astronomical images (part II)
      Speaker: Torsten Enßlin (MPA)
      Slides
    • 12
      4D Bayesian inference with large scale structure surveys
      Presently proposed and designed future cosmological probes and surveys permit us to anticipate the upcoming avalanche of cosmological information during the next decades. This increase of valuable observations needs to be accompanied by the development of efficient and accurate information-processing technology in order to analyse and interpret the data. Besides traditional systematics and uncertainties such as survey geometries and observational noise, modern data analysis needs to account for the complex statistical properties of gravitationally evolved matter fields and also has to provide corresponding uncertainty quantification. The analysis of the structure and evolution of our inhomogeneous Universe therefore requires solving non-linear statistical inference problems in very high dimensional parameter spaces, involving on the order of 10^7 or more parameters. For these reasons, in this talk Jens will address the problem of high dimensional Bayesian inference from cosmological data sets via the recently proposed BORG algorithm. This method couples an approximate model of structure formation to a Hybrid Monte Carlo algorithm, providing a fully probabilistic, physical model of the non-linearly evolved density field as probed by galaxy surveys. Besides highly accurate and detailed measurements of three dimensional cosmic density and velocity fields, this methodology also infers plausible formation histories for the observed large scale structure. In this talk he will give an overview of this promising path towards Bayesian chrono-cosmography, the subject of inferring the four dimensional state of our Universe from observations, and show first results from data applications. A toy Hamiltonian Monte Carlo sketch is included after this entry.
      Speaker: Jens Jasche
      Slides
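      The BORG algorithm itself is far beyond a snippet, but as a hedged illustration of its Hamiltonian (Hybrid) Monte Carlo building block, here is a minimal leapfrog HMC sampler for a toy two-dimensional Gaussian target (step size, path length and target are invented).

      # Minimal Hamiltonian Monte Carlo on a toy 2D Gaussian (illustrative only).
      import numpy as np

      rng = np.random.default_rng(0)

      def neg_log_p(q):                  # -log target: standard 2D Gaussian
          return 0.5 * q @ q

      def grad_neg_log_p(q):
          return q

      def hmc_step(q, eps=0.1, n_leapfrog=20):
          p = rng.standard_normal(q.shape)            # resample momentum
          q_new, p_new = q.copy(), p.copy()
          p_new -= 0.5 * eps * grad_neg_log_p(q_new)  # leapfrog integration
          for _ in range(n_leapfrog - 1):
              q_new += eps * p_new
              p_new -= eps * grad_neg_log_p(q_new)
          q_new += eps * p_new
          p_new -= 0.5 * eps * grad_neg_log_p(q_new)
          # Metropolis accept/reject on the total energy
          h_old = neg_log_p(q) + 0.5 * p @ p
          h_new = neg_log_p(q_new) + 0.5 * p_new @ p_new
          return q_new if rng.random() < np.exp(h_old - h_new) else q

      q = np.zeros(2)
      samples = []
      for _ in range(2000):
          q = hmc_step(q)
          samples.append(q)
      print("sample mean:", np.mean(samples, axis=0))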
    • 13
      Statistical Methods in Cosmology
      Cosmological statistics is full of contradictions: we are both awash in data... and yet there are fundamental limits to the information we can obtain. We are working on solid foundations that allow us to formulate powerful, well-motivated, simple physical models and priors... and yet non-linear effects on small scales block our access to an enormous treasure trove of modes. Like our particle physics colleagues we test fundamental physics; but unlike particle physics, cosmology is, for the most part, an observational rather than a laboratory science, so we often do not control our experiments. Ben will discuss a number of examples showing how cosmologists solve statistics problems to answer questions such as: 'How did the Universe begin? What is it made of? How did it come to look the way it does?'
      Speaker: Benjamin Wandelt
      Slides
    • 10:45
      Coffee break
    • 14
      Students' session
    • 12:15
      Lunch break
    • Particle physics
    • 15
      The frontier of analysis methods for particle physics
      Kyle will review the current statistical techniques being used for new physics searches at the LHC. He will identify weak points and challenges of the current methods and describe the efforts underway to improve upon them.
      Speaker: Kyle Cranmer
    • 16
      A Platform for Efficient Bayesian Inference
      At the frontier of applied statistics, Bayesian inference requires not only the building of a probabilistic model but also the efficient implementation of both the model and an algorithm capable of fitting the model. Stan is a platform for Bayesian inference that avoids bespoke inference by providing a powerful probabilistic programming language for the specification of models and a high-performance implementation of Hamiltonian Monte Carlo for fitting those models, ultimately allowing users to focus their efforts on model building and validation. In this talk Michael will review these features and discuss some recent applications in physics.
      Speaker: Michael Betancourt
      Slides
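      As a hedged usage sketch (assuming CmdStan and the cmdstanpy interface are installed; the model is the standard toy Bernoulli example, not taken from the talk), a Stan model is written in its own modelling language and fitted with Hamiltonian Monte Carlo from Python:

      # Toy Stan workflow via cmdstanpy (assumes CmdStan is installed).
      from pathlib import Path
      from cmdstanpy import CmdStanModel

      stan_code = """
      data { int<lower=0> N; array[N] int<lower=0, upper=1> y; }
      parameters { real<lower=0, upper=1> theta; }
      model { theta ~ beta(1, 1); y ~ bernoulli(theta); }
      """
      Path("bernoulli.stan").write_text(stan_code)

      model = CmdStanModel(stan_file="bernoulli.stan")   # compiles the model
      fit = model.sample(data={"N": 10, "y": [0, 1, 0, 0, 0, 0, 0, 0, 0, 1]},
                         chains=4, iter_sampling=1000)   # runs HMC (NUTS)
      print(fit.summary())                               # posterior summary for theta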
    • 17
      Multivariate analysis techniques (tbc)
      Speaker: Balazs Kegl
    • 15:45
      Coffee break
    • 18
      The Confidence Game
      The behaviours of confidence/credible interval constructions are explored, particularly in the region of low statistics where methods diverge most. A number of challenges are highlighted, such as the treatment of nuisance parameters, and common misconceptions associated with such constructions. An informal survey of the literature suggests that confidence intervals are not always defined in relevant ways and are too often misinterpreted and/or misapplied. This can lead to seemingly paradoxical behaviours and flawed comparisons regarding the relevance of experimental results. Examples are drawn from a variety of experiments. Therefore, a more pragmatic strategy is warranted which recognises that, while it is critical to objectively convey the information content of the data, there is also a strong desire to derive bounds on model parameter values and a natural instinct to interpret things this way. Accordingly, an attempt is made to put aside philosophical biases in favour of a practical view to propose a more transparent and self-consistent approach that better addresses these issues. A toy upper-limit comparison is sketched after this entry.
      Speaker: Steve Biller
      Slides
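      As a hedged toy of how constructions can diverge at low statistics (numbers are invented; none of this is from the talk), compare a classical one-sided upper limit with a flat-prior Bayesian upper limit on a Poisson signal mean with known background:

      # Toy 90% C.L. upper limits on a Poisson signal mean s with known
      # background b and observed count n (made-up numbers).
      from scipy import stats
      from scipy.optimize import brentq

      n, b, cl = 2, 3.0, 0.90   # observed events, expected background, confidence level

      # Classical (Neyman, one-sided): find s with P(N <= n | b + s) = 1 - cl.
      s_classical = brentq(lambda s: stats.poisson.cdf(n, b + s) - (1 - cl), 0, 50)

      # Bayesian, flat prior on s >= 0: the posterior CDF follows from an
      # incomplete-gamma identity, F(s) = 1 - P(N <= n | b + s) / P(N <= n | b).
      def posterior_cdf(s):
          return 1.0 - stats.poisson.cdf(n, b + s) / stats.poisson.cdf(n, b)

      s_bayes = brentq(lambda s: posterior_cdf(s) - cl, 0, 50)

      print(f"classical 90% upper limit: s < {s_classical:.2f}")
      print(f"Bayesian  90% upper limit: s < {s_bayes:.2f}")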
    • 19
      Students' talks
    • 20
      Discussion
    • 19:30
      Dinner
    • 21
      The Confidence Game
      (Abstract as for the earlier talk of the same title.)
      Speaker: Steve Biller
    • Astroparticle physics
      • 22
        Another look at confidence intervals (tbc)
        Speaker: Steve Biller
      • 23
        CRESST experiment and frequentist analysis
        Speaker: Franz Pröbst
      • 24
        Bayesian analysis of direct dark matter detection experiments
        Speaker: Chiara Arina
      • 10:45
        Coffee break
      • 25
        Students' talks
      • 26
        Discussion
    • 27
      Variational Bayes: a versatile workhorse
      Fred will provide a pedagogical introduction to the variational Bayes framework. While it is typically used for clustering, he will show a new algorithm to sample from and integrate multimodal functions that exploits the strengths of variational Bayes. This algorithm is of immediate use to many particle-physics analyses, particularly in B physics. A toy variational-Bayes clustering sketch is included after this entry.
      Speaker: Frederik Beaujean
      Slides
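      As a hedged illustration of the clustering use of variational Bayes mentioned above (this is not the new algorithm from the talk; data and settings are invented), scikit-learn's BayesianGaussianMixture fits a mixture while letting superfluous components fade away:

      # Toy variational-Bayes Gaussian mixture on made-up bimodal data.
      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      rng = np.random.default_rng(0)
      x = np.concatenate([rng.normal(-2.0, 0.5, 500),
                          rng.normal(3.0, 1.0, 500)]).reshape(-1, 1)

      # Ask for more components than needed; the variational treatment drives
      # the weights of the unneeded components towards zero.
      vb = BayesianGaussianMixture(n_components=5, weight_concentration_prior=0.1,
                                   max_iter=500, random_state=0).fit(x)
      print("component weights:", np.round(vb.weights_, 3))
      print("component means:  ", np.round(vb.means_.ravel(), 2))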
    • 28
      CRESST experiment and analysis
      The CRESST experiment searches for dark matter with scintillating CaWO4 crystals operated as cryogenic calorimeters at about 15 mK, employing a simultaneous measurement of a phonon and a scintillation signal. The signal of the phonon channel serves for a precise measurement of the energy deposited in the crystal, while the ratio of scintillation light to deposited energy is used to discriminate different types of interacting particles and thus to distinguish possible nuclear recoil signal events from the dominant backgrounds. In the talk we will discuss essential steps and methods used in the CRESST analysis, from the digitized pulses through energy calibration and signal selection to the dark matter sensitivity curves. The statistical method for deriving the upper limit curves, nowadays adopted by most dark matter experiments, will be discussed. A toy light-yield discrimination sketch is included after this entry.
      Speaker: Franz Pröbst
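      As a hedged toy of the light-yield discrimination described in the abstract (all numbers are invented and do not represent CRESST data or the actual analysis), events with a low ratio of scintillation light to phonon energy are selected as nuclear-recoil candidates:

      # Toy light-yield discrimination (invented populations, not CRESST data).
      import numpy as np

      rng = np.random.default_rng(2)

      # Simulated light yields (scintillation light / phonon energy) for two
      # populations: electron recoils (~1) and quenched nuclear recoils (~0.1).
      ly_er = rng.normal(1.0, 0.15, 5000)   # dominant electron-recoil background
      ly_nr = rng.normal(0.1, 0.05, 50)     # rare nuclear-recoil candidates

      light_yield = np.concatenate([ly_er, ly_nr])
      is_nr = np.concatenate([np.zeros(5000, bool), np.ones(50, bool)])

      # Simple acceptance cut for nuclear-recoil candidates.
      accepted = light_yield < 0.4
      print(f"accepted events: {accepted.sum()}, "
            f"true nuclear recoils among them: {(accepted & is_nr).sum()}, "
            f"electron-recoil leakage: {(accepted & ~is_nr).sum()}")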
    • 29
      Bayesian analysis of direct dark matter detection experiments
      Bayesian statistical methods offer a simple and consistent framework for incorporating uncertainties into a multi-parameter inference problem. In this talk we apply these methods to the most recent direct dark matter experimental results. We consider the simplest scenarios of spin-independent WIMP scattering, and infer the WIMP mass and cross-section from the data with the essential systematic uncertainties folded into the analysis. In the same vein, we investigate the impact of astrophysical uncertainties on the preferred WIMP parameters. We then discuss and interpret the results to quantitatively estimate the disagreement between the controversial hints of detection claimed by CDMS-Si, DAMA, CoGeNT and CRESST with respect to the upper bounds of XENON100 and LUX. We conclude with an example of an interaction that reconciles the controversial claims with the most stringent exclusion bounds. A toy nuisance-parameter marginalisation sketch is included after this entry.
      Speaker: Chiara Arina
      Slides
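      As a hedged sketch of folding a systematic (nuisance) uncertainty into a Bayesian counting analysis (all numbers are invented and bear no relation to the experiments named above), the Poisson likelihood is marginalised over an uncertain detection efficiency before deriving a credible upper limit:

      # Toy marginalisation over a nuisance parameter (invented numbers).
      import numpy as np
      from scipy import stats
      from scipy.integrate import trapezoid

      n_obs, b = 3, 1.2                      # observed events, expected background
      exposure = 100.0                       # arbitrary units
      eff_mean, eff_sigma = 0.8, 0.1         # uncertain detection efficiency

      sigma_grid = np.linspace(0, 0.2, 400)  # signal cross-section grid (arbitrary units)
      eff_grid = np.linspace(0.4, 1.0, 200)
      eff_prior = stats.norm.pdf(eff_grid, eff_mean, eff_sigma)

      # Marginal likelihood: integrate the Poisson likelihood over the efficiency prior.
      mu = b + np.outer(sigma_grid, eff_grid) * exposure
      like = stats.poisson.pmf(n_obs, mu)
      marginal_like = trapezoid(like * eff_prior, eff_grid, axis=1)

      # Flat prior on the cross-section -> normalised posterior and 90% upper limit.
      posterior = marginal_like / trapezoid(marginal_like, sigma_grid)
      cdf = np.cumsum(posterior) * (sigma_grid[1] - sigma_grid[0])
      print("90% credible upper limit on the cross-section:",
            round(float(sigma_grid[np.searchsorted(cdf, 0.90)]), 4))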
    • 10:45
      Coffee break
    • 30
      Students' session
    • 12:15
      Lunch break