To be(t) or not to be(t): A Bayesian approach to statistical data analysis

Bibliographic Details
Main Author: Pisano Silvia
Format: Article
Language: English
Published: EDP Sciences 2025-01-01
Series: EPJ Web of Conferences
Online Access: https://www.epj-conferences.org/articles/epjconf/pdf/2025/16/epjconf_essena2025_01005.pdf
Description
Summary: The process of learning from observation is the founding step of Science. When it goes in the direction of collecting empirical observations and arriving at general rules, it is called "induction", and its goal is to infer the causes of a given phenomenon from its effects. The opposite process, going from the causes to the effects, is called "deduction". Both approaches suffer from the fact that any observation carries an uncertainty due to the finite resolution of any measurement. This leads to a question that is fundamental for a quantitative analysis of the universe: what is the real value of a measured quantity if the measurement is affected by a given experimental uncertainty? Statistics aims at disciplining the process of answering this question. However, in classical statistics the information one obtains is the probability of the effects assuming a given cause, e.g. the probability of observing a given data set if a specific value for a parameter is assumed (useful when comparing, for example, parameters estimated through different models). The likelihood method, for instance, is based on the maximization of this probability. The goal, though, is to have a more general probability, in which the probability of the causes is also taken into account. This leads to Bayes' theorem and to Bayesian statistics, which provides a framework based on the concept of prior beliefs and on the definition of conditional probability, i.e. the probability that an event will occur under the hypothesis that another event has occurred. In this approach, the roles played by the different elements entering the definition of probability - priors, posteriors, likelihood - are properly organized so as to identify the previous knowledge about a given phenomenon, how it gets updated by any new measurement, and how the process can be applied recursively so as to exploit any new information on the relevant quantities inspected in a given measurement.
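As a short sketch of the relations the summary refers to, written in standard notation (the symbols \(\theta\) for the unknown cause or parameter and \(D\) for the observed data are assumed here, not taken from the article), Bayes' theorem reads
\[
P(\theta \mid D) \;=\; \frac{P(D \mid \theta)\, P(\theta)}{P(D)},
\qquad
P(D) \;=\; \int P(D \mid \theta)\, P(\theta)\, \mathrm{d}\theta ,
\]
where \(P(\theta)\) is the prior, \(P(D \mid \theta)\) the likelihood, and \(P(\theta \mid D)\) the posterior. The recursive updating mentioned above corresponds to using the posterior from a first data set \(D_1\) as the prior for a second one \(D_2\):
\[
P(\theta \mid D_1, D_2) \;\propto\; P(D_2 \mid \theta)\, P(\theta \mid D_1).
\]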
ISSN: 2100-014X