Foundations of statistics
Statistics models the collection, organization, analysis, interpretation, and presentation of data, and applies mathematical reasoning to practical problems. Conclusions drawn from statistical analysis typically involve a degree of uncertainty, as they represent the probability of an event occurring. Statistics is fundamental to scientific disciplines that involve predicting or classifying events on the basis of large sets of data, and it is an integral part of fields such as machine learning, bioinformatics, genomics, and economics.
Statistics also encompasses the identification and study of statistical laws: recurring behaviors observed across a wide variety of datasets. One common example is the Pareto principle, which states that roughly 80% of effects result from 20% of causes; it is therefore sometimes called the 80/20 rule.
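To make the 80/20 rule concrete, the following sketch draws synthetic values from a heavy-tailed Pareto distribution and checks what fraction of the largest items accounts for 80% of the total. The sample size and the shape parameter of about 1.16 (which is known to approximate an exact 80/20 split) are illustrative assumptions, not values from this article.

```python
import random

# A minimal sketch of checking the 80/20 rule on synthetic data.
# The heavy-tailed "amounts" below are illustrative, not real data;
# a Pareto shape of ~1.16 approximates the classic 80/20 split.
random.seed(0)
amounts = [random.paretovariate(1.16) for _ in range(10_000)]

# Sort contributions from largest to smallest and accumulate
# their share until 80% of the total is covered.
amounts.sort(reverse=True)
total = sum(amounts)
cumulative, count = 0.0, 0
for x in amounts:
    cumulative += x
    count += 1
    if cumulative >= 0.8 * total:
        break

print(f"Top {count / len(amounts):.0%} of items account for 80% of the total.")
```

On data that follows the law, the printed fraction lands near 20%; on more evenly distributed data it would be much larger, which is one simple way to probe whether a dataset exhibits this statistical law.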
Debates over the foundations of statistical inference include Bayesian inference versus frequentist inference; the distinction between Fisher's "significance testing" and the Neyman-Pearson "hypothesis testing"; and whether the likelihood principle should be followed. Some of these issues have been debated without resolution for as long as two centuries.
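The practical difference between the two schools can be seen on a toy problem. The sketch below is a minimal illustration, assuming made-up data of 12 successes in 40 trials and a uniform Beta(1, 1) prior: the frequentist analysis reports a point estimate with a 95% confidence interval, while the Bayesian analysis reports a posterior distribution, summarized here by a credible interval.

```python
import math
import random

# Illustrative data, not from the article: 12 successes in 40 trials.
successes, trials = 12, 40

# Frequentist: point estimate plus a 95% Wald confidence interval.
p_hat = successes / trials
se = math.sqrt(p_hat * (1 - p_hat) / trials)
ci = (p_hat - 1.96 * se, p_hat + 1.96 * se)
print(f"Frequentist: estimate = {p_hat:.3f}, "
      f"95% CI = ({ci[0]:.3f}, {ci[1]:.3f})")

# Bayesian: a uniform Beta(1, 1) prior updates to a Beta posterior.
# The 95% credible interval is approximated by posterior sampling.
random.seed(0)
draws = sorted(random.betavariate(1 + successes, 1 + trials - successes)
               for _ in range(100_000))
lo, hi = draws[int(0.025 * len(draws))], draws[int(0.975 * len(draws))]
print(f"Bayesian:    posterior mean = {(1 + successes) / (2 + trials):.3f}, "
      f"95% credible interval = ({lo:.3f}, {hi:.3f})")
```

The numbers come out similar here, but the interpretations differ: the confidence interval is a statement about the long-run behavior of the procedure, whereas the credible interval is a direct probability statement about the parameter given the data and the prior.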
Bandyopadhyay & Forster describe four statistical paradigms: classical statistics (or error statistics), Bayesian statistics, likelihood-based statistics, and the use of the Akaike Information Criterion as a statistical basis. More recently, Judea Pearl reintroduced a formal mathematics for attributing causality in statistical systems that addresses fundamental limitations of both Bayesian and Neyman-Pearson methods.