A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935

Bibliographic Details
Main Author: Hald, Anders
Format: eBook
Language: English
Published: New York, NY: Springer New York, 2007
Edition: 1st ed. 2007
Series:Sources and Studies in the History of Mathematics and Physical Sciences
Subjects: Statistics; Statistical Theory and Methods; Probability Theory; Probabilities; Mathematics; History; History of Mathematical Sciences
Online Access: https://doi.org/10.1007/978-0-387-46409-1
Collection: Springer eBooks 2005- (collection details: see MPG.ReNa)
LEADER 04154nmm a2200337 u 4500
001 EB000355352
003 EBX01000000000000000208404
005 00000000000000.0
007 cr|||||||||||||||||||||
008 130626 ||| eng
020 |a 9780387464091 
100 1 |a Hald, Anders 
245 0 0 |a A History of Parametric Statistical Inference from Bernoulli to Fisher, 1713-1935  |h Electronic resource  |c by Anders Hald 
250 |a 1st ed. 2007 
260 |a New York, NY  |b Springer New York  |c 2007, 2007 
300 |a XIV, 226 p. 11 illus  |b online resource 
505 0 |a The Three Revolutions in Parametric Statistical Inference -- The Three Revolutions in Parametric Statistical Inference -- Binomial Statistical Inference -- James Bernoulli’s Law of Large Numbers for the Binomial, 1713, and Its Generalization -- De Moivre’s Normal Approximation to the Binomial, 1733, and Its Generalization -- Bayes’s Posterior Distribution of the Binomial Parameter and His Rule for Inductive Inference, 1764 -- Statistical Inference by Inverse Probability -- Laplace’s Theory of Inverse Probability, 1774–1786 -- A Nonprobabilistic Interlude: The Fitting of Equations to Data, 1750–1805 -- Gauss’s Derivation of the Normal Distribution and the Method of Least Squares, 1809 -- Credibility and Confidence Intervals by Laplace and Gauss -- The Multivariate Posterior Distribution -- Edgeworth’s Genuine Inverse Method and the Equivalence of Inverse and Direct Probability in Large Samples, 1908 and 1909 -- Criticisms of Inverse Probability -- The Central Limit Theorem and Linear Minimum Variance Estimation by Laplace and Gauss -- Laplace’s Central Limit Theorem and Linear Minimum Variance Estimation -- Gauss’s Theory of Linear Minimum Variance Estimation -- Error Theory. Skew Distributions. Correlation. Sampling Distributions -- The Development of a Frequentist Error Theory -- Skew Distributions and the Method of Moments -- Normal Correlation and Regression -- Sampling Distributions Under Normality, 1876–1908 -- The Fisherian Revolution, 1912–1935 -- Fisher’s Early Papers, 1912–1921 -- The Revolutionary Paper, 1922 -- Studentization, the F Distribution, and the Analysis of Variance, 1922–1925 -- The Likelihood Function, Ancillarity, and Conditional Inference 
653 |a Statistical Theory and Methods 
653 |a Statistics  
653 |a Probability Theory 
653 |a History 
653 |a Mathematics 
653 |a Probabilities 
653 |a History of Mathematical Sciences 
041 0 7 |a eng  |2 ISO 639-2 
989 |b Springer  |a Springer eBooks 2005- 
490 0 |a Sources and Studies in the History of Mathematics and Physical Sciences 
028 5 0 |a 10.1007/978-0-387-46409-1 
856 4 0 |u https://doi.org/10.1007/978-0-387-46409-1?nosfx=y  |x Publisher  |3 Full text
082 0 |a 519.2 
520 |a This is a history of parametric statistical inference written by Anders Hald, one of the most important historians of statistics of the 20th century. The book can be viewed as a follow-up to his two most recent books, although the present text is much more streamlined and contains new analyses of many ideas and developments. Unlike his other books, which were encyclopedic in nature, this book can be used for a course on the topic; the only prerequisite is a basic course in probability and statistics. The book is divided into five main sections: binomial statistical inference; statistical inference by inverse probability; the central limit theorem and linear minimum variance estimation by Laplace and Gauss; error theory, skew distributions, correlation, and sampling distributions; and the Fisherian Revolution, 1912-1935. Throughout the chapters, the author provides lively biographical sketches of many of the main characters, including Laplace, Gauss, Edgeworth, Fisher, and Karl Pearson. He also examines the roles played by De Moivre, James Bernoulli, and Lagrange, and he provides an accessible exposition of the work of R. A. Fisher. The book will be of interest to statisticians, mathematicians, undergraduate and graduate students, and historians of science.