Nitis Mukhopadhyay – Probability and Statistical Inference
This gracefully organized textbook presents the rigorous theory of probability and statistical inference in the style of a tutorial, using worked examples, exercises, numerous figures and tables, and computer simulations to develop and illustrate concepts.
Beginning with an introduction to the basic ideas and techniques in probability theory and progressing to more rigorous topics, Probability and Statistical Inference:
1. studies the Helmert transformation for normal distributions and the waiting time between failures for exponential distributions
2. develops notions of convergence in probability and distribution
3. spotlights the central limit theorem (CLT) for the sample variance
4. introduces sampling distributions and the Cornish-Fisher expansions
5. concentrates on the fundamentals of sufficiency, information, completeness, and ancillarity
6. explains Basu’s Theorem as well as location, scale, and location-scale families of distributions
7. covers moment estimators, maximum likelihood estimators (MLE), Rao-Blackwellization, and the Cramér-Rao inequality
8. discusses uniformly minimum variance unbiased estimators (UMVUE) and the Lehmann-Scheffé Theorems
9. focuses on the Neyman-Pearson theory of most powerful (MP) and uniformly most powerful (UMP) tests of hypotheses, as well as confidence intervals
10. includes the likelihood ratio (LR) tests for the mean, variance, and correlation coefficient
11. summarizes Bayesian methods
12. describes the monotone likelihood ratio (MLR) property
13. handles variance stabilizing transformations
14. provides a historical context for statistics and statistical discoveries
15. showcases great statisticians through biographical notes
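One of the results highlighted above, the CLT for the sample variance, is easy to see in a simulation. The sketch below (illustrative only, not taken from the book) uses the standard fact that for i.i.d. data with variance σ² and fourth central moment μ₄, √n(S² − σ²) is approximately N(0, μ₄ − σ⁴) for large n; the specific sample size, replication count, and normal data-generating distribution are assumptions for the demonstration.

```python
# Illustrative simulation of the CLT for the sample variance:
# sqrt(n) * (S^2 - sigma^2) -> N(0, mu4 - sigma^4) as n grows.
import numpy as np

rng = np.random.default_rng(0)
n, reps = 500, 20_000
sigma2 = 4.0                 # true variance of the N(0, 4) data (assumed)
mu4 = 3 * sigma2 ** 2        # fourth central moment of a normal law

samples = rng.normal(0.0, np.sqrt(sigma2), size=(reps, n))
s2 = samples.var(axis=1, ddof=1)          # unbiased sample variance S^2
z = np.sqrt(n) * (s2 - sigma2) / np.sqrt(mu4 - sigma2 ** 2)

# z should be approximately standard normal: mean near 0, variance near 1
print(round(z.mean(), 3), round(z.var(), 3))
```

Repeating the experiment with non-normal data only changes μ₄ in the standardization; the limiting normality itself persists whenever the fourth moment is finite.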