Year of publication: 2005 Author: B. K. Kale Publisher: Pages: 0 ISBN: 1842652192
Description
After a brief historical perspective, A First Course on Parametric Inference discusses the basic concept of a sufficient statistic and the classical approach based on the minimum variance unbiased estimator. There is a separate chapter on simultaneous estimation of several parameters. Large-sample theory of estimation, based on consistent asymptotically normal estimators obtained by the method of moments, percentiles, and the method of maximum likelihood, is also introduced. Tests of hypotheses for finite samples are developed within the classical Neyman-Pearson theory, and its connection with the Bayesian approach is pointed out. Hypothesis testing and confidence interval techniques are then developed, leading to likelihood ratio tests, score tests, and tests based on maximum likelihood estimators.
New to the Second Edition:
· A chapter on Nonparametric Statistical Inference, giving tests based on the empirical distribution function and some elementary tests based on ranks about the median...