Course Information for STOR 654 (Fall 2012)
Methods of Theoretical Statistics

 

Class meetings: Tuesday and Thursday 3:30 - 4:45, in Hanes 125.

Prerequisites: Elementary probability (see below for more details) and at least one semester of real analysis.

Instructor: Andrew B. Nobel, Department of Statistics and Operations Research
 
Office: Hanes 308   Phone: 962-1352.

Office Hours: Mondays 1:30pm-2:30pm

Teaching assistant: Guan Yu

Office: Hanes B40   Email: guanyu@live.unc.edu 

Grader's Office Hours:  Mondays and Wednesdays 11am-12pm


Homework policy:   Homework problems will be assigned periodically throughout the semester. Each assignment will be graded; late or missed assignments will receive a grade of zero. Students are welcome to discuss the homework problems with other members of the class, but should prepare their final answers on their own. If you have any questions concerning the grading of homework, please speak first with the TA. If you are absent from class when an assignment is returned, you can pick up your homework from the TA during their office hours.
 

Exams:   There will be one in-class midterm exam and a comprehensive final examination.  Both will be closed book and closed notes.

Midterm 1: Tuesday, 9 October

Final: The default time is that listed in the course directory. The exam will be in class.


Grading:  
Course grades will be calculated as follows:

Homework: 15%
Midterm 1: 35%
Final: 50%
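
For concreteness, here is a minimal Python sketch of the weighted-average computation above. It assumes each component is reported on a 0-100 scale, and the function name is illustrative only.

    # Illustrative sketch of the course grade computation.
    # Assumes each component score is on a 0-100 scale.
    def course_score(homework, midterm, final):
        return 0.15 * homework + 0.35 * midterm + 0.50 * final

    # Example: 0.15*90 + 0.35*80 + 0.50*85 = 84.0
    print(round(course_score(90, 80, 85), 2))  # prints 84.0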



Required Text:
The lectures will (roughly) follow Chapters 6-9 of the book "Statistical Inference", Second Edition, by G. Casella and R.L. Berger, Duxbury, 2002.


Prerequisites:  Students should be familiar with basic undergraduate probability, including the topics listed below. 

  1. Basic properties of probabilities and random variables.  Probability mass functions and probability density functions.  Cumulative distribution functions.

  2. Binomial, geometric, hypergeometric, Poisson, and negative binomial distributions. 

  3. Gaussian, uniform, exponential, double exponential, gamma, chi-squared and beta distributions.  F and t distributions.

  4. Joint probability mass/density functions, independence.

  5. Expected values, moments, variance and covariance.

  6. Distributions of functions of a random variable.  General change of variables formula.
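
As a concrete reminder of item 6, the one-dimensional change-of-variables formula states (informally) that if Y = g(X), where g is strictly monotone and differentiable and X has density f_X, then

\[
f_Y(y) \;=\; f_X\bigl(g^{-1}(y)\bigr)\,\left|\frac{d}{dy}\, g^{-1}(y)\right| .
\]

See Chapter 2 of Casella and Berger for the precise statement and hypotheses.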

The prerequisite material is covered in Casella and Berger, Sections 1.1-6, 2.1-3, 3.1-3, 4.1-3, 4.5-6, and 5.1-2.  (Note that these sections also contain some material that is not prerequisite for the course.)  Students may also wish to consult the textbook "A First Course in Probability", by Sheldon Ross, or the "bootcamp" lecture notes.


Syllabus:  
The course is intended to introduce students to some of the basic ideas and techniques of (non-asymptotic) statistical inference from a theoretical point of view. The course is not measure-theoretic, but we will be mathematically rigorous whenever possible. The following is an overview of the topics covered. 

  1. Order statistics, Stirling's formula, moment generating functions, convex functions and their basic properties

  2. Basic inequalities from analysis: Jensen, Hölder, Cauchy-Schwarz, and association inequalities for monotone functions

  3. Basic probability inequalities: Markov, Chebyshev, and the Chernoff bound (informal statements appear after this list)

  4. Elements of decision theory: admissibility, Bayes and minimax procedures.

  5. Principles of data reduction: sufficiency and minimal sufficiency (from Chapter 6 of C-B)

  6. Introduction to point estimation: maximum likelihood, method of moments (from Chapter 7 of C-B)

  7. Introduction to hypothesis testing: likelihood ratio tests, Neyman-Pearson theorem (from Chapter 8 of C-B)

  8. Introduction to confidence intervals: inverting test statistics, pivotal quantities (from Chapter 9 of C-B)

  9. Exponential inequalities for sums of independent random variables: Hoeffding, Bennett, and Bernstein

  10. Elementary concentration: Efron-Stein inequality, martingale differences, and McDiarmid's inequality

  11. Conditional expectations as L_2 projections: definitions and basic properties.

  12. Expectations and variance-covariance matrices for random vectors

  13. The multivariate Gaussian distribution: definition and basic properties, conditional distributions, independence of linear and quadratic forms

  14. Representation of correlated Gaussian random variables, elementary Gaussian comparison results (Slepian's Lemma)
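
As a preview of item 3, the three basic probability inequalities can be stated informally as follows (precise hypotheses will be given in lecture). For a nonnegative random variable X and t > 0,

\[
P(X \ge t) \;\le\; \frac{E[X]}{t} \qquad \text{(Markov)},
\]

for a random variable X with finite variance and t > 0,

\[
P\bigl(|X - E[X]| \ge t\bigr) \;\le\; \frac{\mathrm{Var}(X)}{t^2} \qquad \text{(Chebyshev)},
\]

and for any real random variable X and real t,

\[
P(X \ge t) \;\le\; \inf_{\lambda > 0} e^{-\lambda t}\, E\bigl[e^{\lambda X}\bigr] \qquad \text{(Chernoff bound)}.
\]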




Recommended Texts: The following texts offer good coverage of some of the material in the course, at a more advanced mathematical level.


"Mathematical Statistics", Second Edition, by P.J. Bickel and K.A. Doksum, Prentice Hall, 2001.


"Theory of Point Estimation'', by E.L. Lehmann, Wadsworth, 1991.


"Mathematical Statistics", Second Edition, by J. Shao, Springer, 2003.



Other Texts of Potential Interest:

"Elements of Information Theory", by T. Cover and J. Thomas, Wiley, 1991.  

"Combinatorial Methods in Density Estimation", by L. Devroye and G. Lugosi, Springer, 2001.
(More on probability inequalities.  See also the nice lecture notes on concentration inequalities by Boucheron, Lugosi and Massart, available from Lugosi's web page.)

"Principles of Mathematical Analysis", Third Edition, by W. Rudin, McGraw Hill. (This book is a good reference for real analysis.)

"Linear Algebra and its Applications", Third Edition, by G. Strang, Saunders, 1988. (This book gives a good basic overview of linear algebra.)

"Asymptotic Statistics", by A.W. van der Vaart, Cambridge University Press, 2000.  (A good advanced text on theoretical statistics.)

"Linear Statistical Inference and its Applications", by C.R. Rao, Wiley, 1973.
 

"The Cauchy-Schwartz Master Class'', J.M. Steele, Cambridge, 2004.  (A very well written and insightful book on basic inequalities, which can be read with relatively little background.  Contains many interesting problems, with solutions.)

"Multivariate Analysis", by K.V. Mardia, J.T. Kent and J.M. Bibby, Academic Press, 1979.