Subject: STA 251
Title: Topics in Statistical Methods and Models
Units: 4.0
School: College of Letters and Science (LS)
Department: Statistics (STA)
Effective Term: 2002 Fall
- General Description
Learning Activities
Lecture - 3.0 hours
Discussion - 1.0 hours
Description
Topics may include Bayesian analysis, nonparametric and semiparametric regression, sequential analysis, the bootstrap, statistical methods in high dimensions, reliability, spatial processes, inference for stochastic processes, stochastic methods in finance, empirical processes, change-point problems, asymptotics for parametric, nonparametric and semiparametric models, nonlinear time series, and robustness. May be repeated for credit if topics differ; only with consent of the graduate advisor.
Prerequisites
STA 231B or the equivalent.
Repeat Credit
May be repeated for credit if topics differ; only with consent of the graduate advisor.
Expanded Course Description
Summary of Course Content:
Topics will be chosen from the following areas depending on faculty and student interests in any particular offering: Bayesian analysis, regression analysis (parametric and nonparametric), sequential analysis, survival analysis, theory of the bootstrap, statistical methods in high dimensions, reliability theory, spatial processes, inference for stochastic processes, stochastic models in finance, empirical processes, change-point problems, asymptotic theory for parametric, nonparametric and semiparametric models, advanced nonlinear time series, and robustness.
Illustrative Reading:
Depending on the particular topics to be discussed in a given quarter, there will be a number of standard book and monograph references, but mostly research journal articles.
Potential Course Overlap:
Some of the topics listed above are touched on briefly in Statistics 231B-C. There could also be partial overlap with Statistics 250. The variety of topics in both Statistics 250 and 251 is wide enough, however, that overlap can be avoided entirely; a list of topics discussed in these courses will be kept on file so that none occurs.
- Example: Fall 2015, Bootstrap Methods
- Prerequisites: This course expects a background in probability theory and mathematical statistics equivalent to STA 231AB. Specific tools needed to understand the theory include: laws of large numbers for triangular arrays, central limit theorems for triangular arrays, and basic weak convergence theory for random elements in function spaces. The ability to program in R, or an equivalent computing environment, is critical for using bootstrap methods in class assignments.
Course Sketch: The course will treat significant parts of the theory and practice of bootstrap methods. Several of the Instructor's papers will serve as sources for theory. The textbook will serve as a source for algorithms and R code. Some additional references are listed below. Topics covered include:
-The bootstrap world (see the illustrative sketch after this list).
-Definitions of "the bootstrap works". Consistency? Higher-order accuracy? Something else?
-Bootstrap confidence sets and bootstrap tests.
-Sufficient conditions under which the bootstrap works as sample size increases.
-Bootstrap applications to problems that foiled classical statistical theory.
-Necessary and sufficient conditions under which the bootstrap works.
-Nonclassical bootstrap methods for modern regularized estimators.
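The following is a minimal illustrative sketch, not taken from the course materials: it hand-codes the resampling idea behind "the bootstrap world", re-drawing samples from the empirical distribution and recomputing a statistic to approximate its sampling distribution. The data and statistic are assumptions made purely for the example.

# Illustrative sketch only (assumed example, not from the syllabus):
# approximate the sampling distribution of the sample median by resampling
# with replacement from the observed data.
set.seed(1)
x <- rexp(40, rate = 1)                        # hypothetical observed sample
B <- 2000                                      # number of bootstrap resamples

boot_medians <- replicate(B, median(sample(x, replace = TRUE)))

sd(boot_medians)                               # bootstrap estimate of the standard error
quantile(boot_medians, c(0.025, 0.975))        # percentile interval for the median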
Textbook: Davison, A. C. and D. V. Hinkley. Bootstrap Methods and their Application. (1997). Cambridge.
Pertinent Papers by the Instructor: You may access most of these papers from a campus IP address, using https://scholar.google.com or http://www.ams.org/mathscinet/ to locate links.
1. Bootstrap methods in statistics. Jber. d. Dt. Math.-Verein. 86 (1984) 14-30.
2. (with M. S. Srivastava) Bootstrap tests and confidence regions for functions of a covariance matrix. Ann. Statist. 13 (1985) 95-115. Correction note in 15 (1987) 470-471.
3. Simulated power functions. Ann. Statist. 14 (1986) 151-173.
4. (with P. W. Millar) Confidence sets for a multivariate distribution. Ann. Statist. 14 (1986) 431-443.
5. Prepivoting to reduce level error of confidence sets. Biometrika 74 (1987) 457-468.
6. Balanced simultaneous confidence sets. J. Amer. Statist. Assoc. 83 (1988) 679-686.
7. Prepivoting test statistics: a bootstrap view of the Behrens-Fisher problem, the Bartlett adjustment, and nonparametric analogs. J. Amer. Statist. Assoc. 83 (1988) 687-697.
8. Stein confidence sets and the bootstrap. Statistica Sinica 5 (1994) 109-127.
9. Diagnosing bootstrap success. Ann. Inst. Statist. Math. 49 (1997) 1-24.
10. The impact of the bootstrap on statistical algorithms and theory. Statistical Science 18 (2003) 175-184.
Additional Bootstrap References:
A1. Beran, R. and G. R. Ducharme. Asymptotic Theory for Bootstrap Methods in Statistics. (1991). Publications CRM, Université de Montréal.
A2. Hall, P. The Bootstrap and Edgeworth Expansion. (1992). Springer.
A3. Efron, B. and R. J. Tibshirani. An Introduction to the Bootstrap. (1993). Chapman and Hall.
A4. Statistical Science 18, issue 2 (2003). Features papers reviewing the bootstrap on its 25th anniversary.
A5. Canty, A. J., A. C. Davison, D. V. Hinkley and V. Ventura. Bootstrap diagnostics and remedies. Can. J. Statist. 34 (2006) 5-27.
Software: The R library boot gives the current version of the textbook's bootstrap code. The site http://cran.r-project.org offers binaries and documentation for R. The Art of R Programming by Norman Matloff (2011) describes the use of R as a programming language.
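As a hedged illustration of the boot package mentioned above (not an excerpt from the textbook's code), a nonparametric bootstrap confidence interval can be computed roughly as follows; the bivariate data and the correlation statistic are assumptions chosen only for the example.

# Minimal sketch of the boot package interface (assumed example data).
library(boot)

set.seed(1)
d <- data.frame(u = rnorm(60))
d$v <- 0.5 * d$u + rnorm(60)                           # hypothetical bivariate sample

cor_stat <- function(data, idx) cor(data$u[idx], data$v[idx])   # statistic on a resample

b <- boot(data = d, statistic = cor_stat, R = 2000)    # nonparametric bootstrap
boot.ci(b, type = c("perc", "bca"))                    # percentile and BCa confidence intervals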
Grading: The course grade will be based on a theory midterm and on a final project involving both theory and computation. Assignments during the course will prepare you for these.
- Example: Winter 2015, Concentration Inequalities and High-Dimensional Statistics
- Basic concentration inequalities. Uniform laws of large numbers. Metric entropy and suprema of stochastic processes. Concentration of measure. Random matrices and covariance estimation. Sparse linear models in high dimensions. Bootstrap methods in high dimensions.
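As a hedged illustration of the first topic above (not part of the course outline), the following sketch compares the Monte Carlo tail probability of a sample mean of bounded variables with the corresponding Hoeffding bound; the data-generating choices are assumptions for the example only.

# Illustrative sketch (assumed example): for [0, 1]-valued variables, Hoeffding's
# inequality gives P(|mean - E mean| >= t) <= 2 * exp(-2 * n * t^2).
set.seed(1)
n <- 200; t <- 0.05; reps <- 10000

hits <- replicate(reps, abs(mean(runif(n)) - 0.5) >= t)
mean(hits)               # Monte Carlo estimate of the tail probability
2 * exp(-2 * n * t^2)    # Hoeffding upper bound (here 2 * exp(-1), about 0.74)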
- Example: Spring 2015, High-dimensional Statistical Inference
- Topics:
-Examples of high-dimensional problems: Problems in image processing, bioinformatics, wireless communication, econometrics and finance, atmospheric science.
-Elements of classical multivariate analysis: Normal and Wishart distributions; distribution of eigenvectors and eigenvalues of a Wishart matrix. Elements of fixed-p, large-n asymptotic theory.
-High-dimensional regression: Effects of dimensionality on estimation of regression coefficients; sparse estimation, with emphasis on l1-penalized (lasso-type) estimators (see the sketch following this topic list).
-Estimation of covariance and precision matrix: Methods for regularized estimation of covariance and precision matrices, including tapering, thresholding and sparse penalization.
-Elements of random matrix theory: Asymptotic regime in which p/n converges to a limit in (0, ∞) as n → ∞. Limiting empirical distribution of the eigenvalues of the sample covariance matrix. Central limit theorems for linear spectral statistics. Applications.
-High dimensional PCA: Sparse principal component analysis; Sparse canonical correlation analysis.
-Miscellaneous topics (time permitting): Random projections; large random graphs; clustering and classification of high-dimensional data.
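The following is a minimal hedged sketch of the lasso bullet above, not part of the syllabus; it uses the glmnet package and simulated data chosen purely for illustration.

# Illustrative sketch (assumed example): l1-penalized (lasso) regression with
# p > n using glmnet, with the penalty level chosen by cross-validation.
library(glmnet)

set.seed(1)
n <- 100; p <- 500
X <- matrix(rnorm(n * p), n, p)            # design matrix with far more columns than rows
beta <- c(rep(2, 5), rep(0, p - 5))        # sparse truth: only 5 nonzero coefficients
y <- as.vector(X %*% beta + rnorm(n))

cvfit <- cv.glmnet(X, y)                              # cross-validated lasso path
bhat <- as.numeric(coef(cvfit, s = "lambda.min"))     # intercept followed by p coefficients
which(bhat[-1] != 0)                                  # indices of coefficients kept by the lasso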
Grading: Grades will be based on the following criteria: (a) course participation, which involves regular attendance at lectures, participation in the discussions, and solving occasional assignments; and (b) a short project (computational or theoretical) on a small problem chosen in consultation with the instructor. Students can work on the projects in groups of up to two. Evaluation of the projects will be based on the project report and an in-class presentation at the end of the quarter.