LDR | | 00000nam u2200205 4500 |
001 | | 000000432726 |
005 | | 20200224134332 |
008 | | 200131s2019 ||||||||||||||||| ||eng d |
020 | |
▼a 9781085794572 |
035 | |
▼a (MiAaPQ)AAI13884200 |
040 | |
▼a MiAaPQ
▼c MiAaPQ
▼d 247004 |
082 | 0 |
▼a 001 |
100 | 1 |
▼a Giordano, Ryan. |
245 | 10 |
▼a On the Local Sensitivity of M-Estimation: Bayesian and Frequentist Applications. |
260 | |
▼a [S.l.]:
▼b University of California, Berkeley,
▼c 2019. |
260 | 1 |
▼a Ann Arbor:
▼b ProQuest Dissertations & Theses,
▼c 2019. |
300 | |
▼a 213 p. |
500 | |
▼a Source: Dissertations Abstracts International, Volume: 81-04, Section: B. |
500 | |
▼a Advisor: Jordan, Michael I. |
502 | 1 |
▼a Thesis (Ph.D.)--University of California, Berkeley, 2019. |
506 | |
▼a This item must not be sold to any third party vendors. |
520 | |
▼a This thesis uses the local sensitivity of M-estimators to address a number of extant problems in Bayesian and frequentist statistics. First, by exploiting a duality from the Bayesian robustness literature between sensitivity and covariances, I provide significantly improved covariance estimates for mean field variational Bayes (MFVB) procedures at little extra computational cost. Prior to this work, applications of MFVB have arguably been limited to prediction problems rather than inference problems for lack of reliable uncertainty measures. Second, I provide practical finite-sample accuracy bounds for the "infinitesimal jackknife" (IJ), a classical measure of local sensitivity to an empirical process. In doing so, I bridge a gap between classical IJ theory and recent machine learning practice, showing that stringent classical conditions for the consistency of the IJ can be relaxed for restricted but useful classes of weight vectors, such as those of leave-K-out cross validation. Finally, I provide techniques to quantify the sensitivity of the inferred number of clusters in Bayesian nonparametric (BNP) unsupervised clustering problems to the form of the Dirichlet process prior. By considering local sensitivity to be an approximation to global sensitivity rather than a measure of robustness per se, I provide tools with considerably improved ability to extrapolate to different priors. Because each of these diverse applications is based on the same formal technique---the Taylor series expansion of an M-estimator---this work captures in a unified way the computational difficulties associated with each, and I provide open-source tools in Python and R to assist in their computation. |
590 | |
▼a School code: 0028. |
650 | 4 |
▼a Statistics. |
650 | 4 |
▼a Artificial intelligence. |
690 | |
▼a 0463 |
690 | |
▼a 0800 |
710 | 20 |
▼a University of California, Berkeley.
▼b Statistics. |
773 | 0 |
▼t Dissertations Abstracts International
▼g 81-04B. |
773 | |
▼t Dissertations Abstracts International |
790 | |
▼a 0028 |
791 | |
▼a Ph.D. |
792 | |
▼a 2019 |
793 | |
▼a English |
856 | 40 |
▼u http://www.riss.kr/pdu/ddodLink.do?id=T15491352
▼n KERIS
▼z The full text of this material is provided by the Korea Education and Research Information Service (KERIS). |
980 | |
▼a 202002
▼f 2020 |
990 | |
▼a ***1008102 |
991 | |
▼a E-BOOK |