Bohrer Featured Speakers

Bohrer Lecture: Andrew R. Barron, Yale University

Andrew R. Barron

BIO: Andrew R. Barron has been a Professor of Statistics and Data Science at Yale University since 1992. Prior to that he was a faculty member in Statistics and in Electrical and Computer Engineering at the University of Illinois at Urbana-Champaign. He received his MS and PhD degrees in Electrical Engineering from Stanford University in 1985 under the direction of Tom Cover, and a Bachelor's degree in the fields of Mathematical Science and Electrical Engineering from Rice University in 1981. He has served as Secretary of the Board of Governors of the IEEE Information Theory Society and for several terms as an elected member of that Board. He has been an associate editor of the IEEE Transactions on Information Theory and the Annals of Statistics. He is known for the generalization of the AEP to continuous-valued ergodic processes, for an information-theoretic Central Limit Theorem, for the index of resolvability and the associated characterization of the performance of Minimum Description Length estimators, for characterization of the concentration of posteriors at parameters in the information support of the prior, for information-theoretic determination of minimax rates of function estimation, for an early unifying view of statistical learning networks, for early approximation and estimation bounds for artificial neural networks and recent extensions to deep learning, for greedy algorithms for training neural networks, for information-theoretic aggregation of least squares regressions, and for the formulation and proof of capacity-achieving sparse regression codes for Gaussian noise communication channels.

Talk Title: Information Theory in Statistical Learning: Foundations and a Modern Perspective

Talk Abstract: Information theory builds on and illuminates the foundational statistical principles of Bayes, Laplace, Gauss, and Fisher concerning likelihood and penalized likelihood. In particular, it can be shown in broad generality that when the penalty satisfies an information-theoretic property, there is a natural one-sided empirical process control. Namely, the minimum difference between a log likelihood ratio plus penalty and a Bhattacharyya-Hellinger loss has non-negative expectation and stochastically exceeds a random variable tightly concentrated around zero. Valid penalties include L_1 penalties on parameters (in linear models and in deep learning) and traditional penalties based on the number of parameters, as arise in minimum description length (MDL) and Bayesian estimators. This result has previously been developed in work with T. Cover and with J. Li, C. Huang, G. Cheang, and X. Luo (and applied by various scholars) to show that the statistical risk of MDL estimators is controlled by an index of resolvability, expressing the optimal tradeoff between Kullback approximation error and complexity relative to sample size.
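To make the penalized-likelihood selection concrete, here is a minimal sketch (the setup, data, and function names are illustrative choices, not from the talk): among polynomial fits of increasing degree, choose the one minimizing a Gaussian negative log-likelihood plus a parameter-count penalty of (k/2) log n, in the spirit of the two-part MDL description length whose optimal tradeoff the index of resolvability expresses.

```python
import numpy as np

# Illustrative data: the true regression function has polynomial degree 2.
rng = np.random.default_rng(0)
n = 200
x = rng.uniform(-1.0, 1.0, size=n)
y = 1.0 + 2.0 * x + 3.0 * x**2 + rng.normal(0.0, 0.5, size=n)

def penalized_score(degree):
    """Profile Gaussian -log likelihood (up to a constant) plus an
    MDL-style (k/2) log n penalty on the number of fitted parameters."""
    coefs = np.polyfit(x, y, degree)
    rss = np.sum((y - np.polyval(coefs, x)) ** 2)
    k = degree + 1                        # number of fitted parameters
    neg_loglik = 0.5 * n * np.log(rss / n)
    penalty = 0.5 * k * np.log(n)
    return neg_loglik + penalty

# Complexity and fit trade off: too few parameters leaves approximation
# error, too many pays an unrewarded description-length penalty.
scores = {d: penalized_score(d) for d in range(9)}
best = min(scores, key=scores.get)
```

The dictionary of scores exhibits the resolvability-style tradeoff: the underfit degree-0 model and the overfit degree-8 model both score worse than a model near the true complexity.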

Here we report another use of the same one-sided empirical process concentration bound: confidence bounds on the loss of statistical machine learning estimators. For every estimator, and every information-theoretically valid penalty, with probability at least 1 - delta, the Bhattacharyya-Hellinger loss is not more than the associated log likelihood ratio plus penalty plus the 1 - delta quantile of a random variable tightly concentrated near 0. In particular, this result can be used to illuminate the phenomenon of "benign overfitting." Namely, if the log likelihood at an estimate is higher than the log likelihood at a target, then the loss is not more than the penalty plus the indicated small-valued quantile. Consequently, if the L_1 norm of the fitted parameters is small compared to the square root of the sample size divided by the log of the number of parameters, then there is high confidence in the associated generalization accuracy of the fit.
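The closing criterion can be read as a simple numeric check. A minimal sketch follows (the function name and the exact threshold form sqrt(n / log p) are my paraphrase of the abstract's condition, not a formula from the talk): compare the L_1 norm of the fitted parameter vector against the scale sqrt(sample size / log(number of parameters)).

```python
import numpy as np

def high_confidence_fit(beta, n):
    """Paraphrased check of the abstract's condition (illustrative only):
    the L1 norm of the fitted parameters should be small relative to
    sqrt(n / log p), where p is the number of parameters."""
    p = len(beta)
    l1 = float(np.sum(np.abs(beta)))
    scale = float(np.sqrt(n / np.log(p)))
    return l1, scale, l1 < scale

# A sparse fit with small total coefficient mass passes the check...
l1_a, scale_a, ok_a = high_confidence_fit(np.full(100, 0.01), n=10_000)
# ...while a dense fit with large coefficients does not.
l1_b, scale_b, ok_b = high_confidence_fit(np.full(100, 5.0), n=10_000)
```

The point of the comparison is that the bound is dimension-friendly: p enters only through log p, so even with many parameters a fit of small L_1 norm retains high-confidence generalization.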


Wijsman Lecture: Jelena Bradic, University of California San Diego


Jelena Bradic

BIO: Jelena Bradic received her Ph.D. in Operations Research and Financial Engineering from Princeton University in spring 2011, with a specialization in statistics and applied probability, under the direction of Jianqing Fan. Her research spans high-dimensional statistics, stochastic optimization, asymptotic theory, robust statistics, functional genomics, and biostatistics. She received two teaching awards for teaching assistants at Princeton.

Talk Title: Exploring new venues in double robustness for treatment effects in dynamic settings

Talk Abstract:

Past Bohrer Workshop Keynote Speakers

  • 1994 Mark Schervish, Carnegie Mellon University
  • 1995 Dan Naiman, Johns Hopkins University
  • 1997 Ross Leadbetter, University of North Carolina
  • 1998 Dennis Karney, University of Kansas
  • 1999 Erich Lehmann, University of California Berkeley
  • 2000 David Bartholomew, London School of Economics
  • 2001 Gary Koch, University of North Carolina
  • 2002 Robert Serfling, University of Texas Arlington
  • 2003 Peter Bickel, University of California Berkeley
  • 2004 Peter Imrey, Cleveland Clinic Foundation
  • 2005 John Marden, University of Illinois
  • 2006 Raymond Carroll, Texas A&M University
  • 2007 Mary Ellen Bock, Purdue University
  • 2008 Ker-Chau Li, UCLA
  • 2010 Zhiliang Ying, Columbia University
  • 2011 Minge Xie, Rutgers University
  • 2012 Xuming He, University of Michigan
  • 2013 Yuhong Yang, University of Minnesota
  • Sky Andrecheck, Cleveland Indians
  • 2014 Hua-Hua Chang, University of Illinois at Urbana-Champaign
  • 2015 Cun-Hui Zhang, Rutgers University
  • 2016 Lawrence Brown, University of Pennsylvania
  • 2017 Edward George, University of Pennsylvania
  • 2018 Regina Liu, Rutgers University
  • 2019 - 2020 Cancelled
  • 2021 Liza Levina, University of Michigan
  • 2022 Vijay Nair, University of Michigan

Past Wijsman Lecturers

  • 2014 Zhiliang Ying, Columbia University
  • 2015 John Lafferty, University of Chicago
  • 2016 Xihong Lin, Harvard University
  • 2017 Nancy Reid, University of Toronto
  • 2018 Michael Kosorok, University of North Carolina at Chapel Hill
  • 2019 - 2020 Cancelled
  • 2021 Dean Foster, Amazon
  • Dawn Woodard, Uber