Department of Mathematics

Applied Mathematics

  •  Fanny Yang, ETH Zurich
  •  Fast rates for noisy interpolation require rethinking the effects of inductive bias
  •  06/09/2022
  •  4:30 AM - 4:30 AM
  •  Online (virtual meeting)
  •  Olga Turanova (turanova@msu.edu)

Modern machine learning has uncovered an intriguing observation: large overparameterized models can achieve good generalization performance despite interpolating noisy training data. In this talk, we study high-dimensional linear models and show how interpolators can achieve fast statistical rates when their structural bias is moderate. More concretely, while minimum-l2-norm interpolators have too weak a structural bias to recover the signal in high dimensions, minimum-l1-norm interpolators with a strong sparsity bias are much more sensitive to noise. In fact, we show that even though they are asymptotically consistent, minimum-l1-norm interpolators converge at a logarithmic rate, much slower than the O(1/n) rate of regularized estimators. In contrast, minimum-lp-norm interpolators with 1 < p < 2 can trade off these two competing effects to yield polynomial rates close to O(1/n).
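As a concrete illustration of the objects discussed in the abstract (a sketch for orientation, not part of the talk materials), the snippet below fits two interpolators on a synthetic sparse high-dimensional regression problem: the minimum-l2-norm solution via the pseudoinverse, and the minimum-l1-norm solution (basis pursuit) via a standard linear-programming reformulation. All dimensions, noise levels, and variable names here are hypothetical choices for the demo.

```python
import numpy as np
from scipy.optimize import linprog

# Hypothetical toy problem: n noisy samples of a sparse signal in d >> n dimensions.
rng = np.random.default_rng(0)
n, d, sigma = 40, 200, 0.1
w_star = np.zeros(d)
w_star[:5] = 1.0                               # sparse ground-truth signal
X = rng.standard_normal((n, d))
y = X @ w_star + sigma * rng.standard_normal(n)

# Minimum-l2-norm interpolator: the pseudoinverse solution of X w = y.
w_l2 = np.linalg.pinv(X) @ y

# Minimum-l1-norm interpolator (basis pursuit):
#   min_w ||w||_1  s.t.  X w = y,
# rewritten as an LP over z = [w; u] with |w_i| <= u_i:
#   min 1^T u  s.t.  X w = y,  w - u <= 0,  -w - u <= 0.
c = np.concatenate([np.zeros(d), np.ones(d)])
A_eq = np.hstack([X, np.zeros((n, d))])
I = np.eye(d)
A_ub = np.block([[I, -I], [-I, -I]])
b_ub = np.zeros(2 * d)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * d + [(0, None)] * d)
w_l1 = res.x[:d]

# Both solutions interpolate the training data (near-zero residual),
# but their estimation errors against the true signal differ.
for name, w in (("l2", w_l2), ("l1", w_l1)):
    print(f"min-{name}-norm: residual {np.linalg.norm(X @ w - y):.1e}, "
          f"error ||w - w*|| = {np.linalg.norm(w - w_star):.3f}")
```

On toy instances like this, one could also swap in a minimum-lp-norm interpolator for some 1 < p < 2 (e.g., via a general convex solver) to explore the tradeoff between structural bias and noise sensitivity that the abstract describes.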

