Department of Mathematics

Applied Mathematics

  •  René Vidal, Johns Hopkins University
  •  On the Regularization Properties of Structured Dropout (Zoom link: https://sites.google.com/view/minds-seminar/home)
  •  10/01/2020
  •  2:30 PM - 3:30 PM
  •  Online (virtual meeting)

Dropout and its extensions (e.g., DropBlock and DropConnect) are popular heuristics for training neural networks that have been shown to improve generalization performance in practice. However, a theoretical understanding of their optimization and regularization properties remains elusive. This talk will present a theoretical analysis of several dropout-like regularization strategies, all of which can be understood as stochastic gradient descent methods for minimizing a certain regularized loss. In the case of single hidden-layer linear networks, we will show that Dropout and DropBlock induce nuclear norm and spectral k-support norm regularization, respectively, which promote solutions that are low-rank and balanced (i.e., have factors with equal norm). We will also show that the global minimizers for Dropout and DropBlock can be computed in closed form, and that DropConnect is equivalent to Dropout. We will then show that some of these results can be extended to a general class of Dropout strategies, and, under some assumptions, to deep non-linear networks when Dropout is applied to the last layer.
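For context, here is a minimal sketch of the kind of identity behind such results, under simplifying assumptions that are ours rather than the speaker's (squared loss, a single hidden-layer linear network \(x \mapsto U\,\mathrm{diag}(z)\,V^{\top}x\) with \(d\) hidden units, i.i.d. \(\mathrm{Bernoulli}(\theta)\) dropout masks \(z\), and the usual \(1/\theta\) rescaling at training time):

\[
\mathbb{E}_{z}\Big\| y - \tfrac{1}{\theta}\,U\,\mathrm{diag}(z)\,V^{\top}x \Big\|_2^2
= \big\| y - UV^{\top}x \big\|_2^2
+ \frac{1-\theta}{\theta}\sum_{i=1}^{d} \|u_i\|_2^2\,\big(v_i^{\top}x\big)^2 ,
\]

so minimizing the dropout loss in expectation is ordinary least squares plus a data-dependent penalty. Averaging over inputs with \(\mathbb{E}[xx^{\top}] = I\) turns the penalty into \(\sum_{i}\|u_i\|_2^2\,\|v_i\|_2^2\), and minimizing that quantity over all factorizations \(UV^{\top} = A\) (for \(d \ge \operatorname{rank}(A)\)) yields \(\tfrac{1}{d}\|A\|_*^2\), a scaled squared nuclear norm. This is the sense in which Dropout promotes low-rank solutions with balanced factors.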

 
