Speaker: Evzenie Coupkova, Purdue
Title: Unique generalization properties of a dense set of classifiers based on one-dimensional random projections
Date: 10/17/2022
Time: 1:00 PM - 2:00 PM
Place: C117 Wells Hall (Virtual Meeting Link)
Contact: Craig Gross (grosscra@msu.edu)

The generalization error of a classifier is related to the complexity of the set of functions among which the classifier is chosen. We study a family of low-complexity classifiers consisting of thresholding a random one-dimensional feature. The feature is obtained by projecting the data onto a random line after embedding it into a higher-dimensional space parametrized by monomials of order up to k. More specifically, the extended data is projected n times, and the best classifier among those n, based on its performance on the training data, is chosen. We show that this type of classifier is extremely flexible: it is likely to approximate, to arbitrary precision, any continuous function on a compact set, as well as any Boolean function on a compact set that splits the support into measurable subsets. In particular, given full knowledge of the class conditional densities, the error of these low-complexity classifiers converges to the optimal (Bayes) error as k and n go to infinity. On the other hand, if only a training dataset is given, we show that the classifiers will perfectly classify all the training points as k and n go to infinity. We also bound the generalization error of our random classifiers. In general, our bounds are better than those for any classifier with VC dimension greater than O(ln n). In particular, our bounds imply that, unless the number of projections n is extremely large, there is a significant advantageous gap between the generalization error of the random projection approach and that of a linear classifier in the extended space. Asymptotically, as the number of samples approaches infinity, the gap persists for any such n. Thus, there is a potentially large gain in generalization properties from selecting parameters at random rather than by optimization.
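The procedure described in the abstract can be sketched in code. The following is an illustrative Python implementation, not the speaker's actual code: function names, the choice of a Gaussian projection direction, and the exhaustive threshold search are all assumptions made here for concreteness. It embeds the data via monomials up to degree k, draws n random one-dimensional projections, thresholds each, and keeps the projection that performs best on the training set.

```python
import numpy as np
from itertools import combinations_with_replacement


def monomial_embed(X, k):
    """Map each row of X to all monomials of its coordinates up to degree k."""
    n_samples, d = X.shape
    feats = [np.ones(n_samples)]  # degree-0 term
    for degree in range(1, k + 1):
        for idx in combinations_with_replacement(range(d), degree):
            feats.append(np.prod(X[:, idx], axis=1))
    return np.column_stack(feats)


def fit_random_projection_classifier(X, y, k=2, n=100, rng=None):
    """Threshold n random 1-D projections of the embedded data; keep the best.

    X : (m, d) array of samples; y : (m,) array of 0/1 labels.
    Returns a predict function and its training accuracy.
    """
    rng = np.random.default_rng(rng)
    Z = monomial_embed(X, k)
    best = None
    for _ in range(n):
        w = rng.standard_normal(Z.shape[1])  # random projection direction
        s = Z @ w                            # the 1-D feature
        s_sorted = np.sort(s)
        # try thresholds midway between consecutive projected values
        for t in (s_sorted[:-1] + s_sorted[1:]) / 2:
            for sign in (1, -1):             # either side may be class 1
                acc = np.mean((sign * (s - t) > 0) == y)
                if best is None or acc > best[0]:
                    best = (acc, w, t, sign)
    acc, w, t, sign = best

    def predict(X_new):
        return sign * (monomial_embed(X_new, k) @ w - t) > 0

    return predict, acc
```

As a usage sketch, two well-separated Gaussian clusters with k = 1 (a plain linear projection) are perfectly split by most random directions once the optimal threshold is chosen, so even a modest n suffices on training data; the point of the talk is that the generalization error of this best-of-n rule is controlled far more tightly than that of a fully optimized linear classifier in the extended space.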
A preprint of this work can be found here: https://arxiv.org/abs/2108.06339

This will be a hybrid seminar and take place in C117 Wells Hall and via Zoom at https://msu.zoom.us/j/99426648081?pwd=ZEljM3BPUXg2MjVUMVM5TnlzK2NQZz09 .

Department of Mathematics

Michigan State University

619 Red Cedar Road

C212 Wells Hall

East Lansing, MI 48824

Phone: (517) 353-0844

Fax: (517) 432-1562
