Title: Active Learning 2.0: Being Intentionally Inclusive

Date: 11/27/2018

Time: 1:30 PM - 3:00 PM

Place: 252 EH

Active learning has many documented benefits for both students and instructors. Moreover, there is increasing evidence that it disproportionately benefits women, students of color, and students who were previously denied the same learning opportunities as others. However, the empirical evidence for this disproportionate benefit doesn't explain why it happens, nor does it guarantee that all students will benefit from active learning. In fact, my own experience with active learning is that it is difficult to do well, and it can sometimes have detrimental effects on students if we're not careful. So we should aim not just for active learning, but for learning that is both active and inclusive. We'll discuss some principles and practical strategies for making active learning more inclusive.

Title: Improving L^1 Poincar\'e inequality on Hamming cube

Date: 11/28/2018

Time: 4:10 PM - 5:00 PM

Place: C517 Wells Hall

The L^1 Poincar\'e inequality on the hypercube is related to many interesting questions in random graph theory (for example, the Margulis graph connectivity theorem). The sharp constant is unknown, but I will show how to improve the previously known constant \pi/2, obtained by Ben Efraim and Lust--Piquard using non-commutative harmonic analysis. Our approach is probabilistic and, luckily, commutative. For Gaussian space the sharp constant is known to be \sqrt{\pi/2}; the short proof belongs to Maurey--Pisier.
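For concreteness, the inequality in question can be stated as follows (a sketch under one common normalization; conventions for the discrete gradient vary by constant factors):

```latex
% L^1 Poincare inequality on the Hamming cube \{-1,1\}^n.
% Here \sigma_i x flips the i-th coordinate of x, and the discrete
% partial derivative is D_i f(x) = (f(x) - f(\sigma_i x))/2.
\mathbb{E}\,\bigl|f - \mathbb{E} f\bigr|
  \;\le\; C \,\mathbb{E}\Bigl(\sum_{i=1}^{n} |D_i f|^2\Bigr)^{1/2},
\qquad f\colon \{-1,1\}^n \to \mathbb{R}.
```

The talk concerns the best such constant C.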

Reproducing kernel Hilbert spaces (RKHSs) are Hilbert spaces of functions on which point evaluation functionals are continuous. Thanks to the existence of an inner product, RKHSs are well-understood in functional analysis. Successful and important machine learning methods based on RKHSs include support vector machines, regularization networks and kernel-based approximation.
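The defining property can be made concrete: if K is the reproducing kernel of an RKHS \mathcal{H}, the following standard identity (stated here for illustration) shows why point evaluation is continuous:

```latex
% Reproducing property: for every f in \mathcal{H} and every point x,
f(x) = \langle f,\, K(\cdot, x)\rangle_{\mathcal H},
% hence, by Cauchy--Schwarz, point evaluation is bounded:
|f(x)| \le \|f\|_{\mathcal H}\,\sqrt{K(x,x)}.
```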
In the past decade, there has been growing interest in constructing reproducing kernel Banach spaces (RKBSs) for applied and theoretical purposes, for instance sparse approximation. Recently, we proposed a generic definition of RKBS and a framework for constructing RKBSs that unifies existing constructions in the literature and leads to new RKBSs. As a by-product, the space C([0,1]) of all continuous functions on the interval [0,1] is an RKBS.
Motivated by sparse multi-task learning, we constructed a class of vector-valued RKBSs with the l1 norm based on multi-task admissible kernels. The relaxed linear representer theorem holds for regularization networks in the obtained spaces if and only if the Lebesgue constant of the kernels is uniformly bounded. We give a class of translation-invariant kernels of limited smoothness that are admissible for the construction. Numerical experiments demonstrate the advantages of the proposed construction and regularization models.
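To illustrate the kind of result involved (a generic sketch, not the paper's exact statement): a regularization network minimizes a data-fit term plus a norm penalty, and a relaxed representer theorem asserts that some minimizer lies in the closed span of the kernel sections at the data sites.

```latex
% Regularization network: given data (x_j, y_j), j = 1, ..., m,
% a loss L, and a regularization parameter \lambda > 0, solve
\min_{f \in \mathcal{B}} \;
  \sum_{j=1}^{m} L\bigl(f(x_j), y_j\bigr) + \lambda \|f\|_{\mathcal{B}}.
% A (relaxed) linear representer theorem asserts the existence of a
% minimizer in the closed linear span of the kernel sections K(\cdot, x_j):
f^{\ast} \in \overline{\operatorname{span}}\,\{K(\cdot, x_j) : 1 \le j \le m\}.
```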
This talk is based on two joint papers with Prof. Guohui Song (Clarkson University), Haizhang Zhang (Sun Yat-sen University), and Jun Zhang (University of Michigan).