CS Friday Seminar Series - "Radial Basis Function Networks Unshackled"

Friday, 18 November 2016 - 11:00am - 12:00pm
Dr. Mikhail Belkin
SL 012

The Department of Computer & Information Science hosts Dr. Mikhail Belkin for our Computer Science Friday Seminar series.  Dr. Belkin will present a talk on "Radial Basis Function Networks Unshackled." The seminar will be held Friday, November 18th at 11am in Room SL 012.

Abstract: Radial Basis Function (RBF) networks are a classical family of algorithms for supervised learning. The most popular approach to training RBF networks has relied on kernel methods, using regularization based on a norm in a Reproducing Kernel Hilbert Space (RKHS). This approach leads to algorithms such as Support Vector Machines, a principled and empirically successful framework for machine learning.

In this talk I will revisit some of the older approaches to training RBF networks from a modern perspective. I will discuss two common regularization procedures: one based on the squared norm of the coefficients in the network, and another using centers obtained by k-means clustering. It turns out that both of these methods can be recast in terms of certain data-dependent kernels. We provide a theoretical analysis of these methods as well as a number of experimental results, pointing out very competitive empirical performance as well as certain advantages over standard kernel methods in terms of both flexibility (incorporation of unlabeled data) and computational complexity.  In this context I will also discuss ideas for scaling these methods to large modern datasets.
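The second procedure mentioned above can be sketched in a few lines: choose centers by k-means, build Gaussian RBF features around them, and fit the output weights by ridge regression (the squared-norm penalty on the coefficients). This is an illustrative sketch on toy data, not the specific formulation from the talk; the bandwidth `gamma`, the number of centers, and the ridge penalty `lam` are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def kmeans(X, k, iters=50):
    """Plain Lloyd's algorithm: returns k cluster centers."""
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        # Assign each point to its nearest center
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        labels = d.argmin(1)
        # Recompute each center as the mean of its cluster
        for j in range(k):
            pts = X[labels == j]
            if len(pts):
                centers[j] = pts.mean(0)
    return centers

def rbf_features(X, centers, gamma=1.0):
    """Gaussian RBF feature map: exp(-gamma * ||x - c||^2)."""
    d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d)

# Toy 1-D regression problem: noisy sine
X = rng.uniform(-3, 3, size=(200, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(200)

centers = kmeans(X, k=20)
Phi = rbf_features(X, centers)

# Ridge regression on the RBF coefficients (squared-norm regularization)
lam = 1e-3
w = np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ y)

pred = Phi @ w
mse = np.mean((pred - y) ** 2)
```

Note that once the centers are fixed, training reduces to a linear least-squares problem, which is part of what makes such methods attractive computationally compared with solving a full kernel system over all data points.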

Finally, our results shed light on some impressive recent successes of using soft k-means features for image recognition and other tasks.

Mikhail Belkin is an Associate Professor in the Departments of Computer Science and Engineering and of Statistics at the Ohio State University.  He received a Ph.D. in Mathematics from the University of Chicago in 2003.  His research focuses on understanding structure in data, the principles of recovering such structure, and its computational, mathematical and statistical properties. His work includes algorithms such as Laplacian Eigenmaps and Manifold Regularization, which use ideas from classical differential geometry to analyze non-linear high-dimensional data.  He is a recipient of an NSF CAREER Award and has served on the editorial boards of the Journal of Machine Learning Research and IEEE Transactions on Pattern Analysis and Machine Intelligence (PAMI).