Regularizers for Structured Sparsity

If you have a question about this talk, please contact Lauren E Noto.

We study the problem of learning a sparse linear regression vector under additional conditions on its sparsity pattern. This problem is relevant in machine learning, signal processing and statistics. We present a framework for structured sparsity which extends the well-known Lasso and Group Lasso methods by incorporating additional constraints on the variables as part of a convex optimization problem. This provides a means of favouring prescribed sparsity patterns, such as orderings, contiguous regions and overlapping groups, among others. We establish some basic properties of these penalty functions, discuss examples where they can be computed explicitly, and present a convergent optimization algorithm for solving the associated regularization problem. Along the way, we comment on extensions of the framework to the matrix case, which is relevant in the context of multitask learning, and discuss applications of the presented methods to affective computing, computer vision and user modelling.
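
For readers less familiar with the penalties mentioned above, a minimal sketch of the standard Lasso and Group Lasso formulations that the talk builds on is given below. The notation (data matrix X, response y, coefficient vector w, regularization parameter lambda, and a prescribed collection of groups G) is chosen here for illustration and is not taken from the talk itself.

    Lasso (l1-penalized least squares):
        \min_{w \in \mathbb{R}^d} \; \tfrac{1}{2}\|y - Xw\|_2^2 + \lambda \|w\|_1

    Group Lasso (sum of l2 norms over a prescribed set of variable groups):
        \min_{w \in \mathbb{R}^d} \; \tfrac{1}{2}\|y - Xw\|_2^2 + \lambda \sum_{g \in \mathcal{G}} \|w_g\|_2

The structured-sparsity penalties described in the abstract generalize the group term, for example by allowing the groups to overlap or by constraining which sparsity patterns are favoured, while keeping the overall optimization problem convex.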

BIOGRAPHY: Massimiliano Pontil is a Professor of Computational Statistics and Machine Learning in the Department of Computer Science at University College London. His research interests are in the area of machine learning, with a focus on regularization methods, convex optimization and statistical estimation. He received the equivalent of an MSc and a PhD in Physics from the University of Genova in 1994 and 1999, respectively.

This talk is part of the COMMSP Seminar series.
