The state-of-the-art in meta-learning and hyper-parameter optimization

If you have a question about this talk, please contact George A Constantinides.

Many machine-learning specialists currently spend a great deal of time tuning hyper-parameters to extract the highest achievable accuracy and performance from their networks. As networks grow deeper, they take increasingly long to train, so the multiple runs needed to tune hyper-parameters waste a lot of highly qualified specialists' time. Meta-learning and hyper-parameter optimization are techniques commonly used to automate this tedious process. Common approaches range from analytical methods to learning-to-learn methods that determine the best hyper-parameter configuration for a given neural network.
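As background for the talk, one of the simplest hyper-parameter optimization baselines is random search: sample configurations at random and keep the one with the lowest validation loss. The sketch below is illustrative only; the `validation_loss` surrogate, the hyper-parameter names, and the search ranges are all hypothetical stand-ins (in practice each evaluation would be a full training run).

```python
import random

# Hypothetical surrogate for validation loss after training with a given
# configuration. In reality this would launch a full (slow) training run;
# here it is a cheap analytic stand-in so the sketch is runnable.
def validation_loss(learning_rate, batch_size):
    return (learning_rate - 0.01) ** 2 + 0.001 * abs(batch_size - 64)

def random_search(trials=50, seed=0):
    """Sample hyper-parameter configurations at random, keep the best."""
    rng = random.Random(seed)
    best_cfg, best_loss = None, float("inf")
    for _ in range(trials):
        cfg = {
            # Learning rates are usually sampled log-uniformly.
            "learning_rate": 10 ** rng.uniform(-4, -1),
            "batch_size": rng.choice([16, 32, 64, 128, 256]),
        }
        loss = validation_loss(**cfg)
        if loss < best_loss:
            best_cfg, best_loss = cfg, loss
    return best_cfg, best_loss

best_cfg, best_loss = random_search()
```

Each trial is independent, so this baseline parallelizes trivially; the more sophisticated methods covered in the talk aim to find good configurations in far fewer (expensive) training runs.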

This talk is part of the CAS Talks series.

