Imperial College London · CAS Talks
The state-of-the-art in meta-learning and hyper-parameter optimization
If you have a question about this talk, please contact George A Constantinides.

Many machine-learning specialists currently spend a great deal of time tuning hyper-parameters to extract the highest achievable accuracy and performance from their networks. As networks continue to get deeper, they take increasingly long to train, so performing multiple training runs just to tune hyper-parameters wastes a lot of a highly qualified specialist's time. Meta-learning and hyper-parameter optimization are techniques commonly used to automate this tedious process. Common approaches include analytical methods and learning-to-learn methods for determining the best hyper-parameter configuration for a given neural network.

This talk is part of the CAS Talks series.
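As a rough illustration of the kind of loop such techniques automate, the following is a minimal sketch of random-search hyper-parameter optimization. It is not from the talk: the toy `train` function (gradient descent on a quadratic) stands in for training a real network, and the search space of learning rate and step count is an assumed example.

```python
import random

def train(lr, steps):
    # Toy stand-in for training a network: gradient descent on
    # f(x) = x^2 starting from x = 5. The final loss plays the role
    # of validation error for a given hyper-parameter configuration.
    x = 5.0
    for _ in range(steps):
        x -= lr * 2 * x  # gradient of x^2 is 2x
    return x * x

random.seed(0)  # for reproducibility of the sketch
best_loss, best_cfg = float("inf"), None
for _ in range(20):
    # Sample a configuration: log-uniform learning rate, random step budget.
    cfg = {"lr": 10 ** random.uniform(-3, 0), "steps": random.randint(5, 50)}
    loss = train(cfg["lr"], cfg["steps"])
    if loss < best_loss:
        best_loss, best_cfg = loss, cfg

print("best config:", best_cfg, "loss:", best_loss)
```

More sophisticated methods discussed under the meta-learning umbrella replace the random sampling step with a model (analytical or learned) that proposes promising configurations based on previous trials.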