An exploration of regularisation and pruning techniques, hyperparameter optimisations and lightweight NAS for PolyLUT
If you have a question about this talk, please contact George A Constantinides.

Several strategies can enhance the performance of neural network models, including pruning, regularisation techniques, and the optimisation of model hyperparameters. This study focuses on improving the latency and area utilisation of PolyLUT while maintaining its accuracy. The approach proceeds in a series of steps: the model is first trained with an exponential regulariser to foster sparsity, followed by weight-based pruning to retain key features. Subsequently, individual neuron degrees are treated as distinct hyperparameters, and a search for optimal combinations is executed using Optuna's covariance matrix adaptation evolution strategy (CMA-ES). The study then expands by employing zero-cost proxies to evaluate their compatibility with this novel type of neural network.
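The abstract does not give the exact form of the exponential regulariser or the pruning criterion, but the two steps can be illustrated in PyTorch. The sketch below assumes a smooth L0-style penalty of the form 1 − exp(−|w|/β) and magnitude-based unstructured pruning; the toy model, β, the regularisation weight, and the 50% pruning fraction are illustrative choices, not details from the talk.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune


def exp_sparsity_penalty(model: nn.Module, beta: float = 0.1) -> torch.Tensor:
    # Smooth L0-style exponential penalty: 1 - exp(-|w|/beta). Weights near
    # zero pay an almost-linear cost while large weights saturate at 1, so
    # gradient pressure concentrates on driving small weights to zero.
    return sum((1.0 - torch.exp(-p.abs() / beta)).sum()
               for p in model.parameters())


# Toy stand-in for a PolyLUT-style network (the real model is LUT-based).
model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 2))
optimiser = torch.optim.Adam(model.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()
lam = 1e-4  # regularisation strength; illustrative value

x, y = torch.randn(256, 16), torch.randint(0, 2, (256,))
for _ in range(100):
    optimiser.zero_grad()
    loss = criterion(model(x), y) + lam * exp_sparsity_penalty(model)
    loss.backward()
    optimiser.step()

# Weight-based pruning: zero the smallest-magnitude 50% of weights in each
# layer, retaining the largest (presumed most important) connections.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.5)
        prune.remove(module, "weight")  # bake the zeroed weights in
```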
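Treating each neuron's polynomial degree as its own hyperparameter and searching with Optuna's CMA-ES sampler might look roughly like the following (Optuna's CmaEsSampler requires the cmaes package). The train_and_evaluate surrogate, the degree range 1–4, the neuron count, and the accuracy/LUT-cost weighting are all hypothetical stand-ins for the actual PolyLUT training flow.

```python
import optuna

NUM_NEURONS = 8  # hypothetical count of neurons whose degree is tuned


def train_and_evaluate(degrees):
    # Hypothetical stand-in for training a PolyLUT model with the given
    # per-neuron polynomial degrees. This toy surrogate just mimics the
    # trade-off: accuracy saturates with degree while LUT cost grows fast.
    acc = sum(1.0 - 0.5 ** d for d in degrees) / len(degrees)
    lut_cost = sum(6 ** d for d in degrees)
    return acc, lut_cost


def objective(trial: optuna.Trial) -> float:
    # One integer hyperparameter per neuron: its polynomial degree.
    degrees = [trial.suggest_int(f"degree_{i}", 1, 4)
               for i in range(NUM_NEURONS)]
    acc, lut_cost = train_and_evaluate(degrees)
    # Scalarise the accuracy/area trade-off into a single objective;
    # this weighting is one possible choice, not the talk's.
    return acc - 1e-5 * lut_cost


study = optuna.create_study(
    direction="maximize",
    sampler=optuna.samplers.CmaEsSampler(seed=0),  # CMA-ES sampler
)
study.optimize(objective, n_trials=200)
print(study.best_params)
```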
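Zero-cost proxies score candidate architectures from a single minibatch at initialisation, without training, so large search spaces can be ranked cheaply. Which proxies the study evaluates against PolyLUT-style networks is not stated; the sketch below uses the gradient norm, one common example, purely as an assumption.

```python
import torch
import torch.nn as nn


def grad_norm_proxy(model: nn.Module, x: torch.Tensor, y: torch.Tensor) -> float:
    # Score an untrained candidate by the norm of its loss gradient on a
    # single minibatch; higher scores are taken as a proxy for trainability.
    criterion = nn.CrossEntropyLoss()
    model.zero_grad()
    criterion(model(x), y).backward()
    total = 0.0
    for p in model.parameters():
        if p.grad is not None:
            total += p.grad.norm().item() ** 2
    return total ** 0.5


# Rank two toy candidate architectures by the proxy score.
x, y = torch.randn(64, 16), torch.randint(0, 2, (64,))
for hidden in (8, 32):
    candidate = nn.Sequential(nn.Linear(16, hidden), nn.ReLU(),
                              nn.Linear(hidden, 2))
    print(hidden, grad_norm_proxy(candidate, x, y))
```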
This talk is part of the CAS Talks series.