An exploration of regularisation and pruning techniques, hyperparameter optimisations and lightweight NAS for PolyLUT

If you have a question about this talk, please contact George A Constantinides.

Several strategies can enhance the performance of neural network models, including pruning, regularisation, and hyperparameter optimisation. This study focuses on improving the performance of PolyLUT in terms of latency and area utilisation while maintaining accuracy. The approach involves a series of steps: the model is first trained with an exponential regulariser to foster sparsity, followed by weight-based pruning to retain key features. Subsequently, individual neuron degrees are treated as distinct hyperparameters, and a search for optimal combinations is carried out using Optuna's covariance matrix adaptation evolution strategy (CMA-ES). The study also employs zero-cost proxies to evaluate their compatibility with this novel type of neural network.
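
The pipeline outlined above can be illustrated with a minimal, self-contained sketch. This is not the PolyLUT code: the exponential penalty form, the toy MLP and random data, the pruning amount, and the searched hyperparameters (hidden widths standing in for per-neuron polynomial degrees) are illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune
import optuna  # CmaEsSampler requires the "cmaes" package to be installed

# Toy stand-ins (not the PolyLUT code or dataset): random data and a small MLP.
X = torch.randn(256, 16)
Y = torch.randint(0, 2, (256,))

def exponential_penalty(model: nn.Module, alpha: float = 10.0) -> torch.Tensor:
    # One possible exponential sparsity penalty, 1 - exp(-alpha * |w|);
    # the exact regulariser used in the talk may differ.
    return sum((1.0 - torch.exp(-alpha * p.abs())).sum() for p in model.parameters())

def train_and_prune(model: nn.Module, lam: float = 1e-4, epochs: int = 20) -> float:
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(epochs):
        # Task loss plus the sparsity-promoting exponential regulariser.
        loss = loss_fn(model(X), Y) + lam * exponential_penalty(model)
        opt.zero_grad()
        loss.backward()
        opt.step()
    # Weight-based (L1-magnitude) pruning of each linear layer after training.
    for m in model.modules():
        if isinstance(m, nn.Linear):
            prune.l1_unstructured(m, name="weight", amount=0.5)
    # Toy evaluation on the training data, purely for illustration.
    return (model(X).argmax(dim=1) == Y).float().mean().item()

def objective(trial: optuna.Trial) -> float:
    # In the PolyLUT setting the searched quantities would be per-neuron
    # polynomial degrees; here two hidden widths act as placeholders.
    h1 = trial.suggest_int("h1", 4, 32)
    h2 = trial.suggest_int("h2", 4, 32)
    model = nn.Sequential(nn.Linear(16, h1), nn.ReLU(),
                          nn.Linear(h1, h2), nn.ReLU(),
                          nn.Linear(h2, 2))
    return train_and_prune(model)

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.CmaEsSampler())
study.optimize(objective, n_trials=20)
print(study.best_params, study.best_value)
```

In this sketch the regularisation, pruning, and CMA-ES search are composed exactly as the abstract orders them; swapping the placeholder MLP and hidden-width hyperparameters for a PolyLUT model and its per-neuron degrees would recover the structure of the approach described.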

This talk is part of the CAS Talks series.
