Imperial College London > Talks@ee.imperial > CAS Talks > Now that I can predict, I can adapt: Towards CNN adaptation of Edge GPUs through modelling of the training process
Now that I can predict, I can adapt: Towards CNN adaptation on Edge GPUs through modelling of the training process
If you have a question about this talk, please contact George A Constantinides.

With an increase in the memory and processing capabilities of today's edge devices, there is a push towards increased edge intelligence. For Convolutional Neural Networks (CNNs), pruning and retraining networks on the edge, adapting their structure to the data being processed and resulting in systems with lower memory, latency and power consumption, is a desirable capability towards achieving edge intelligence. However, the limited compute resources and memory budget of edge devices restrict the model size that can be trained under a certain time budget. This work describes an automated methodology for developing accurate models that predict CNN training memory consumption and latency given a target device, network and pruning level. With PyTorch as the framework and the NVIDIA Jetson TX2 as the target device, the developed models predict training memory consumption and latency with 91.0% and 88.8% accuracy respectively, across a wide range of networks and pruning levels. Additionally, the work develops a classification model that predicts with 93.4% accuracy whether a given network and pruning level can be trained on the target device within the available memory.

This talk is part of the CAS Talks series.
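The two-stage idea in the abstract — fit a predictor of training resource usage from device measurements, then classify whether a given configuration fits on the device — can be sketched as follows. All numbers, the linear model form, and the 8 GB budget (the TX2's shared memory capacity) are illustrative assumptions for the sketch, not data or model details from the talk; the actual models are built from real profiling runs on the target device.

```python
def fit_linear(xs, ys):
    """Ordinary least-squares fit of y = slope * x + intercept."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical profiling data: pruning ratio -> peak training memory (GB),
# as might be measured on the target device for one network.
pruning = [0.0, 0.2, 0.4, 0.6, 0.8]
memory_gb = [7.5, 6.1, 4.8, 3.4, 2.0]
slope, intercept = fit_linear(pruning, memory_gb)

def predict_memory(p):
    """Predicted peak training memory (GB) at pruning ratio p."""
    return slope * p + intercept

def trainable_on_device(p, budget_gb=8.0):
    """Classify whether training at pruning ratio p fits the memory budget."""
    return predict_memory(p) < budget_gb
```

Under these placeholder measurements, `predict_memory(0.5)` gives roughly 4.08 GB and `trainable_on_device(0.5)` returns `True`; the talk's methodology builds such predictors automatically per device and network, rather than assuming a single linear fit.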