
Redundancy in Neural Network Hardware and A Case Study of Parameter Precision


If you have a question about this talk, please contact George A Constantinides.

Two levels of redundancy are widely found in modern neural networks: model-level and data-level. Because of the resulting high compute and memory requirements, hardware systems for neural networks are limited in their applicability to power-constrained environments. In this talk, these two levels of redundancy will be introduced. A case study in data-level redundancy will then be presented, examining parameter precision and its impact on model accuracy as well as on hardware implementation in FPGA-based digital systems.
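To make the idea of data-level redundancy concrete, the sketch below (not from the talk; an illustrative assumption) uniformly quantizes a weight vector to a reduced bit width, the kind of precision reduction that trades model accuracy against cheaper FPGA arithmetic and storage. The function name `quantize` and the symmetric fixed-point-style grid are illustrative choices, not the speaker's method.

```python
import numpy as np

def quantize(weights, bits):
    """Uniformly quantize weights onto a symmetric signed grid with 2^(bits-1)-1 positive levels."""
    levels = 2 ** (bits - 1) - 1           # e.g. 127 levels for 8-bit signed
    scale = np.max(np.abs(weights)) / levels
    q = np.round(weights / scale)          # integer codes stored in hardware
    return q * scale                       # dequantized approximation

# Measure how reconstruction error grows as precision shrinks.
rng = np.random.default_rng(0)
w = rng.standard_normal(1000).astype(np.float32)
for bits in (8, 4, 2):
    mse = np.mean((w - quantize(w, bits)) ** 2)
    print(f"{bits}-bit MSE: {mse:.6f}")
```

If a model's accuracy is insensitive to the added quantization error, the discarded precision was redundant and the hardware can use narrower datapaths.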

This talk is part of the CAS Talks series.

