LUT-based BNN: Forward Propagation with Arbitrary Binary Operations

If you have a question about this talk, please contact George A Constantinides.

Binarisation is the quantisation of parameters into just two values, typically {-1, 1} with a scaling factor. Although binary quantisation incurs a larger quantisation error, it substantially simplifies the arithmetic operations of hardware CNN inference. For inference, binary networks map multiplications and accumulations onto XNOR operations and popcounts, which, on an FPGA, are significantly more area-efficient than fixed-point operations. However, can we extract even more hardware efficiency from the LUTs in FPGAs? An XNOR operation is only one of the boolean functions a LUT's truth table can encode; a LUT can perform any boolean operation on its inputs. To increase the representational power of BNNs, we train the network to perform inference with an arbitrary boolean operation per input activation, including but not limited to XNOR. Using this method, we retrained CNV, the BNN architecture presented in FINN, and improved its accuracy by 2.3 pp on the CIFAR-10 dataset.
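To make the contrast concrete, below is a minimal NumPy sketch of the standard XNOR-popcount dot product alongside a generalised version in which each input activation applies its own 2-input boolean function drawn from a per-input truth table. The function names (`xnor_popcount_dot`, `lut_dot`) and the {0, 1} bit encoding of {-1, +1} values are illustrative assumptions, not the implementation described in the talk.

```python
import numpy as np

def xnor_popcount_dot(x, w):
    """Standard BNN dot product: XNOR, then popcount.
    x and w hold {-1, +1} values encoded as {0, 1} bits."""
    agree = ~(x ^ w) & 1                    # XNOR: 1 where the bits match
    return 2 * int(agree.sum()) - len(x)    # popcount -> signed sum

def lut_dot(x, w, truth_tables):
    """Hypothetical generalisation: activation i applies its own
    2-input boolean function, stored as a 4-entry truth table
    indexed by the bit pair (x_i, w_i)."""
    idx = (x << 1) | w                      # 2-bit index into each table
    bits = truth_tables[np.arange(len(x)), idx]
    return 2 * int(bits.sum()) - len(x)

rng = np.random.default_rng(0)
n = 8
x = rng.integers(0, 2, n)
w = rng.integers(0, 2, n)

# XNOR is one point in this function space: output 1 iff x_i == w_i,
# i.e. the truth table [1, 0, 0, 1] repeated for every input.
xnor_tables = np.tile([1, 0, 0, 1], (n, 1))
assert xnor_popcount_dot(x, w) == lut_dot(x, w, xnor_tables)
```

With per-input tables free to differ from `[1, 0, 0, 1]`, each LUT can realise any of the sixteen 2-input boolean functions, which is the extra representational freedom the training method exploits.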

This talk is part of the CAS Talks series.
