Early exit techniques, in which the inference of convolutional neural networks (CNNs) is terminated early via auxiliary classifiers (called branches) to reduce data-processing energy for easy inputs, have been actively researched. However, conventional early exit works suffer from large branches, whose memory access energy is significant compared with that of the main network. In this article, we propose a low-cost early exit network (LoCoExNet), which significantly improves energy efficiency by reducing the number of parameters used in inference through efficient branch structures. To further reduce the energy consumption of the branches, based on the observation that the classification difficulty of an image can be predicted from the magnitudes of its input activations, we also propose hardware-friendly dynamic branch pruning (DBP). Given a network and a target dataset, an energy-efficient early exit network can be constructed through the branch structure determination algorithm and the DBP algorithm. Finally, we develop several architectural support modules to improve the energy efficiency of the proposed LoCoExNet with DBP. Experimental results show that LoCoExNet uses, on average, only 28.7% of the total network parameters on the Tiny-ImageNet dataset with VGG-16 without accuracy loss. A CNN accelerator implementing LoCoExNet with DBP was implemented in a 65nm CMOS process. The implementation results show that the LoCoExNet accelerator achieves up to 91% energy savings and up to 4.46× speedup without accuracy loss on the CIFAR-10 dataset with VGG-16 compared to state-of-the-art CNN accelerators.
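The early exit idea described above can be sketched as a simple inference loop: branch classifiers are evaluated at intermediate layers, and inference stops once a branch is sufficiently confident. The sketch below also gates each branch on the mean input-activation magnitude, loosely mirroring the DBP observation that activation magnitudes predict classification difficulty. All function names, thresholds, and the direction of the magnitude gate are illustrative assumptions, not the paper's exact LoCoExNet/DBP formulation.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(logits - logits.max())
    return e / e.sum()

def early_exit_infer(x, branches, final_classifier,
                     conf_thresh=0.9, act_thresh=0.5):
    """Illustrative early-exit inference loop.

    `branches` is a list of (feature_fn, classifier_fn) pairs: feature_fn
    models a main-network stage, classifier_fn the attached branch.
    `conf_thresh` and `act_thresh` are hypothetical tuning knobs.
    """
    act = x
    for feature_fn, classifier_fn in branches:
        act = feature_fn(act)
        # DBP-style gate (assumed direction): only spend energy on the
        # branch classifier when the activation magnitude suggests the
        # input is easy enough for an early decision.
        if np.abs(act).mean() >= act_thresh:
            probs = softmax(classifier_fn(act))
            if probs.max() >= conf_thresh:
                return int(probs.argmax())  # early exit at this branch
    # No branch was confident: fall through to the full network.
    return int(softmax(final_classifier(act)).argmax())
```

In this toy form, a confidently classified "easy" input exits at the first branch and never pays for the remaining stages, which is the source of the parameter and energy savings the abstract reports.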
IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems
Published - 2023 Dec 1
Bibliographical note: Publisher Copyright © 1982-2012 IEEE.
Keywords
- Convolutional neural networks (CNNs)
- early exit
- energy-efficient accelerator
ASJC Scopus subject areas
- Electrical and Electronic Engineering
- Computer Graphics and Computer-Aided Design