Abstract
As convolutional neural networks become more complex and datasets grow larger, a great deal of time is spent training the network. In this paper, we propose to mitigate this cost with a mini-batch skipping strategy based on the arithmetic mean of the confidence scores of the images in a batch. By skipping unimportant mini-batches during the training phase, mini-batch skipping saves a large amount of time on backpropagation and weight updates. We empirically demonstrate the effectiveness of our method with ResNet-18, ResNet-50, and MobileNetV2 on CIFAR-10 and CIFAR-100. For ResNet-50, mini-batch skipping gives a 1.39x speedup in training without a significant drop in accuracy.
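The abstract does not give pseudocode for the skip decision. A minimal sketch, assuming the per-image confidence score is the maximum softmax probability and that a mini-batch is deemed "unimportant" (and thus skipped for backpropagation) when the batch's arithmetic-mean confidence exceeds a threshold, might look like:

```python
import numpy as np

def softmax(logits):
    # Numerically stable row-wise softmax over class logits.
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

def should_skip(logits, threshold=0.9):
    """Decide whether to skip backprop for this mini-batch.

    logits: (batch_size, num_classes) array from the forward pass.
    Returns True when the arithmetic mean of per-image confidence
    scores (max softmax probability) exceeds `threshold`.
    The threshold value is an illustrative assumption, not taken
    from the paper.
    """
    confidence = softmax(logits).max(axis=1)  # per-image confidence
    return confidence.mean() > threshold

# Hypothetical use inside a training loop:
#   logits = model.forward(batch)          # forward pass always runs
#   if not should_skip(logits):
#       loss.backward(); optimizer.step()  # backprop only when needed
```

The forward pass still runs on every batch; only the more expensive backward pass and weight update are skipped, which is where the reported speedup would come from.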
Original language | English |
---|---|
Title of host publication | Proceedings - International SoC Design Conference, ISOCC 2020 |
Publisher | Institute of Electrical and Electronics Engineers Inc. |
Pages | 129-130 |
Number of pages | 2 |
ISBN (Electronic) | 9781728183312 |
DOIs | |
Publication status | Published - 2020 Oct 21 |
Event | 17th International System-on-Chip Design Conference, ISOCC 2020 - Yeosu, Korea, Republic of |
Duration | 2020 Oct 21 → 2020 Oct 24 |
Publication series
Name | Proceedings - International SoC Design Conference, ISOCC 2020 |
---|---|
Conference
Conference | 17th International System-on-Chip Design Conference, ISOCC 2020 |
---|---|
Country/Territory | Korea, Republic of |
City | Yeosu |
Period | 20/10/21 → 20/10/24 |
Bibliographical note
Publisher Copyright: © 2020 IEEE.
Keywords
- Convolutional Neural Network (CNN)
- mini-batch skipping
ASJC Scopus subject areas
- Energy Engineering and Power Technology
- Electrical and Electronic Engineering
- Instrumentation
- Artificial Intelligence
- Hardware and Architecture