Knowledge distillation learning for the lightweight dynamic convolution in keyword spotting

Donghyeon Kim, Kyungdeuk Ko, Gwantae Kim, Hanseok Ko

Research output: Contribution to journal › Conference article › peer-review

Abstract

This paper presents a Knowledge Distillation (KD) learning method for lightweight dynamic convolution in the front end. In our previous work, we confirmed that applying dynamic filtering to the front end of a classifier improves classification performance in noisy environments. The main goal of this study is to develop a Relational Knowledge Distillation (RKD) framework for dynamic convolution. The teacher model consists of six dynamic convolution layers, while the student model is constructed with a single dynamic convolution layer and is trained with both the RKD loss and the classifier loss. For performance evaluation, experiments are carried out on a Keyword Spotting (KWS) classification task. The experimental results show that the proposed KD method improves KWS performance in noisy environments over the baseline student model, which is trained with the classifier loss alone.
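The abstract does not spell out the loss formulation. As a minimal sketch, assuming the common distance-wise RKD loss (Park et al., 2019) applied to front-end embeddings and a cross-entropy classifier loss, the combined training objective might look like the following PyTorch code. The names student_emb, teacher_emb, and the weighting parameter lambda_rkd are hypothetical and not taken from the paper.

```python
import torch
import torch.nn.functional as F


def pairwise_distances(emb: torch.Tensor) -> torch.Tensor:
    """Pairwise Euclidean distances of a batch of embeddings,
    normalized by the mean nonzero distance (as in Park et al., 2019)."""
    d = torch.cdist(emb, emb, p=2)  # (batch, batch)
    return d / d[d > 0].mean()      # scale-invariant normalization


def rkd_distance_loss(student_emb: torch.Tensor,
                      teacher_emb: torch.Tensor) -> torch.Tensor:
    """Distance-wise RKD: match the student's pairwise-distance structure
    to the teacher's via a Huber (smooth L1) penalty."""
    with torch.no_grad():           # no gradients through the teacher
        t_d = pairwise_distances(teacher_emb)
    s_d = pairwise_distances(student_emb)
    return F.smooth_l1_loss(s_d, t_d)


def total_loss(logits, labels, student_emb, teacher_emb, lambda_rkd=1.0):
    """Hypothetical combined objective: classifier loss + weighted RKD loss."""
    ce = F.cross_entropy(logits, labels)
    return ce + lambda_rkd * rkd_distance_loss(student_emb, teacher_emb)
```

Detaching the teacher distances keeps gradients flowing only through the student, and normalizing by the mean pairwise distance makes the relational term invariant to the overall scale of each model's embedding space.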

Original language: English
Journal: Proceedings of the International Congress on Acoustics
Publication status: Published - 2022
Event: 24th International Congress on Acoustics, ICA 2022 - Gyeongju, Korea, Republic of
Duration: 2022 Oct 24 – 2022 Oct 28

Bibliographical note

Publisher Copyright:
© 2022 Proceedings of the International Congress on Acoustics. All rights reserved.

Keywords

  • Dynamic convolution
  • Keyword spotting
  • Knowledge distillation
  • Lightweight
  • Sound classification

ASJC Scopus subject areas

  • Mechanical Engineering
  • Acoustics and Ultrasonics
