Multitask learning of deep neural network-based keyword spotting for IoT devices

Seong Gyun Leem, In Chul Yoo, Dongsuk Yook

Research output: Contribution to journal › Article › peer-review



Speech-based interfaces are convenient and intuitive, and are therefore strongly preferred for human-computer interaction with Internet of Things (IoT) devices. A pre-defined keyword is typically used as a trigger that notifies a device to accept subsequent voice commands. Keyword spotting techniques used as voice triggers typically model the target keyword with triphone models and non-keyword speech with single-state filler models. Recently, deep neural networks (DNNs) have shown better performance than hidden Markov models with Gaussian mixture models in various tasks, including speech recognition. However, conventional DNN-based keyword spotting methods cannot change the target keyword easily, which is an essential feature for a speech-based IoT device interface. Additionally, the increased computational requirements interfere with the use of complex filler models in DNN-based keyword spotting systems, which diminishes their accuracy. In this paper, we propose a novel DNN-based keyword spotting system that can change the keyword on the fly and that uses triphone and monophone acoustic models jointly, in order to reduce computational complexity and improve generalization performance. Experimental results on the FFMTIMIT corpus show that the proposed method reduced the error rate by 36.6%.
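The multitask idea sketched in the abstract — one shared network trained jointly against a triphone output layer and a monophone output layer — can be illustrated roughly as below. All layer sizes, the loss weight `alpha`, and the helper names are illustrative assumptions for the sketch, not the configuration used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed, illustrative dimensions: 40-dim acoustic features, one shared
# hidden layer, a large triphone head and a small monophone head.
D_IN, D_HID, N_TRI, N_MONO = 40, 128, 2000, 48

W_shared = rng.normal(0.0, 0.01, (D_IN, D_HID))   # shared representation
W_tri = rng.normal(0.0, 0.01, (D_HID, N_TRI))     # triphone output head
W_mono = rng.normal(0.0, 0.01, (D_HID, N_MONO))   # monophone output head

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """Shared ReLU hidden layer feeding two task-specific softmax heads."""
    h = np.maximum(0.0, x @ W_shared)
    return softmax(h @ W_tri), softmax(h @ W_mono)

def multitask_loss(x, y_tri, y_mono, alpha=0.5):
    """Weighted sum of the two cross-entropy losses; training this sum
    updates the shared weights from both tasks at once."""
    p_tri, p_mono = forward(x)
    n = np.arange(len(x))
    ce_tri = -np.log(p_tri[n, y_tri] + 1e-12).mean()
    ce_mono = -np.log(p_mono[n, y_mono] + 1e-12).mean()
    return alpha * ce_tri + (1.0 - alpha) * ce_mono

# Toy minibatch of 8 frames with random triphone/monophone targets.
x = rng.normal(size=(8, D_IN))
y_tri = rng.integers(0, N_TRI, 8)
y_mono = rng.integers(0, N_MONO, 8)
loss = multitask_loss(x, y_tri, y_mono)
```

The design point this is meant to show: the monophone head acts as an auxiliary task that regularizes the shared layers, while the cheaper monophone scores can also stand in for complex filler models at run time.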

Original language: English
Article number: 8641328
Pages (from-to): 188-194
Number of pages: 7
Journal: IEEE Transactions on Consumer Electronics
Issue number: 2
Publication status: Published - May 2019


Keywords

  • Deep neural network
  • keyword spotting
  • multitask learning

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering


