Acoustic and visual signal based context awareness system for mobile application

Woo Hyun Choi, Seung Il Kim, Min Seok Keum, David K. Han, Hanseok Ko

Research output: Contribution to journal › Article › peer-review

15 Citations (Scopus)


In this paper, an acoustic and visual signal based context awareness system is proposed for a mobile application. In particular, a multimodal system is designed that can sense and determine, in real time, user contextual information, such as where the user is or what the user is doing, by processing acoustic and visual signals from the sensors available in a mobile device. A variety of contextual information, such as babble sound in a cafeteria or the user's movement, can be recognized by the proposed acoustic and visual feature extraction and classification methods. We first describe the overall structure of the proposed system, and then present the algorithm for each module performing detection or classification of various contextual scenarios. Representative experiments demonstrate the superiority of the proposed system, while the actual implementation of the proposed scheme on a mobile device such as a smartphone confirms its effectiveness and practicality.
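The abstract describes extracting features from acoustic sensor input and classifying them into context labels (e.g., cafeteria babble). As a minimal illustration of that pipeline, the sketch below frames a mono audio signal, computes two classic features (short-time energy and zero-crossing rate), and maps them to a coarse context label. The specific features, thresholds, and labels here are illustrative assumptions, not the feature set or classifier actually used in the paper.

```python
import random

def frame_features(signal, frame_len=400, hop=200):
    """Split a mono PCM signal into overlapping frames and compute two
    simple acoustic features per frame: short-time energy and
    zero-crossing rate. (Illustrative features only; the paper's actual
    feature extraction is not specified in this abstract.)"""
    feats = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        energy = sum(x * x for x in frame) / frame_len
        zcr = sum(
            1 for a, b in zip(frame, frame[1:]) if (a >= 0) != (b >= 0)
        ) / (frame_len - 1)
        feats.append((energy, zcr))
    return feats

def classify_context(feats, energy_thresh=0.01, zcr_thresh=0.2):
    """Toy rule-based classifier: hypothetical thresholds map averaged
    features to a coarse context label (stand-ins for the trained
    classifiers a real system would use)."""
    avg_energy = sum(e for e, _ in feats) / len(feats)
    avg_zcr = sum(z for _, z in feats) / len(feats)
    if avg_energy < energy_thresh:
        return "quiet"
    return "babble-like" if avg_zcr > zcr_thresh else "speech-like"

# Usage: a synthetic noisy signal stands in for microphone input.
random.seed(0)
noise = [random.uniform(-0.5, 0.5) for _ in range(4000)]
print(classify_context(frame_features(noise)))
```

A real mobile implementation would replace the synthetic signal with microphone frames and the threshold rules with trained models, but the frame-then-feature-then-classify structure is the same.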

Original language: English
Article number: 5955216
Pages (from-to): 738-746
Number of pages: 9
Journal: IEEE Transactions on Consumer Electronics
Issue number: 2
Publication status: Published - May 2011


Keywords

  • Context awareness
  • audio and visual classification
  • environmental sound classification
  • mobile device

ASJC Scopus subject areas

  • Media Technology
  • Electrical and Electronic Engineering


