Compositional interaction descriptor for human interaction recognition

Nam Gyu Cho, Se Ho Park, Jeong Seon Park, Unsang Park, Seong Whan Lee

Research output: Contribution to journal › Article › peer-review

24 Citations (Scopus)

Abstract

In this paper, we address the problem of human interaction recognition. We propose a novel compositional interaction descriptor to represent complex human interactions that exhibit high intra- and inter-class variation. The compositional interaction descriptor captures motion relationships at the individual, local, and global levels to build a highly discriminative representation. We evaluate the proposed method on the UT-Interaction and BIT-Interaction public benchmark datasets. Experimental results demonstrate that the performance of the proposed approach is on a par with previous methods.
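The three-level structure described in the abstract can be illustrated with a minimal sketch: summarize motion features separately at the individual, local (e.g., pairwise), and global (scene) levels, then concatenate the summaries into one vector. This is a hypothetical illustration of the general idea only; the function names, the mean/std summaries, and the pairwise grouping are assumptions, not the paper's actual formulation.

```python
import numpy as np

def level_features(motions):
    """Summarize a set of motion vectors with their per-dimension
    mean and standard deviation (illustrative choice of statistics)."""
    m = np.asarray(motions, dtype=float)
    return np.concatenate([m.mean(axis=0), m.std(axis=0)])

def compositional_descriptor(person_motions, pair_motions, scene_motions):
    """Stack individual-, local-, and global-level motion summaries
    into a single descriptor vector. A sketch of the multi-level idea,
    not the descriptor defined in the paper."""
    return np.concatenate([
        level_features(person_motions),  # individual level
        level_features(pair_motions),    # local (pairwise) level
        level_features(scene_motions),   # global (scene) level
    ])
```

With 2-D motion vectors, each level contributes 4 values (mean and std per dimension), so the concatenated descriptor has 12 dimensions; a classifier would then be trained on such vectors.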

Original language: English
Pages (from-to): 169-181
Number of pages: 13
Journal: Neurocomputing
Volume: 267
DOIs
Publication status: Published - 2017 Dec 6

Bibliographical note

Funding Information:
The research was supported by the Implementation of Technologies for Identification, Behavior, and Location of Human based on Sensor Network Fusion Program through the Ministry of Trade, Industry and Energy (Grant No. 10041629). Partial support was also provided by the ICT R&D program of MSIP/IITP [B0101-15-0552, Development of Predictive Visual Intelligence Technology] and Institute for Information & communications Technology Promotion (IITP) grant funded by the Korea government (MSIP) (No. R7117-16-0157, Development of Smart Car Vision Techniques based on Deep Learning for Pedestrian Safety).

Publisher Copyright:
© 2017 Elsevier B.V.

Keywords

  • Compositional interaction descriptor
  • Human interaction recognition
  • Human motion analysis

ASJC Scopus subject areas

  • Computer Science Applications
  • Cognitive Neuroscience
  • Artificial Intelligence
