Querying Video Libraries

Eenjun Hwang, V. S. Subrahmanian

Research output: Contribution to journal › Article › peer-review

27 Citations (Scopus)


There is now growing interest in organizing and querying large bodies of video data. In this paper, we develop a simple SQL-like video query language that can be used not only to identify videos in a library that are of interest to the user, but also to extract, from such a video, the relevant segments that satisfy the specified query condition. We investigate various types of user requests and show how they are expressed in our query language. We also develop polynomial-time algorithms to process such queries. Furthermore, we show how video presentations may be synthesized in response to a user query, and how a standard relational database system can be extended to handle queries expressed in our language. Based on these principles, we have built a prototype video retrieval system called VIQS. We describe the design and implementation of VIQS and show some sample interactions with it.
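To illustrate the idea of segment-level retrieval described in the abstract, the following is a minimal sketch, not the paper's actual VIQS syntax or data model: it assumes a hypothetical annotation store mapping each video to frame-delimited segments tagged with the objects they show, and evaluates a condition over those tags, analogous to a `SELECT ... WHERE` query that returns matching segments rather than whole videos.

```python
# Hypothetical annotation store (illustrative data, not from the paper):
# video name -> list of (start_frame, end_frame, objects shown in the segment)
library = {
    "news_jan.mpg": [
        (0, 120, {"anchor"}),
        (121, 400, {"anchor", "map"}),
        (401, 900, {"reporter", "crowd"}),
    ],
    "parade.mpg": [
        (0, 500, {"crowd", "float"}),
        (501, 800, {"band"}),
    ],
}

def select_segments(library, condition):
    """Return, per video, the segments whose annotations satisfy `condition`.

    Roughly analogous to an SQL-like query of the form
        SELECT segment FROM video WHERE condition
    evaluated over every video in the library.
    """
    result = {}
    for video, segments in library.items():
        hits = [(start, end) for (start, end, objs) in segments if condition(objs)]
        if hits:  # report only videos with at least one matching segment
            result[video] = hits
    return result

# "Find all segments showing a crowd"
matches = select_segments(library, lambda objs: "crowd" in objs)
print(matches)
```

The point of the sketch is that the query answer is a set of frame intervals per video, so a presentation could be synthesized by playing just those intervals, rather than returning entire videos.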

Original language: English
Pages (from-to): 44-60
Number of pages: 17
Journal: Journal of Visual Communication and Image Representation
Issue number: 1
Publication status: Published - March 1996
Externally published: Yes

Bibliographical note

Funding Information:
This work was supported by the Army Research Office under Grants DAAL-03-92-G-0225 and DAAH-04-95-10174, by the Air Force Office of Scientific Research under Grant F49620-93-1-0065, and by ARPA/Rome Labs contract F30602-93-C-0241 (ARPA Order A716).

ASJC Scopus subject areas

  • Signal Processing
  • Media Technology
  • Computer Vision and Pattern Recognition
  • Electrical and Electronic Engineering
