Feedback-based adaptive video streaming over lossy channels

Jong Ok Kim, Hideki Tode, Koso Murakami

Research output: Contribution to journal › Article › peer-review

Abstract

In packet networks, including the Internet and commercial 3G wireless bearers, the network conditions that a streaming media application experiences are not known a priori and vary over time. In such dynamic environments, network-adaptive techniques are essential for delivering video data efficiently. In this paper, we propose a frame-based optimal scheduling algorithm that incorporates a maximum a posteriori (MAP) framework to adapt to the varying network loss rate. The optimal transmission schedule is determined so that the effective frame rate at playback is maximized. Moreover, when a frame consists of multiple packets, selecting the delivery order at the frame level greatly reduces the computational complexity of the server scheduler compared with packet-based scheduling techniques. In addition, by dynamically estimating the instantaneous packet loss probability, the proposed scheduler performs network-adaptive transmission for streaming video over time-varying packet networks. Simulation results for a test video sequence show that the proposed scheduling algorithm outperforms conventional ARQ-based schemes in terms of both reconstructed video quality and playable frame rate. In particular, it yields significant frame-rate improvements over highly lossy channels.
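
The abstract does not spell out the estimator or the scheduling objective, but a minimal sketch of the general idea it describes, MAP estimation of the packet loss probability from receiver feedback, used to weigh a frame's chance of arriving intact before its deadline, might look like the following. The Beta prior, the exponential discounting of old feedback, and the frame_delivery_prob helper are assumptions introduced here for illustration, not the formulation used in the paper.

```python
class MAPLossEstimator:
    """Illustrative MAP estimator of the instantaneous packet loss probability.

    Assumes a Beta(alpha, beta) prior over the loss rate and per-packet
    lost/received feedback; older observations are exponentially discounted
    so the estimate can track a time-varying channel. These modeling choices
    are assumptions for illustration, not the paper's formulation.
    """

    def __init__(self, alpha: float = 1.0, beta: float = 1.0, discount: float = 0.95):
        self.alpha = alpha          # pseudo-count of lost packets
        self.beta = beta            # pseudo-count of delivered packets
        self.discount = discount    # forgetting factor for old feedback

    def update(self, packet_lost: bool) -> None:
        # Fade old evidence, then add the new observation.
        self.alpha *= self.discount
        self.beta *= self.discount
        if packet_lost:
            self.alpha += 1.0
        else:
            self.beta += 1.0

    def map_loss_prob(self) -> float:
        # Mode of the Beta posterior (the MAP estimate); fall back to the
        # posterior mean when the counts are too small for the mode formula.
        a, b = self.alpha, self.beta
        if a + b <= 2.0:
            return a / (a + b)
        return (a - 1.0) / (a + b - 2.0)


def frame_delivery_prob(loss_prob: float, packets_in_frame: int, retx: int) -> float:
    """Probability that every packet of a frame arrives, given `retx`
    retransmission opportunities per packet (independent-loss assumption)."""
    p_pkt_ok = 1.0 - loss_prob ** (retx + 1)
    return p_pkt_ok ** packets_in_frame


# Example: compare two pending frames' chances of arriving intact, which a
# frame-level scheduler could use to pick the delivery order.
est = MAPLossEstimator()
for lost in [False, False, True, False, True, True, False]:   # feedback stream
    est.update(lost)

p = est.map_loss_prob()
print(f"estimated loss probability: {p:.3f}")
print("I-frame (8 packets, 1 retx):", frame_delivery_prob(p, 8, 1))
print("P-frame (3 packets, 1 retx):", frame_delivery_prob(p, 3, 1))
```

Deciding at the frame level (one decision per frame rather than per packet) is what keeps the scheduler's search space small compared with packet-based scheduling, as the abstract notes.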

Original language: English
Pages (from-to): 3076-3084
Number of pages: 9
Journal: IEICE Transactions on Communications
Volume: E87-B
Issue number: 10
Publication status: Published - Oct 2004
Externally published: Yes

Keywords

  • ARQ
  • Lossy channel
  • Packet loss recovery
  • Packet scheduling
  • Streaming video

ASJC Scopus subject areas

  • Software
  • Computer Networks and Communications
  • Electrical and Electronic Engineering
