A robust spatial-temporal line-warping based deinterlacing method

Shing Fat TU, Oscar C. AU, Yannan WU, Enming LUO, Chi Ho YEUNG

Research output: Chapter in Book/Report/Conference proceeding › Chapters

1 Citation (Scopus)

Abstract

In this paper, a line-warping based deinterlacing method is introduced. The missing pixels in interlaced video can be derived by warping pixels in horizontal line pairs. To increase the accuracy of temporal prediction, multiple temporal line pairs, selected according to a constant-velocity model, are used for warping. Stationary pixels are well preserved by accurate stationary detection. A soft switching between the spatial-temporal interpolated value and the temporal average is introduced to prevent unstable switching. Owing to these novelties, the proposed method yields deinterlaced videos of higher visual quality than conventional methods. Moreover, it suppresses most deinterlacing artifacts, such as line crawling, flickering, and ghost shadows. Copyright © 2009 the Institute of Electrical and Electronics Engineers, Inc.
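The soft switching mentioned in the abstract can be pictured as a weighted blend between the two candidate values, with the weight driven by a motion measure. The sketch below is illustrative only: the weight function, thresholds, and the name `motion_measure` are assumptions, not the paper's actual formulation.

```python
def soft_switch(st_value, temporal_avg, motion_measure, low=4.0, high=16.0):
    """Blend a spatial-temporal interpolated value with the temporal average.

    The weight ramps linearly from 0 (stationary: use the temporal average)
    to 1 (moving: use the spatial-temporal value) between two thresholds,
    avoiding the abrupt flips of a hard switch. Thresholds are illustrative
    assumptions, not values from the paper.
    """
    if motion_measure <= low:
        w = 0.0
    elif motion_measure >= high:
        w = 1.0
    else:
        w = (motion_measure - low) / (high - low)
    return w * st_value + (1.0 - w) * temporal_avg
```

A hard switch would jump between the two values as the motion measure crosses a single threshold; the linear ramp keeps the output continuous, which is what prevents the unstable switching the abstract refers to.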

Original language: English
Title of host publication: 2009 IEEE International Conference on Multimedia and Expo (ICME 2009)
Place of Publication: Piscataway
Publisher: IEEE
Pages: 77-80
ISBN (Electronic): 9781424442911
ISBN (Print): 9781424442904
DOIs
Publication status: Published - 2009

Citation

Tu, S.-F., Au, O. C., Wu, Y., Luo, E., & Yeung, C.-H. (2009). A robust spatial-temporal line-warping based deinterlacing method. In 2009 IEEE International Conference on Multimedia and Expo (ICME 2009) (pp. 77-80). Piscataway: IEEE.

Keywords

  • Deinterlacing
  • Interpolation
  • Line-Warping
  • Resolution enhancement
