Real-time video stylization using spatial-temporal Gabor filtering

Rui WANG, Ping LI, Bin SHENG, Hanqiu SUN, Enhua WU

Research output: Chapter in Book/Report/Conference proceeding › Chapter

Abstract

This paper describes a new video stylization approach that achieves non-photorealistic rendering effects through highly efficient spatial-temporal Gabor filtering. An edge extraction algorithm is developed to detect long, coherent edges, to which the human visual system is sensitive; nonlinear diffusion is then applied to remove unimportant details. Our approach extends optical flow computation to construct a Gabor flow that represents pixel similarity and preserves temporal coherence when applied to video sequences. In particular, our video stylization is designed in a spatiotemporal manner to achieve temporal coherence in the resulting animations. Real-time performance is achieved through a highly parallel implementation on modern graphics hardware (GPU), so our video stylization can be naturally applied to real-time video communication and interactive video-based rendering. Experimental results demonstrate the high quality of our real-time video stylization. Copyright © 2016 ACM.
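The abstract's core building block is the 2-D Gabor filter: a Gaussian envelope modulating an oriented sinusoid, commonly used to extract oriented edge responses. The sketch below is not the authors' GPU implementation — it is a minimal NumPy illustration of how such a kernel can be sampled, with all parameter names (`sigma`, `theta`, `lambd`, `psi`, `gamma`) being the standard Gabor parameters rather than values taken from the paper.

```python
import numpy as np

def gabor_kernel(size, sigma, theta, lambd, psi=0.0, gamma=1.0):
    """Sample a 2-D Gabor kernel: a Gaussian envelope times a cosine carrier.

    size  -- kernel width/height (odd), sigma -- Gaussian spread,
    theta -- orientation (radians), lambd -- carrier wavelength,
    psi   -- phase offset, gamma -- spatial aspect ratio.
    """
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t ** 2 + (gamma * y_t) ** 2) / (2.0 * sigma ** 2))
    carrier = np.cos(2.0 * np.pi * x_t / lambd + psi)
    return envelope * carrier

# One common stylization heuristic (an assumption here, not the paper's
# exact pipeline): convolve each frame with kernels at several orientations
# and keep the strongest per-pixel response to trace coherent edges.
```

In a spatial-temporal setting such as the one the abstract describes, these per-frame responses would additionally be aggregated along motion trajectories (the "Gabor flow") so that the stylized strokes stay coherent between frames.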
Original language: English
Title of host publication: Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry
Place of publication: New York
Publisher: ACM
Pages: 299-307
Volume: 1
ISBN (Print): 9781450346924
DOIs
Publication status: Published - 2016

Fingerprint

  • Optical flows
  • Animation
  • Pixels
  • Hardware
  • Communication
  • Graphics processing unit

Citation

Wang, R., Li, P., Sheng, B., Sun, H., & Wu, E. (2016). Real-time video stylization using spatial-temporal Gabor filtering. In Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry (Vol. 1, pp. 299-307). New York: ACM.

Keywords

  • Gabor flow
  • Video stylization
  • Feature space
  • Temporal coherence
  • Spatial-temporal processing