Abstract
This paper describes a new video stylization approach that achieves non-photorealistic rendering effects through highly efficient spatial-temporal Gabor filtering. An edge extraction algorithm is developed to detect long, coherent edges, to which the human visual system is sensitive. Nonlinear diffusion is then applied to remove unimportant details. Our approach extends optical flow computation to construct the Gabor flow, which represents pixel similarity and preserves temporal coherence when applied to video sequences. In particular, our video stylization is designed in a spatiotemporal manner to achieve temporal coherence in the resulting animations. Real-time performance is achieved through a highly parallel implementation on modern graphics hardware (GPU). Our video stylization can therefore be applied naturally to real-time video communication and interactive video-based rendering. The experimental results demonstrate the high-quality output of our real-time video stylization. Copyright © 2016 ACM.
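The core building block of the approach, 2D Gabor filtering, can be sketched as below. This is a minimal illustrative sketch only: the kernel parameters, the naive zero-loop convolution, and the toy edge image are assumptions for demonstration, not the authors' spatial-temporal or GPU implementation.

```python
import numpy as np

def gabor_kernel(size=9, sigma=2.0, theta=0.0, lambd=4.0, gamma=0.5, psi=0.0):
    """Build a 2D Gabor kernel: a Gaussian envelope modulating a cosine
    carrier oriented at angle `theta` (parameter values are illustrative)."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # Rotate coordinates into the filter's orientation frame.
    x_t = x * np.cos(theta) + y * np.sin(theta)
    y_t = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(x_t**2 + (gamma * y_t) ** 2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * x_t / lambd + psi)
    return envelope * carrier

def filter_image(img, kernel):
    """Naive same-size 2D correlation with edge padding, for demonstration."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    padded = np.pad(img, ((ph, ph), (pw, pw)), mode="edge")
    out = np.zeros(img.shape, dtype=float)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = np.sum(padded[i:i + kh, j:j + kw] * kernel)
    return out

# Toy image with a vertical step edge; a theta=0 kernel varies horizontally,
# so it responds near the edge and is silent in the flat zero region.
img = np.zeros((16, 16))
img[:, 8:] = 1.0
response = filter_image(img, gabor_kernel(theta=0.0))
```

In the actual paper this spatial filtering is extended along the temporal axis and guided by the Gabor flow, so that kernel responses stay consistent from frame to frame.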
Original language | English |
---|---|
Title of host publication | Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry |
Place of Publication | New York |
Publisher | ACM |
Pages | 299-307 |
Volume | 1 |
ISBN (Print) | 9781450346924 |
DOIs | |
Publication status | Published - 2016 |
Citation
Wang, R., Li, P., Sheng, B., Sun, H., & Wu, E. (2016). Real-time video stylization using spatial-temporal Gabor filtering. In Proceedings of the 15th ACM SIGGRAPH Conference on Virtual-Reality Continuum and Its Applications in Industry (Vol. 1, pp. 299-307). New York: ACM.

Keywords
- Gabor flow
- Video stylization
- Feature space
- Temporal coherence
- Spatial-temporal processing