Interdisclinarizing sound, music and virtual reality for a multimedia performance

Research output: Contribution to conference › Paper

Abstract

Collaboration among different art forms for a multidisciplinary performance has long been a key issue in the performing arts, constrained by informed practices deeply rooted in each form's historical development. This presentation demonstrates an approach that utilizes the immersive characteristics of virtual reality (VR) to integrate sound, music, and media for a multimedia performance. It addresses the lack of interactivity between art forms in previous approaches by incorporating VR, spatial audio, and other up-to-date digital technologies for an interdisciplinary performance. The approach adopts a bring-your-own-device (BYOD) strategy in which audience members use their own smartphones as 360-degree viewing devices, with VR headsets provided. Pre-recorded and rendered 360-degree video forms the visual component of the performance, broadcast live on an online streaming platform to synchronise the content across audience devices. While the VR technology enables an immersive visual experience, immersive spatial audio can be created by surrounding the audience with musicians and/or loudspeakers. Sound and music can be designed and performed in various formats, including scored music, electronic improvisation, live coding, and mixed ensemble, according to the visual content and the availability of resources. Both visual and audio elements can be altered in real time, enabling interaction between the different art forms.
Original language: English
Publication status: Published - Jun 2017


Bibliographical note

Cheng, L. (2017, June). Interdisclinarizing sound, music and virtual reality for a multimedia performance. Paper presented at the Innovation in Performing Arts Education Symposium 2017: Performance, possibilities, pedagogy, The Hong Kong Jockey Club Amphitheatre, Hong Kong, China.