We built a VR experience from scratch to test the limits of a custom interactive music system. The project came about through a partnership with Pusher Music called “CLOAK Studios.”
In 2016, when consumer-grade virtual reality equipment was becoming readily available, there was a lot of conversation around the implementation of 3D audio. But as musicians and music producers, we wanted to experiment with 3D music in volumetric, trackable space, to see how it could highlight what we believed to be the most compelling use case of virtual reality: crafting empathic stories for the user.
We were particularly interested in letting music artists create companion experiences to their albums, inviting fans to actually interact with the music rather than watch it passively like a music video. The result of many late nights, hair pulling, and breakouts from sweat-soaked headsets was Fragile Palace, a musical VR experience for the HTC Vive, built in Unity from soup to nuts. We premiered the experience at the Sound of New Realities forum and discussed it in a lecture at the VRLA conference in 2017.
As the user steps onto the footprints, they are transported to a room made of glass. They are free to walk around and explore the environment, where they find a chisel. As they approach a wall, a musical drone emanating from it grows louder. The user is encouraged to break the glass walls with the chisel and push the shards around with the other hand.
Once the glass structure encapsulating the user is shattered, they can teleport to any glass shard floating in the endless space. As the space around them opens up, the music unfolds into a blissful yet melancholy ambience.
For us, it was important to strike a balance between maintaining the integrity of the musical composition and giving the user the opportunity to shape the world around them and hear things in their own unique way. Together with our lead developer, Christian Bigham, we used Wwise to drive our custom interactive music system. The foundation for all of the audio was a lush, ambient score composed by Ian Miller, which we deconstructed into a series of loopable stems and one-shot SFX. Collisions between glass shards triggered a series of tuned bells, forming beautiful chords that sat harmoniously on top of the score.
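The idea behind the bells can be sketched in a few lines: every collision picks a pitch from the current chord of the score, so no matter how many shards strike at once, the bells stack into a consonant chord. This is a minimal illustrative sketch in Python, not the actual Unity/Wwise code; the chord tones and velocity mapping here are assumptions, not Ian's harmony.

```python
import random

# Pitches constrained to one chord so any combination of bells is consonant.
# These chord tones (an A minor add9 voicing) are illustrative only.
CHORD_TONES = [57, 60, 64, 71]  # MIDI: A3, C4, E4, B4

def bell_for_collision(impact_velocity, octave_span=2):
    """Pick a bell pitch for a glass-shard collision.

    Harder impacts (impact_velocity in 0..1) choose from higher octaves,
    but every pitch stays inside the current chord, so simultaneous
    bells always form a chord rather than a cluster.
    """
    # Map impact strength to an octave offset above the base voicing.
    octave = min(int(impact_velocity * octave_span), octave_span - 1)
    return random.choice(CHORD_TONES) + 12 * octave
```

In the real project, each chosen pitch would correspond to posting a one-shot bell sample in Wwise; constraining the pitch pool is what keeps user-driven chaos musically coherent.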
The intensity and structure of the score were built up by layering loops, triggered by code that tracked how much of the glass structure had been destroyed. The reactive sound sources used volumetric RTPCs (real-time parameter controls): we tracked the user's proximity and actions and applied musical results accordingly. This gave users the sensation that they were truly shaping the world around them, while letting us progress the score so that the experience always concluded predictably.
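The two mappings described above can be modeled very simply: one parameter curve turns listener distance into a drone's volume, and another turns the fraction of destroyed glass into a number of active score layers. This Python sketch models the logic only; in practice these values would be fed to Wwise RTPCs from Unity, and the specific curve shapes and layer counts here are assumptions.

```python
def drone_gain(distance, max_distance=10.0):
    """Proximity mapping: attenuate a wall's drone linearly with
    the user's distance, the way an RTPC curve maps distance to volume."""
    return max(0.0, 1.0 - distance / max_distance)

def active_layers(shards_broken, shards_total, max_layers=4):
    """Destruction mapping: more broken glass unlocks more score layers.

    The base loop (layer 1) always plays; the remaining layers fade in
    as the fraction of destroyed glass grows, so the music intensifies
    as the structure shatters.
    """
    fraction = shards_broken / shards_total
    return min(max_layers, 1 + int(fraction * max_layers))
```

Because the layer count can only grow as destruction accumulates, the score always ramps toward its final state, which is how user freedom and a predictable musical conclusion coexist.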
“Our favorite part of building this experience was getting to hear from people how they felt when playing it.”
- Jerry Yeh
Not only was Ian’s music beautiful on its own, but deconstructing it in a way that welcomed users to re-assemble it through their physical movements and actions felt like a promising new way to tell musical stories. Although VR isn’t currently experiencing the media hype it once had, we believe it will have its moment. We’re so excited to keep exploring the future of music in interactive spaces.
Check out a full video screen-cap of the experience below: