Design & Art

On the Design side, we focused on pinning down the interactions for applying tools to tracks within our track field, brainstorming ideas for reverb interactions, and sending out surveys to novice musicians to learn how people felt about our volume/panning interactions. For the tool-application interactions, we initially had trouble figuring out the interface: specifically, how to give guests an interface that lets them edit a track with several tools without overcrowding the track field. With this UI challenge in mind, we came up with the sequence below:

  1. Select orb interaction:
    1. Press the Trigger while hovering over an audio orb to select it.
    2. The original orb on the track field fades out slightly. A copy of the same orb (in its expanded, hovered-over state) appears floating in the air right in front of the guest.
    3. A few new pillars with tools (e.g. reverb) rise from the floor slightly to the front right of the guest, with the existing pillars (play/pause, stop, loop) pushed slightly to the left.
  2. Apply tool to orb interaction:
    1. Guests can pick up a tool from a pillar by holding down the Grip button while hovering over it.
    2. When grabbed, the tool snaps to the hand and the pointer on the controller disappears.
    3. When placed on the orb copy, the tool snaps onto the orb, and the corresponding effect is applied to the orb.
    4. If guests release the Grip button while the tool is not on the orb, the tool resets back to its pillar (the same goes for removing a tool from an orb).
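The grab/apply/release flow above can be sketched as a tiny state machine. This is an illustrative sketch only; the state names and methods (`ToolInteraction`, `grip_down`, `grip_up`) are hypothetical and not from our actual codebase:

```python
# Illustrative state machine for the tool interaction sequence above.
# All names here are assumptions for the sake of the sketch.

ON_PILLAR, IN_HAND, ON_ORB = "on_pillar", "in_hand", "on_orb"

class ToolInteraction:
    def __init__(self):
        self.state = ON_PILLAR

    def grip_down(self, hovering_tool):
        # Holding Grip while hovering over a pillar tool snaps it to the hand.
        if self.state in (ON_PILLAR, ON_ORB) and hovering_tool:
            self.state = IN_HAND

    def grip_up(self, over_orb):
        # Releasing over the orb copy applies the tool;
        # releasing anywhere else resets the tool to its pillar.
        if self.state == IN_HAND:
            self.state = ON_ORB if over_orb else ON_PILLAR
```

Releasing away from the orb always routes the tool back to its pillar, which keeps the interface tidy and makes "undo" implicit in the grab gesture.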

This sequence would allow the interface to hold the several tools a guest may want to use, without eating into the track field's real estate in the space.

For the reverb designs, we came up with three possible routes:

  1. Surround the track with two cubes representing the reverb's dry and wet values. The size of each cube would set the level of dryness/wetness on the track, and offsetting a cube's position on the orb would set the delay for that particular value.
  2. Have a large room appear around the main area that the guest can resize. The larger the room, the wetter the track.
  3. Pop up the tools normally associated with reverb on the pillars in front of the guest, more closely matching the format of DAWs. Guests would select the value they want to manipulate on the pillar, then apply it to a track by pointing at it on the track field.
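The first route implies a simple mapping from cube geometry to reverb parameters. A minimal sketch, with assumed ranges and hypothetical names (`cube_to_reverb`, the 2.0 m scale cap, the 100 ms delay ceiling are all placeholders, not our tuned values):

```python
# Hypothetical mapping for reverb route 1: cube scale -> dry/wet level,
# cube offset from the orb centre -> delay. Ranges are assumptions.

def cube_to_reverb(scale, offset, max_scale=2.0, max_offset=1.0, max_delay_ms=100.0):
    """Map a cube's scale (0..max_scale) to a 0..1 level and its
    positional offset (0..max_offset) to a delay in milliseconds."""
    level = min(max(scale / max_scale, 0.0), 1.0)
    delay_ms = min(max(offset / max_offset, 0.0), 1.0) * max_delay_ms
    return level, delay_ms
```

Each cube (dry and wet) would run through the same mapping independently, so resizing one never disturbs the other.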

Tech

From the tech side, we successfully built our prototypes and sent them out for playtesting on both Quest and Rift S.

We learned and built quite a bit through this prototype. On the FMOD API side of things, we implemented passing metering info and spectrum data of the master bus (thanks to this amazing set of tutorials from Alessandro Fama).

This helped us drive visuals at two levels: 

  1. We used the RMS data from each individual track to drive the size and color visualisation of the orbs, which added quite a bit to their look. This was important because the metering info helps distinguish orbs that are muted from orbs that simply have no active audio data coming in at that point in time.
  2. We did a first pass at environment visuals by using the spectrum data every frame to drive the audio-visualiser you can see on the side of the main stage. We would like to use more of this in the future, but for now we are limited to getting 4 buckets of spectrum info from the master bus. Scott Game Sounds has an amazing tutorial on the FMOD API and the FFT DSP (which we used here) to get the spectrum data.
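The RMS-to-visuals step boils down to normalising a dB level into a 0..1 value and lerping scale and colour from it. A sketch under assumed tuning values (the -60 dB floor, scale range, and colours are illustrative, not our actual settings):

```python
# Sketch: drive orb scale and colour from per-track RMS metering.
# The dB range and lerp targets are assumptions, not our tuned values.

def db_to_unit(rms_db, floor_db=-60.0, ceil_db=0.0):
    """Normalise an RMS level in dBFS to 0..1, clamped."""
    t = (rms_db - floor_db) / (ceil_db - floor_db)
    return min(max(t, 0.0), 1.0)

def orb_visuals(rms_db, base_scale=0.5, max_scale=1.5):
    t = db_to_unit(rms_db)
    scale = base_scale + t * (max_scale - base_scale)
    # Lerp from a cool "idle" colour to a hot "active" colour (RGB, 0..1).
    idle, active = (0.2, 0.3, 0.8), (1.0, 0.6, 0.1)
    colour = tuple(i + t * (a - i) for i, a in zip(idle, active))
    return scale, colour
```

Because the value is clamped, a muted orb sits at the base scale and idle colour, which is exactly the visual distinction mentioned above.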

We also looked at a way to show the axes for both panning and volume, so that it's clear where in the "Track Field" the guest is moving a particular orb.

We will further refine this to show a more detailed scale for both panning and volume, allowing for finer adjustments.

One obvious problem with this approach is that we lose panning resolution for elements that are closer to the guest on the track field. But we think this can be mitigated by adjusting the shape of the track field; our track field setup is quite dynamic and easily adjustable, so that shouldn't be a big problem.
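One plausible way to map an orb's position to the mix, which also shows where the near-field resolution loss comes from, is pan-from-angle and volume-from-distance. This is a hypothetical sketch (function name, angle-based pan, linear falloff, and 5 m range are all assumptions, not our actual mapping):

```python
import math

# Hypothetical mapping from an orb's position relative to the guest
# to pan and volume; the real track-field shape may differ.

def orb_to_mix(x, z, max_dist=5.0):
    """x: left/right offset, z: forward offset (metres).
    Pan comes from the horizontal angle (-1 hard left, +1 hard right);
    volume falls off linearly with distance."""
    angle = math.atan2(x, z)                      # radians off the forward axis
    pan = max(-1.0, min(1.0, angle / (math.pi / 2)))
    dist = math.hypot(x, z)
    volume = max(0.0, 1.0 - dist / max_dist)
    return pan, volume
```

With an angle-based pan like this, a small sideways nudge of a nearby orb swings the angle sharply, which is the resolution loss described above; reshaping the field (e.g. pushing the usable region further from the guest) flattens that sensitivity.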

Our final touch to the prototype was a small "intro moment" as part of our design, to make sure the guest understands that it's a "Stereo" experience. We created a sound effect that sweeps from left to right and triggered it when the guest puts on the headphones.
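The sweep itself amounts to ramping a pan parameter from hard left to hard right over the effect's duration. A minimal sketch, with an assumed duration (the 1.5 s value and `sweep_pan` name are placeholders):

```python
# Sketch of the intro stereo sweep: pan ramps from hard left (-1)
# to hard right (+1) over the sweep. The duration is an assumed value.

def sweep_pan(t, duration=1.5):
    """Pan value at time t seconds into a left-to-right sweep, clamped."""
    progress = min(max(t / duration, 0.0), 1.0)
    return -1.0 + 2.0 * progress
```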

Our next steps are:

  1. Finalize what kind of impulse responses to use for the reverb prototype, and set up a framework that can swap impulse responses at runtime through the API and adjust the "Wet" and "Dry" levels in real time.
  2. Implement the Reverb prototype design by the end of week 6 so that we can refine it the week before halves.
  3. Get the timeline information from the songs, since we currently have to use this as a hack to get the "Looping" functionality working. The timeline functionality will also be helpful to us in the future.
  4. Look at ways to support multiple songs. Currently our experience relies on a single cover version of Coldplay's Viva La Vida. We hope to support more genres for guests who want to experience different kinds of music.
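For the first next step, the real-time "Wet"/"Dry" adjustment is at heart a crossfade between the unprocessed signal and the reverb output. A sketch under assumptions (the linear crossfade and the `mix_wet_dry` name are illustrative, not our final design):

```python
# Sketch of a real-time wet/dry mix for the planned reverb:
# blend the processed (wet) signal with the unprocessed (dry) one.
# The linear crossfade is an assumption, not our final design.

def mix_wet_dry(dry_sample, wet_sample, wet_level):
    """wet_level in 0..1: 0 = fully dry, 1 = fully wet."""
    w = min(max(wet_level, 0.0), 1.0)
    return (1.0 - w) * dry_sample + w * wet_sample
```

Keeping the mix as a single 0..1 parameter means swapping impulse responses at runtime never has to touch the level logic.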