For our quarter milestone, we spent time refining our ideas and formulating a plan of action. This was a big week in terms of Design, as we finalized our mission for the rest of the project and came up with a few key points describing our goals.

We are:

  • Prototyping and experimenting
  • Visualizing song mixing tools/concepts
  • Creating a playground for discovering mixing tools/concepts

We are NOT:

  • Recreating/replacing a Digital Audio Workstation (DAW) in VR
  • Creating a purely educational tool
  • Creating spatial audio

Design: Concepts and layouts

We discussed and created rough concepts and storyboards for our first playable prototype of the basic functionalities.

Main Area

This is a rough depiction of how the music tracks and controls are represented in the VR space. We based this design on the album art of Paul McCartney’s Press single. The tracks are laid out as if in an amphitheater, and the player stands in the middle, as if they were the conductor.
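To make the layout a little more concrete, here is a minimal sketch of how orbs could be distributed along such an arc. The function and its parameters are hypothetical, purely for illustration:

```cpp
#include <cmath>
#include <vector>

struct Vec3 { float x, y, z; };

// Hypothetical helper: spread trackCount orbs along an arc of
// arcDegrees in front of the player (at the origin, facing +z),
// like seats in an amphitheater.
std::vector<Vec3> LayoutTracksInArc(int trackCount, float radius,
                                    float arcDegrees, float height)
{
    std::vector<Vec3> positions;
    const float arcRad = arcDegrees * 3.14159265f / 180.0f;
    for (int i = 0; i < trackCount; ++i)
    {
        // Evenly distribute the orbs across the arc, centered on +z.
        float t = (trackCount > 1) ? (float)i / (trackCount - 1) : 0.5f;
        float angle = -arcRad / 2.0f + t * arcRad;
        positions.push_back({ radius * std::sin(angle),
                              height,
                              radius * std::cos(angle) });
    }
    return positions;
}
```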

Track Selection

Each track of the loaded song can be tweaked with the controller by pointing at it and pressing the triggers/buttons. This lets the player interact with objects that are far away; a sketch of the pointing test follows.
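One way to implement that pointing test is a simple ray-sphere intersection. This is only a sketch with hypothetical names, and it assumes the controller's ray direction is normalized:

```cpp
struct Vec3 { float x, y, z; };

static float Dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

// Hypothetical selection test: does the controller ray (origin plus
// normalized direction) hit a track orb of the given radius?
bool RayHitsOrb(Vec3 rayOrigin, Vec3 rayDir, Vec3 orbCenter, float orbRadius)
{
    Vec3 toOrb = { orbCenter.x - rayOrigin.x,
                   orbCenter.y - rayOrigin.y,
                   orbCenter.z - rayOrigin.z };
    float along = Dot(toOrb, rayDir);        // projection onto the ray
    if (along < 0.0f) return false;          // orb is behind the controller
    float distSq = Dot(toOrb, toOrb) - along * along; // squared distance from ray to orb center
    return distSq <= orbRadius * orbRadius;
}
```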

Tool Selection

Once a track is selected, a small fan-out menu with a few buttons opens up. The player can then select a tool from this menu, e.g. reverb or EQ, to apply to the selected track.

Reverb Visualization

This is a sample concept for the visualization of the reverb tool. For a cathedral reverb preset, we could render the inside of a cathedral to let the player associate the effect of the tool with a real-life concept.

Art: Deciding on the style

Mood board for the art style

Tech: Programming the interface

From the tech side of things, we wanted to focus on getting the basic functionality working and implementing the look of the “Home Screen”, along with the interactions for handling the tracks. We were able to get the following functionality working in a very short time span (a sketch of the corresponding audio calls follows the list):

  1. Playback functionality for the entire song
    1. Play
    2. Pause
    3. Stop
  2. Global Mute/Unmute
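As a rough idea of what these controls amount to in code, here is a minimal sketch assuming FMOD's Core API (an assumption on our part; the "master bus" and "channel group" terminology matches it). Error handling is omitted and the asset name is hypothetical:

```cpp
#include "fmod.hpp"

int main()
{
    FMOD::System* system = nullptr;
    FMOD::System_Create(&system);
    system->init(512, FMOD_INIT_NORMAL, nullptr);

    FMOD::Sound*   song    = nullptr;
    FMOD::Channel* channel = nullptr;
    system->createSound("song.wav", FMOD_DEFAULT, nullptr, &song); // hypothetical asset

    // Playback functionality for the entire song
    system->playSound(song, nullptr, /*paused=*/false, &channel); // Play
    channel->setPaused(true);                                     // Pause
    channel->setPaused(false);                                    // Resume
    channel->stop();                                              // Stop

    // Global Mute/Unmute via the master channel group
    FMOD::ChannelGroup* master = nullptr;
    system->getMasterChannelGroup(&master);
    master->setMute(true);
    master->setMute(false);

    system->release();
    return 0;
}
```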

We also managed to get the following interactions working:

  1. Using the controller trigger for all the functionalities mentioned above
  2. Dragging the track orb on the “Track field” and constraining it to this stage-like plane (a sketch of the constraint follows)
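The constraint itself can be as simple as clamping the dragged position to the bounds of the plane. A hypothetical sketch, with the field centered at the origin:

```cpp
#include <algorithm>

struct Vec3 { float x, y, z; };

// Hypothetical constraint: keep a dragged track orb on the rectangular
// "Track field", a horizontal stage-like plane at a fixed height.
Vec3 ConstrainToTrackField(Vec3 dragged, float halfWidth, float halfDepth,
                           float fieldHeight)
{
    return { std::clamp(dragged.x, -halfWidth, halfWidth),
             fieldHeight,                                   // snap onto the plane
             std::clamp(dragged.z, -halfDepth, halfDepth) };
}
```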

These interactions will then be used to map volume and panning controls onto the track orbs, which is our next iteration on handling levels and panning.
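One plausible mapping is sketched below, under the assumption of one FMOD channel group per track: the orb's left/right position drives the pan and its distance from the player drives the level. The coordinate conventions here are illustrative, not final:

```cpp
#include "fmod.hpp"

// Illustrative mapping from orb position to mix controls, assuming one
// FMOD channel group per track. x is normalized to [-1, 1] (left/right)
// and z to [0, 1] (near/far from the player).
void ApplyOrbPosition(FMOD::ChannelGroup* track, float x, float z)
{
    track->setPan(x);            // lateral position -> stereo pan
    track->setVolume(1.0f - z);  // farther from the player -> quieter
}
```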

We also looked at programmatically adding an FFT DSP unit onto the master bus, since we will be driving some basic, inexpensive audio-reactive visuals from the spectrum data coming out of it.
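Roughly, this amounts to creating an FFT DSP, inserting it into the master channel group's chain, and polling the spectrum each frame. A minimal sketch, again assuming FMOD's Core API, with error checks omitted:

```cpp
#include "fmod.hpp"

// Attach an FFT DSP to the master bus (sketch; assumes FMOD Core).
void AttachFFT(FMOD::System* system, FMOD::DSP** outFft)
{
    FMOD::ChannelGroup* master = nullptr;
    system->getMasterChannelGroup(&master);

    system->createDSPByType(FMOD_DSP_TYPE_FFT, outFft);
    (*outFft)->setParameterInt(FMOD_DSP_FFT_WINDOWSIZE, 1024);
    master->addDSP(0, *outFft); // insert at the head of the master bus chain
}

// Poll one spectrum bin per frame to drive the audio-reactive visuals.
float ReadBand(FMOD::DSP* fft, int bin)
{
    FMOD_DSP_PARAMETER_FFT* data = nullptr;
    fft->getParameterData(FMOD_DSP_FFT_SPECTRUMDATA,
                          (void**)&data, nullptr, nullptr, 0);
    if (!data || data->numchannels == 0 || bin >= data->length)
        return 0.0f;
    return data->spectrum[0][bin]; // magnitude for the first channel
}
```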

The tech goals for next week are:

  1. Finish implementing the mapping and visuals for panning and levels for the orbs.
  2. Implement the visuals for muted/unmuted tracks and “Active audio” tracks from the audio data that comes out of each track’s channel group.
  3. Create a framework that can be used to drive any sort of visuals for both the entire song (master bus) and individual tracks.
  4. Start researching the DSP API for reverb and EQ, since those will be the next tools we target (a possible starting point is sketched below).
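As a possible starting point for that research, attaching a reverb DSP to a single track's channel group might look like the following, again assuming FMOD; the decay value is illustrative:

```cpp
#include "fmod.hpp"

// Sketch: attach an SFX reverb DSP to one track's channel group and
// dial in a long, cathedral-like decay (value is illustrative).
void AddReverb(FMOD::System* system, FMOD::ChannelGroup* track)
{
    FMOD::DSP* reverb = nullptr;
    system->createDSPByType(FMOD_DSP_TYPE_SFXREVERB, &reverb);
    reverb->setParameterFloat(FMOD_DSP_SFXREVERB_DECAYTIME, 5000.0f); // ms
    track->addDSP(0, reverb);
}
```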