This week we focused on answering a few technical questions the team had by creating a Gold Spike build. The purpose of the Gold Spike was to make sure our entire pipeline was working as expected. Since audio middleware was a new domain for everyone on the team, we wanted to get familiar with it as quickly as possible.

We also wanted to verify that URP and Shader Graph work consistently in both Oculus Rift S and Quest builds, since these are the platforms we are targeting. Completing the full pipeline would also help us ensure that guests on both Mac and Windows machines are able to playtest our experience.

Tech: How the FMOD pipeline works

(Note: we are currently working with a single music track.)

  1. The starting point for us is a folder of stem tracks from one song. A stem track is a complete bounce of a single element of the mix.
  2. For the FMOD setup, all of the song’s stems are put into one FMOD bank, and each stem is loaded as a 2D event. We chose this because it lets us add DSPs to each stem’s channel group through the FMOD API.
  3. Each track is set to its default volume and pan levels, and the master bus is set to 0 dB with center pan.
  4. All the events are triggered at the same time, so that playing, stopping, and pausing behaves like a single song.
  5. Each channel group’s mute, pan, and volume levels are set from Unity through the API (see the sketch below).
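
To make steps 2–5 concrete, here is a minimal sketch of how this could look from Unity using FMOD for Unity. The event paths, class name, and default values are illustrative assumptions, not our actual project setup.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Sketch: start/stop all stem events together and control each stem's mix
// through its core API channel group. Event paths are hypothetical.
public class StemPlayer : MonoBehaviour
{
    [SerializeField]
    private string[] stemEventPaths =
    {
        "event:/Stems/Drums",   // hypothetical 2D events, one per stem
        "event:/Stems/Bass",
        "event:/Stems/Vocals"
    };

    private readonly List<FMOD.Studio.EventInstance> stems = new List<FMOD.Studio.EventInstance>();

    public void Play()
    {
        // Starting every instance in the same frame keeps the stems aligned,
        // mimicking a DAW pressing play on all tracks at once.
        foreach (string path in stemEventPaths)
        {
            FMOD.Studio.EventInstance instance = FMODUnity.RuntimeManager.CreateInstance(path);
            instance.start();
            stems.Add(instance);
        }
    }

    public void Stop()
    {
        foreach (FMOD.Studio.EventInstance instance in stems)
        {
            instance.stop(FMOD.Studio.STOP_MODE.IMMEDIATE);
            instance.release();
        }
        stems.Clear();
    }

    // Step 5: set a single stem's mute/pan/volume through the API.
    public void SetStemMix(int index, float volume, float pan, bool muted)
    {
        // getChannelGroup only succeeds once the event is actually playing,
        // so check the result before touching the group.
        if (stems[index].getChannelGroup(out FMOD.ChannelGroup group) == FMOD.RESULT.OK)
        {
            group.setVolume(volume); // 0.0 = silent, 1.0 = unattenuated
            group.setPan(pan);       // -1.0 = full left, 1.0 = full right
            group.setMute(muted);
        }
    }
}
```

Pausing can be handled the same way, by calling setPaused on each stem’s channel group.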
Tech pipeline

We were then successfully able to play and stop all tracks at once, mimicking a DAW’s regular transport functionality. We were also able to map each track’s panning to its sphere’s position and its volume to the sphere’s size.
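
As a rough illustration of that mapping, a per-stem component could read the sphere’s transform each frame and push it into the channel group. The field names and scaling constants here are placeholder assumptions.

```csharp
using UnityEngine;

// Sketch: drive one stem's pan from the sphere's horizontal position and
// its volume from the sphere's scale. Ranges are placeholder assumptions.
public class StemSphere : MonoBehaviour
{
    public FMOD.Studio.EventInstance stemInstance; // assigned by whatever starts the stems

    [SerializeField] private float panRange = 2f;  // |x| that maps to full left/right
    [SerializeField] private float maxScale = 1f;  // local scale that maps to full volume

    private void Update()
    {
        if (stemInstance.getChannelGroup(out FMOD.ChannelGroup group) != FMOD.RESULT.OK)
            return; // channel group is unavailable until the event is playing

        float pan = Mathf.Clamp(transform.position.x / panRange, -1f, 1f);
        float volume = Mathf.Clamp01(transform.localScale.x / maxScale);

        group.setPan(pan);
        group.setVolume(volume);
    }
}
```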

On the URP side, we were able to drop in a test shader built in Shader Graph and get it working in both Rift and Quest builds. To speed up the build process, we are creating a batch script that checks out the latest changes from Perforce and then builds both the Rift S and Quest targets with a single command. We will probably run this on our dedicated demo computer, which would act as a makeshift cloud build machine.
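
The batch script itself is still in progress, but on the Unity side this kind of setup typically calls a static editor method via Unity’s -batchmode and -executeMethod command-line flags after a p4 sync. A sketch under those assumptions, with placeholder scene and output paths:

```csharp
using UnityEditor;

// Sketch of editor build methods a batch script could invoke, e.g.:
//   Unity.exe -batchmode -quit -projectPath <project> -executeMethod BuildScript.BuildRift
// Must live under an Editor/ folder; scene list and output paths are placeholders.
public static class BuildScript
{
    private static readonly string[] Scenes = { "Assets/Scenes/Main.unity" };

    public static void BuildRift()
    {
        // Rift S runs the Windows standalone player.
        BuildPipeline.BuildPlayer(Scenes, "Builds/Rift/Experience.exe",
            BuildTarget.StandaloneWindows64, BuildOptions.None);
    }

    public static void BuildQuest()
    {
        // Quest builds target Android.
        BuildPipeline.BuildPlayer(Scenes, "Builds/Quest/Experience.apk",
            BuildTarget.Android, BuildOptions.None);
    }
}
```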

Screenshots

Next week, we will focus on implementing the basic functionality for the start of the experience, so that guests can navigate the song’s tracks easily. We will also look into how to generate visualizations for the experience using the spectrum data from the FMOD API.
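
One likely approach, sketched below, is to attach an FFT DSP to the master channel group and read its spectrum parameter each frame. This is a common community pattern rather than our finalized design, and the window size and usage are assumptions.

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

// Sketch: read spectrum data from an FMOD FFT DSP on the master channel group.
public class SpectrumReader : MonoBehaviour
{
    private FMOD.DSP fftDsp;

    private void Start()
    {
        FMOD.System coreSystem = FMODUnity.RuntimeManager.CoreSystem;
        coreSystem.createDSPByType(FMOD.DSP_TYPE.FFT, out fftDsp);
        fftDsp.setParameterInt((int)FMOD.DSP_FFT.WINDOWSIZE, 1024); // assumed window size

        coreSystem.getMasterChannelGroup(out FMOD.ChannelGroup master);
        master.addDSP(FMOD.CHANNELCONTROL_DSP_INDEX.HEAD, fftDsp);
    }

    private void Update()
    {
        fftDsp.getParameterData((int)FMOD.DSP_FFT.SPECTRUMDATA,
            out IntPtr unmanagedData, out uint length);
        var fft = (FMOD.DSP_PARAMETER_FFT)Marshal.PtrToStructure(
            unmanagedData, typeof(FMOD.DSP_PARAMETER_FFT));

        if (fft.numchannels > 0)
        {
            // fft.length magnitude bins for the first channel; these could drive
            // whatever visualization (mesh, shader property, particles) we build.
            float[] bins = fft.spectrum[0];
        }
    }
}
```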