Design & Art

On the design side of things, the focus this week was gathering feedback on our previous reverb prototype and working on the design for our current prototype: automation and the timeline. We were ahead of schedule on the automation design, having the base designs completed at the beginning of our sprint. The details of our implementation can be explained with this picture:

The timeline of the song is displayed as a horizontal arc in front of the track field, with an interactive sphere used to select the timeline position. To automate an orb, guests grab it with the grip and trigger and move it around the track field. This leaves a colored trail that shows the path the orb will take in the automation. Moving the timeline back to a point before the guest moved the orb causes the orb to replay the recorded path in time with the song.
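One way to sketch the scrubbing interaction is to map the sphere's position along the arc to a normalized song position. This is only an illustrative sketch, not our actual code: the field names are hypothetical, and it approximates the arc by its chord for simplicity.

```csharp
using UnityEngine;

public class TimelineArc : MonoBehaviour
{
    [SerializeField] Transform arcStart; // left end of the arc (hypothetical)
    [SerializeField] Transform arcEnd;   // right end of the arc (hypothetical)
    [SerializeField] Transform sphere;   // the interactive playhead sphere

    // Returns the song position in [0, 1] implied by where the sphere
    // sits along the arc, projected onto the arc's chord.
    public float NormalizedSongPosition()
    {
        Vector3 chord = arcEnd.position - arcStart.position;
        float t = Vector3.Dot(sphere.position - arcStart.position, chord)
                  / chord.sqrMagnitude;
        return Mathf.Clamp01(t);
    }
}
```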

By the end of the week, we were able to sit together to check out how this approach looked in the prototype itself.

As is probably clear in the image grabbed from our review of the prototype, placing all of the trails on the track field makes things very cluttered and ugly. Not only is it hard to tell which line corresponds to which orb, but you also cannot tell when these lines activate within the song.

We decided to live-code some tweaks to the automation and try out quick solutions. We made the automation trails fall from top to bottom over the course of the song, emulating sand falling in an hourglass.
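The hourglass effect can be sketched as lowering each trail vertex as its moment in the song approaches. This is a hypothetical sketch, not our live-coded version: the class, fields, and the linear height mapping are all assumptions.

```csharp
using UnityEngine;

// Sketch: trail points far in the future sit high and sink toward the
// track field as the song catches up to them, like falling sand.
public class FallingTrail : MonoBehaviour
{
    [SerializeField] LineRenderer trail;          // hypothetical wiring
    [SerializeField] float topY = 2f, bottomY = 0f;

    // pointTimes[i] is the song time (seconds) at which basePoints[i] activates.
    public void UpdateTrail(float[] pointTimes, Vector3[] basePoints,
                            float songTime, float songLength)
    {
        for (int i = 0; i < basePoints.Length; i++)
        {
            float remaining = Mathf.Clamp01((pointTimes[i] - songTime) / songLength);
            basePoints[i].y = Mathf.Lerp(bottomY, topY, remaining);
        }
        trail.positionCount = basePoints.Length;
        trail.SetPositions(basePoints);
    }
}
```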

Tech

On the tech side of things, our initial approach was to implement the timeline similar to how timelines are generally represented in audio and video editing software. However, the way we had implemented multi-track audio as separate events made this less straightforward.

Timeline callbacks (see the timeline scripting example on the FMOD forum) are a popular mechanism in the FMOD API, especially handy for rhythm games and for situations where background music or sound effects need to be layered in dynamically; this can be set up with simple parameters and marker settings through the FMOD interface. But this system wasn't enough to give us a smooth timeline: a callback system is not an efficient way to track the playhead for the continuous timeline we wanted to show.
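For readers unfamiliar with the mechanism, timeline callbacks in FMOD's Unity integration look roughly like this. This is a generic sketch (the event path is made up), following the pattern from FMOD's own scripting examples rather than our project code:

```csharp
using System.Runtime.InteropServices;
using FMOD.Studio;
using UnityEngine;

public class TimelineCallbackSketch : MonoBehaviour
{
    EventInstance music;
    EVENT_CALLBACK timelineCallback;

    void Start()
    {
        // "event:/Music/Track" is a placeholder event path
        music = FMODUnity.RuntimeManager.CreateInstance("event:/Music/Track");
        timelineCallback = new EVENT_CALLBACK(OnTimelineEvent);
        music.setCallback(timelineCallback,
            EVENT_CALLBACK_TYPE.TIMELINE_BEAT | EVENT_CALLBACK_TYPE.TIMELINE_MARKER);
        music.start();
    }

    [AOT.MonoPInvokeCallback(typeof(EVENT_CALLBACK))]
    static FMOD.RESULT OnTimelineEvent(EVENT_CALLBACK_TYPE type,
        System.IntPtr instancePtr, System.IntPtr parameterPtr)
    {
        if (type == EVENT_CALLBACK_TYPE.TIMELINE_BEAT)
        {
            var beat = (TIMELINE_BEAT_PROPERTIES)Marshal.PtrToStructure(
                parameterPtr, typeof(TIMELINE_BEAT_PROPERTIES));
            Debug.Log($"Beat {beat.beat} of bar {beat.bar}");
        }
        return FMOD.RESULT.OK;
    }
}
```

This fires only at discrete beats and markers, which is exactly why it falls short for a smoothly scrubbing playhead.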

We then moved on to relying entirely on the event reference's getTimelinePosition to get the current state of the playhead. We assign one of the tracks as the "master timeline reference" track and use its position to determine the timeline position, assuming, of course, that all the other events follow the same time. This also pointed us toward a possible fix for the "lag" issue on some tracks that we had noticed in earlier prototypes. With that, we can now move the playhead and play from any point, just like in any other editing software!
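The master-track idea can be sketched with the FMOD for Unity EventInstance API. This is an illustrative sketch, assuming a list of EventInstances with the first acting as the master reference:

```csharp
using System.Collections.Generic;
using FMOD.Studio;

public static class PlayheadSketch
{
    // Read the playhead (in milliseconds) from the designated master track.
    public static int GetPlayheadMs(EventInstance masterTrack)
    {
        masterTrack.getTimelinePosition(out int positionMs);
        return positionMs;
    }

    // Scrub: jump every track's timeline to the same position so the
    // separate events stay in sync.
    public static void Scrub(List<EventInstance> allTracks, int targetMs)
    {
        foreach (var track in allTracks)
            track.setTimelinePosition(targetMs);
    }
}
```

Forcing every event to the master's position on a scrub is also what makes this a plausible fix for tracks drifting behind, since any lagging event gets snapped back in sync.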

We then moved on to the next part, where we had to record and store the change in position of an orb (track) when the guest wants to "Automate Panning and Volume". We researched a few options using Unity's Animation component with a clip to store this data, but quickly realized we could handle it purely with Unity's AnimationCurve class. We now use three animation curves per track orb to record its local position, and in turn its panning and volume automation, for any time in the timeline. And we fell back on our old friend, the LineRenderer, as usual to visualize this automation data. Next week we will concentrate on making this visualization more solid and easier on the eye, but until then, check out the fun we were already having with it!
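The three-curves-per-orb approach can be sketched like this. The class and method names are hypothetical, but the core calls (AnimationCurve.AddKey and Evaluate, keyed by song time) are the technique described above:

```csharp
using UnityEngine;

public class OrbAutomationSketch : MonoBehaviour
{
    // One curve per axis of the orb's local position.
    readonly AnimationCurve x = new AnimationCurve();
    readonly AnimationCurve y = new AnimationCurve();
    readonly AnimationCurve z = new AnimationCurve();

    // Called while the guest drags the orb: key the current local
    // position against the current song time.
    public void Record(float songTime, Vector3 localPos)
    {
        x.AddKey(songTime, localPos.x);
        y.AddKey(songTime, localPos.y);
        z.AddKey(songTime, localPos.z);
    }

    // Called during playback: the orb's position (and hence its panning
    // and volume) for any point in the timeline.
    public Vector3 Evaluate(float songTime) =>
        new Vector3(x.Evaluate(songTime), y.Evaluate(songTime), z.Evaluate(songTime));
}
```

Sampling Evaluate at regular intervals also gives the vertex positions to feed a LineRenderer for the trail visualization.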