Kinectic Postmortem

Prompt: How can we innovate on full-body motion experiences in VR?


Prototype Goals

  • Can we make a rhythm game that encourages full-body movement in VR?
  • Can we make a rhythm game that uses continuous input?

Details

Platform: Oculus Rift

Players: 1

Controllers: Oculus Touch


Description

In this game, players use the corresponding hand to push ball-shaped cues that appear in front of them, tracing a line through spherical targets with their movements. The cues appear and disappear with the music. When players are in the flow of the game, it should look like they’re dancing.


Instructions

Push the colored balls along the path that appears, using the hand of the corresponding color. To get the highest score, touch each spherical target with the ball when it is completely filled with color.


Iteration Thoughts

Music

  • The initial music idea was a more standard “layered” soundtrack concept
    1. A base layer keeps rhythm
    2. Additional layers fade in as the player interacts with objects
  • After several iterations of gameplay, beatmap explorations, and playtests, it became apparent that this format did not help build flow for players. It was useful for the improvisational aspect, but a different, stronger structure was needed to give players a feeling of purpose.
  • These initial tests helped show what kind of physical actions (and thus flow) were most comfortable and engaging in VR.
  • Once we had solidified that pushing on a 2D plane was the core of the gameplay, we instead proposed a “beatmap/choreography first” approach to our resident dancer, Shana.
  • She demonstrated fluid, uninterrupted sweeps with her arms at a moderate tempo.
  • We therefore decided the soundtrack would be song-based, with the melody following and promoting player action.
  • The soundtrack contains a series of uninterrupted lines, with each note in a line corresponding to one of the spheres/waypoints the player will interact with.
    • Difficulty tuning based on the position and spacing of waypoints was done in person, but the soundtrack also had to be composed alongside a natural progression of difficulty. Because the format of the lines was easy to follow, we started with a simple phrase consisting of 2-4 notes; each subsequent phrase added a note (3-5, 4-6, all the way to 9).
    • Because the soundtrack was in 4/4, and each phrase was either 2/4 or a full 4/4, we had to decide how to craft lines longer than 4 notes. We used a meter change to fit a 9-note line into a 6/4 bar (see the sketch at the end of this list).
  • At first we worried about introducing a non-standard, changing meter into a rhythm game.
  • Due to the natural, predictable progression and flow that the musical lines and beatmap created, there were no reports of playtesters being caught off guard.
  • The underlying beat only accents the start of each phrase and gives a feel for the eighth-note pulse.
    • It is not the driving force of the player’s action; it merely provides the necessary sense of tempo.
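
As a rough illustration of the phrase progression described above, here is a small Python sketch. The names and structure are hypothetical (the actual soundtrack and beatmap were authored by hand); it only lays out how each phrase range shifts up by one note until the 9-note line, which sits in a 6/4 bar while the rest of the song stays in 4/4.

```python
# Illustrative sketch only: the real soundtrack and beatmap were authored by
# hand, and these names are hypothetical. Each musical line is a run of
# notes, and each note maps to one sphere/waypoint the player pushes through.
# The first phrases use 2-4 notes; each subsequent group shifts the range up
# by one note (3-5, 4-6, ...) until a line reaches 9 notes. Phrases occupy
# either half a bar (2/4) or a full 4/4 bar, except the 9-note line, which
# is handled with a meter change to 6/4.

def phrase_plan(max_notes=9):
    plan = []
    low, high = 2, 4
    while high <= max_notes:
        for notes in range(low, high + 1):
            if notes == 9:
                meter, beats = "6/4", 6   # the longest line needs a longer bar
            elif notes <= 3:
                meter, beats = "4/4", 2   # short lines fit in half a bar
            else:
                meter, beats = "4/4", 4   # everything else fills a 4/4 bar
            plan.append({"notes": notes, "meter": meter, "beats": beats})
        low, high = low + 1, high + 1
    return plan

if __name__ == "__main__":
    for phrase in phrase_plan():
        print(phrase)
```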

Design

  • While we originally visualized the player being surrounded by a grid of possible points for cues to spawn, we realized that this raised questions about how to effectively direct the player’s attention to targets outside their view. In the end, we decided this was more of a VR UI/UX design decision that wasn’t a priority for us to iterate on, so we made targets spawn on a single 3 x 6 grid in front of the player (because the grid is invisible, players should not have an overt awareness that cues are snapped to a grid).
  • One iteration of the game had players touching cues with their hands to bring them to a destination point, snapping cues to players’ hands upon contact. This resulted in players using a very different motion to complete notes than the more deliberate pushing action we wanted to see. Instead, we snapped notes to a plane in front of the player and let players push the note along that plane (see the first sketch after this list).
  • With the limited time we had to get this prototype playable, we had to iterate on what a beatmap looked like and how best to create one. While we had a VR beatmap maker, we didn’t have the resources to include features such as real-time editing or deleting of notes; we could only append notes and overwrite the whole beatmap at once. We allowed beatmaps to be mapped out one hand at a time to reduce the load on our beatmap designer (see the second sketch after this list).
  • Beatmap – while working on the beatmap maker, we started to visualize what this prototype could look like by creating animatics in Maya. At first we created one for when we thought the game would be 360 degrees, then another after we cut it down to the spheres right in front of the player. Using these animatics as inspiration, we then started showcasing possible movements to see how they might affect the music. From there we went into VR using the beatmap maker. When making beatmaps, we tried to build in repetition in certain parts and to spawn the next cue near where the player would finish their last note.
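
A minimal sketch of the two input decisions above, assuming simple vector math and made-up constants (the prototype itself was built in a game engine, so none of this is the actual implementation): cues spawn on an invisible 3 x 6 grid in front of the player, and the pushed note stays on a plane by following the hand’s projection onto it.

```python
# Minimal sketch, assuming simple vector math and made-up constants; this is
# not the prototype's actual engine code. Cues spawn on an invisible 3 x 6
# grid in front of the player, and the pushed note is kept on a plane by
# projecting the hand position onto that plane.

import numpy as np

GRID_ROWS, GRID_COLS = 3, 6
CELL_SIZE = 0.25        # assumed spacing between grid points, in meters
PLANE_DISTANCE = 0.6    # assumed distance of the play plane from the player

def grid_positions(player_pos, forward, right, up):
    """World positions for every cell of the invisible spawn grid."""
    center = player_pos + forward * PLANE_DISTANCE
    positions = []
    for row in range(GRID_ROWS):
        for col in range(GRID_COLS):
            offset = (right * (col - (GRID_COLS - 1) / 2) * CELL_SIZE
                      + up * (row - (GRID_ROWS - 1) / 2) * CELL_SIZE)
            positions.append(center + offset)
    return positions

def pushed_note_position(hand_pos, plane_point, plane_normal):
    """Keep the note on the play plane by projecting the hand onto it.
    plane_normal is assumed to be a unit vector."""
    to_hand = hand_pos - plane_point
    return hand_pos - np.dot(to_hand, plane_normal) * plane_normal

# Example with assumed basis vectors: player at the origin, looking down +Z.
cells = grid_positions(np.zeros(3), np.array([0., 0., 1.]),
                       np.array([1., 0., 0.]), np.array([0., 1., 0.]))
```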
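
And a rough sketch of the beatmap-maker workflow, with a hypothetical file format and function names (the real tool lived in VR, so this is only an assumption about how its data might look): notes are recorded one hand at a time, and saving always overwrites the whole beatmap because there was no in-place editing or deleting.

```python
# Rough sketch with a hypothetical file format; the real beatmap maker lived
# in VR, and this only illustrates the workflow described above: notes are
# recorded one hand at a time, and saving overwrites the whole beatmap since
# there was no real-time editing or deleting of individual notes.

import json

def record_pass(existing_notes, captured, hand):
    """Merge one hand's recorded pass of (time, row, col) notes into the map."""
    new_notes = [{"time": t, "row": r, "col": c, "hand": hand}
                 for (t, r, c) in captured]
    return sorted(existing_notes + new_notes, key=lambda n: n["time"])

def save_beatmap(path, notes):
    """Overwrite the entire beatmap file; individual notes cannot be edited."""
    with open(path, "w") as f:
        json.dump({"notes": notes}, f, indent=2)

# Example: record the left hand's pass first, then layer the right hand on top.
notes = record_pass([], [(0.0, 1, 2), (0.5, 1, 3)], hand="left")
notes = record_pass(notes, [(0.25, 2, 4)], hand="right")
save_beatmap("beatmap.json", notes)
```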

Lessons Learned

  • We built this prototype to explore continuous player actions, which have players act according to the music’s flow and melody rather than discrete beats or sound cues. Programmatically, however, we could only evaluate the accuracy of player actions at discrete points (see the sketch after this list). While we could have made players follow a very precise path with their motions, given the limitations of VR field of view and attention management we decided that a more simplified input would be easier to both implement and playtest. Our final input scheme, “pushing a note along a path”, is still a continuous motion, but it would be interesting to explore more complex paths that happen around the player instead of only in front.
  • A VR rhythm game that involves choreographed movements needs a VR beatmapping system. However, it’s difficult enough to create a fully functional beatmapping system for regular PC use, let alone in VR. The basic VR beatmap maker we came up with was sufficient, but it could have used more features, such as a timeline. What could we do with more time?
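
As a sketch of the discrete evaluation mentioned above (the timing windows and scores are assumptions, not the prototype’s actual values): the push is continuous, but the score only comes from how close to its beat each waypoint is reached, i.e. how “filled” the target was at the moment of contact.

```python
# Sketch of the discrete evaluation described above; the windows and scoring
# values are assumptions, not the prototype's actual numbers. The player's
# push is continuous, but accuracy is only judged at the waypoints: each
# target "fills" as its beat approaches, and the score depends on how close
# to that moment the ball actually reaches it.

def waypoint_score(hit_time, target_time, full_window=0.10, miss_window=0.35):
    """Return a 0-100 score for reaching one waypoint near its target beat."""
    error = abs(hit_time - target_time)
    if error <= full_window:      # touched while completely filled with color
        return 100
    if error >= miss_window:      # too far off the beat to count
        return 0
    # Linear falloff between the full-credit window and the miss window.
    return int(100 * (miss_window - error) / (miss_window - full_window))
```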

Future Iterations

  • Come up with a better VR beatmapping system. What would an ideal one look like?