Beatmaps are a staple of many rhythm games, especially those that involve using the same mechanics to play multiple tracks or songs. Several of our prototypes (Beat’Em, Gang Beasts, Chopin Beats, Laserfade, Kinectic) used beatmaps to varying degrees.
Here’s the approach we took to coming up with beatmapping systems for our various games.
First: what do your notes, cues, actions, etc. look like? What is the player’s primary beat-related action? How many varieties of actions do they have?
In Guitar Hero, these are the gems that fall down to the action bar. There are single notes, hold notes, and hammer-ons/pull-offs.
For the games in this project, most of our notes required the same type of input — a simple button press, or in VR, making contact with the note object by slicing it (Chopin Beats) or pushing it into a collider (Kinectic). In Kinectic, however, notes vary in the position they must be pushed into, which we had to take into consideration when deciding what functionality our NoteData and NoteController classes needed to have.
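To make the idea concrete, here is a minimal sketch of what a note record like this might look like. The field names, note types, and position strings are hypothetical illustrations, not our actual Unity C# classes:

```python
from dataclasses import dataclass
from enum import Enum

class NoteType(Enum):
    TAP = "tap"      # simple button press
    SLICE = "slice"  # VR: slice through the note object (Chopin Beats style)
    PUSH = "push"    # VR: push the note into a collider (Kinectic style)

@dataclass
class NoteData:
    beat_time: float     # when the note should be hit, in seconds from track start
    note_type: NoteType  # which player action this note requires
    position: str        # named position the note occupies, e.g. "R1C2"

note = NoteData(beat_time=12.5, note_type=NoteType.PUSH, position="R1C2")
print(note.note_type.value)
```

Keeping type and position as separate fields means the spawning code can branch on one without caring about the other.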
Next: how many positions do your notes occupy?
Simpler rhythm games like Guitar Hero have a small, fixed number of note positions, often limited by controller input. However, games like osu! have notes that can start and end anywhere on the screen, storing note positions as XY coordinates.
When we were playing around with the idea of Kinectic having a grid of note positions surrounding the player on all four sides, it was key to solidify the naming convention for these positions before going forward. While we ended up settling on a single grid in front of the player, we came up with a naming system that would let us adjust the number of rows and columns in the grid.
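A naming scheme like this can be generated directly from the grid dimensions, so rows and columns stay adjustable. The `RxCy` convention below is a hypothetical example rather than our exact scheme:

```python
def grid_position_names(rows: int, cols: int) -> list[str]:
    """Generate row/column position names for a rows x cols note grid."""
    return [f"R{r}C{c}" for r in range(rows) for c in range(cols)]

# A 2-row, 3-column grid in front of the player:
print(grid_position_names(2, 3))  # → ['R0C0', 'R0C1', 'R0C2', 'R1C0', 'R1C1', 'R1C2']
```

Because the names are derived rather than hand-written, resizing the grid never invalidates the convention — only the beatmaps that reference positions which no longer exist.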
Finally: how do your game designers want to compose beatmaps? What is the most efficient way for them to compose beatmaps?
This is something that most rhythm game players don’t have a chance to see or experience, but it is an integral part of building a rhythm game. Most commercial games don’t release their in-house beatmap making systems, but modders (and some indie game developers) often create their own solutions for custom beatmap-making that can serve as inspiration.
These tend to have a timeline, representing the track length, along which the beatmap designer can place and snap cues to appropriate beats (or fractions of beats).
We were on a tight timeline for this project, and the giant disparities in design between our rhythm games meant it was practically impossible to create a one-size-fits-all beatmap maker tool (though this would have saved us a lot of time!). We decided that implementing a GUI for a scrubbable timeline was not feasible with such a small team, where everyone wore multiple hats (both of our programmers are also designers).
Instead, we optimized where we could. When our games had a small variety of notes and positions (e.g. Laserfade, with 6 positions and only 2 possible note types, or Chopin Beats, with 2 positions and 3 possible note types), we used markers in FMOD on the music track itself for composing the beatmap. You can read more about this in our first programming post-mortem about syncing game events to music.
We programmed the cue-spawning systems for these games to read the marker name (a predefined string, such as L1 or R3) and spawn the appropriate object, perfectly synced with the music.
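The parsing step for a marker name is tiny. Here is a hedged sketch in Python (our actual code ran in Unity C#, driven by FMOD marker callbacks; the side/lane interpretation of `L1`/`R3` is an assumption for illustration):

```python
def parse_marker(name: str) -> tuple[str, int]:
    """Split a marker name like 'L1' or 'R3' into a side and a lane index."""
    sides = {"L": "left", "R": "right"}
    side = sides[name[0]]   # first character encodes the side
    lane = int(name[1:])    # remaining characters encode the lane number
    return side, lane

print(parse_marker("R3"))  # → ('right', 3)
```

Because the marker fires exactly when the playhead reaches it, the spawner needs no timing logic of its own — it just maps the decoded name to a prefab and instantiates it.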
Games that require more complex choreography that is highly dependent on the music, especially VR games, may be best beatmapped “live” by tapping out inputs as the music plays. In this case, it may be best to devise a system that lets designers create beatmaps to a slowed-down version of the music track.
One caveat is that designers can and will make mistakes when beatmapping live, no matter how slowed-down the track is. Using a live beatmapping system in conjunction with a timeline should allow designers to edit their inputs as they go.
In the case of our game, Kinectic, we struggled to come up with a static beatmapping system due to the gesture-intensive nature of the game’s inputs, and decided on making a VR beatmapping system instead. Our system was a matrix of collision boxes in front of the player that our beatmap designer activated by physically drawing lines between them using the Oculus Touch controllers.
Because we had no time to implement a timeline, we decided on using CSVs to store beatmap data in a designer-friendly way, so that our non-programmer designers could easily edit beatmaps. Even with a timeline-based beatmapping system, an editable file is handy for making minor adjustments without having to fire up your beatmapping application.
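A beatmap CSV can be as simple as one row per note. The column layout below is a hypothetical example (our actual schema differed), loaded here with Python's standard `csv` module for illustration:

```python
import csv
import io

# Hypothetical beatmap CSV: time in seconds, grid position name, note type.
beatmap_csv = """time,position,type
0.50,R0C1,push
1.00,R1C2,push
1.50,R0C0,push
"""

def load_beatmap(text: str) -> list[tuple[float, str, str]]:
    """Parse beatmap rows into (time, position, type) tuples, sorted by time."""
    rows = [(float(r["time"]), r["position"], r["type"])
            for r in csv.DictReader(io.StringIO(text))]
    return sorted(rows)  # sorting makes the spawner's job a simple walk-forward

print(load_beatmap(beatmap_csv)[0])  # → (0.5, 'R0C1', 'push')
```

Plain-text rows like these are easy for a non-programmer to tweak in a spreadsheet, and easy to diff in version control.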
A good practice in Unity is to convert file data like CSVs into ScriptableObject assets, which is a natural fit since beatmap data doesn’t change at runtime. Extending Unity’s editor classes can allow for a highly customizable display of beatmap data.
Keeping these questions in mind when coming up with data structures and systems saved us a lot of time and effort while prototyping our 8 games. If you’re trying to create a beatmap-based rhythm game too, hopefully this short article has given you some food for thought!