From JOURNEY to ERICA
A primer on interactive music
Probably the single most common question I get from students and journalists alike is “what are the differences between scoring a film and scoring a game?” Depending on my mood the answer is either very short or very long, but always seems to end up at this point:
Games are interactive and films are not.
Not a particularly novel conclusion but still, it’s really the most important difference to my mind. And one of the reasons is that games, as a media category, run a vast gamut of play-types and flavors. Never mind finding the right notes to compose; because of the huge variety in narrative styles and game mechanics, no two scores ever seem to call for identical implementation.
After my first true game, flOw (also thatgamecompany’s debut game, in 2006), I became rather obsessed with interactive (aka “adaptive”) scores in games. So here is a rundown of different games I’ve done and what defined their approach, as just a cursory look at the range of implementation methods possible in games.
JOURNEY
For those who’ve played this game, you know how minimalist its design is. All the storytelling is environmental, and the experience is largely dictated by the player’s exploration (and interactions with other players). Due to the lack of on-screen text or voiceover, music essentially reigns supreme as the narrative device.
Because of how central the music is to the experience, it was my goal to make it feel as meaningfully reactive as possible. While I wasn’t interested in literal “Mickey Mousing,” it was crucial that the player never feel that the score is ambient or ambivalent to their choices. It helped that the game is pretty linear overall, so I could make educated guesses about the player’s choices at every moment of the game. Open-world or non-linear games exponentially magnify this challenge (more on that later!).
The resulting score is a constant interplay of various systems.
First, every area of the game was mapped musically, to ensure that the narrative purpose was captured moment-by-moment. In the example below, the player approaches the open desert for the first time. The music is a very simple, mostly electronic / textural piece with a long bass flute solo. The flute solo will only play once, but its length is about that of the walking distance to the first landmark. If the player lingers, the gentle textural music will continue but the flute melody won’t. This simple gesture staves off the overt feeling of repetition.
Once the player reaches the landmark (the wreckage) and releases the little flying kite creatures, the energy level immediately jumps up a notch into a sort of playful dance. This new piece is also constructed to last approximately the distance to the next landmark (another piece of wreckage) BUT will loop and remix itself (mostly, again, through the successive removal of layers) so that if you decide to skip that next spot and just wander, it won’t become repetitive. In other words, it’s designed to last the length of traveling A to B to C, but if the player wanders and goes from A to C, the music can carry that experience as well.
On top of that, if you are online and successfully networked with another person, the system will dynamically adjust the mix to add more instruments and liven up the emotions. Thus, being alone is still playful but noticeably more quiet and solitary.
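The behaviors described above can be sketched as a simple layer-management system. This is purely illustrative logic, not thatgamecompany’s actual audio code; all layer and stem names are made up:

```python
# Illustrative sketch: a one-shot solo layered over a loopable texture,
# plus a loop that "remixes" itself by successively shedding layers,
# and a networked companion that livens up the mix.

class AdaptiveCue:
    def __init__(self, loop_layers, one_shot=None):
        self.loop_layers = list(loop_layers)  # e.g. ["pad", "pulse", "bells"]
        self.one_shot = one_shot              # e.g. "bass_flute_solo"
        self.solo_played = False
        self.loop_count = 0

    def start_one_shot(self):
        """The solo plays only once, sized to the walk to the first landmark."""
        if self.one_shot and not self.solo_played:
            self.solo_played = True
            return self.one_shot
        return None

    def layers_for_next_loop(self, networked_companion=False):
        """Return the layer names to play on the next loop pass."""
        active = list(self.loop_layers)
        # Successively remove layers on each repeat to stave off repetition,
        # always keeping at least the base layer.
        for _ in range(min(self.loop_count, len(active) - 1)):
            active.pop()
        # A networked companion adds instruments and livens up the emotions.
        if networked_companion:
            active.append("companion_stems")
        self.loop_count += 1
        return active
```

The key design choice is that variation comes from subtraction: the full mix plays first, and repeats shed layers rather than add them, so a wandering player hears the piece thin out instead of loop.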
That’s a very quick and dirty explanation, but it gives a rough glimpse of how every single scene was approached: 100% bespoke and custom, with no pervasive “systems” (unlike traditional games where there is “combat” music versus “exploration” music, etc).
THE BANNER SAGA
In stark contrast to Journey, the trilogy of Banner Saga games involved the development of a few key systems which were then consistently repeated. For those who’ve played, you know that the games break down into 3 primary types of gameplay: Travel (which is done in a sort of Oregon Trail-esque way), Combat (which is turn-based in the vein of games like Final Fantasy Tactics or even chess), and dialogue/camp.
The last of those, dialogue/camp, is a very simple point-and-click interface where you can read through long (and great) conversations or story lore. We made the decision early on to have zero music in those moments, with the occasional exception of distant echoes of background singing (as in, literally, people in the camp are off somewhere singing to themselves).
The Travel mode is relatively simple, and the cues typically either play once or loop as the caravan traverses and the player contends with the pop-up decisions:
Because of the text-based nature of the storytelling, it was decided early on not to make the system too reactive to the player. If everything you do causes musical feedback, the result can feel rushed, whereas the goal was for it to feel methodical.
The final system, combat, was the most complex. We experimented early on with making it highly reactive (which was technically easy, because the gameplay itself is driven by simple math), but found that it was very distracting. Much like travel mode, if things were too reactive it not only broke the intended vibe, but also telegraphed information too quickly. The combat centers around two sides squaring off in a very controlled, turn-based way. We formulated an algorithm we called “vitality,” which tracked your team’s damage potential relative to the enemy’s, so that the more dire the battle got, the more dramatic the music got. We could even track (especially for multiplayer purposes) which team is winning and tilt their music towards a more heroic feeling, while the other team’s starts to sound more desperate.
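To make the idea concrete, here is a minimal sketch of a vitality-style metric. The formula, field names, and tier thresholds are illustrative assumptions, not the shipped algorithm:

```python
# Sketch of a "vitality" metric: compare the player's remaining damage
# potential to the enemy's, and map the ratio to a music intensity tier.
# All names and thresholds here are hypothetical.

def damage_potential(units):
    """Sum the damage output of each unit that can still fight."""
    return sum(u["strength"] for u in units if u["strength"] > 0)

def vitality(player_units, enemy_units):
    """Ratio > 1 means the player is winning; < 1 means the battle is dire."""
    enemy = damage_potential(enemy_units) or 1  # avoid division by zero
    return damage_potential(player_units) / enemy

def music_tier(v):
    """More dire battle -> more dramatic music; a winning team tilts heroic."""
    if v >= 1.5:
        return "heroic"
    if v >= 0.75:
        return "tense"
    return "desperate"
```

In multiplayer, the same ratio inverted gives the opposing team’s tier, which is how one side can sound heroic while the other sounds desperate.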
The epiphany was in realizing that this system should be asynchronous to the player’s actual choices. Each battle therefore consists of a succession of musical segments, and when a player’s choice triggers an escalation of the drama, the current segment plays to its end before switching. Think of it like a train track where the switch up ahead changes in anticipation of your arrival. This allowed the music to flow far more steadily with the gameplay.
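That train-track behavior can be sketched as a tiny state holder: gameplay choices update a pending destination, but the actual switch only happens at a segment boundary. The class and segment names are hypothetical:

```python
# Sketch of asynchronous segment switching: the player's choices set a
# pending segment, but the current one always plays to its end first.

class SegmentPlayer:
    def __init__(self, start="calm"):
        self.current = start
        self.pending = start

    def request(self, segment):
        """Gameplay escalates: change the switch ahead, don't cut the music."""
        self.pending = segment

    def on_segment_end(self):
        """Called by the audio engine when the current segment finishes."""
        self.current = self.pending
        return self.current
```

Multiple requests within one segment simply overwrite the pending destination, so the music reflects the latest state of the battle without ever cutting mid-phrase.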
Side note, I’ve given two GDC talks about The Banner Saga which have been uploaded to Youtube:
ASSASSIN’S CREED SYNDICATE
The musical systems in Syndicate are so varied that it’s impossible to try and cover them here. However, fortunately, GDC made my detailed post-mortem talk about it (in collaboration with the wondrous music supervisor Christian Pacaud) available so I’ve posted it below.
I want to focus for the moment on combat because it was one of the most intricate systems in the game. For those who haven’t played an AC game, combat is rather constant and it’s also often player-driven; by that I mean, it’s an open world game, so the nature of combat is determined by player choices such as how long they wish to fight, and against what sorts of enemies. Because of this, there is enormous variability in how it can play out, which posed a huge challenge musically.
First off, we developed a system for analyzing how much danger the player is realistically in, so that any fight deemed insufficiently threatening wouldn’t trigger music at all. The nice thing about this approach was that as you got better weapons or leveled up your character, combat music scaled alongside you: weaker enemies stop triggering music, etc. The main reasons for this were to ensure that it always felt emotionally meaningful (there are no stakes if it’s heard a thousand times), and also because easy fights typically finish so fast that the music could barely start an intro before it needed to end.
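As a rough sketch of that gating idea (the threat formula and threshold here are illustrative assumptions, not the shipped system):

```python
# Sketch of combat-music gating: estimate threat relative to the player's
# power, and only trigger music above a threshold. As the player levels
# up, weak enemies fall below the line and stop triggering music at all.

def threat_level(enemy_levels, player_level):
    """Crude estimate: total enemy level relative to the player's."""
    return sum(enemy_levels) / max(player_level, 1)

def should_trigger_combat_music(enemy_levels, player_level, threshold=1.0):
    return threat_level(enemy_levels, player_level) >= threshold
```

A pair of level-1 thugs against a level-10 player falls well under the threshold, while a gang near the player’s own level clears it.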
Second, we broke the hour+ of combat music into 3 large chunks that were triggered based on your narrative progression. This choice stemmed from my own personal grudge against open world games where my participation in the story always felt irrelevant to the music’s progress. Fighting an enemy in X location was the same regardless of whether I was a fledgling beginner or a badass on the cusp of the finale. So in order to address this problem, we created banks of music for each 1/3 of the story, as determined by your progress through the core missions. The result is that if you wander off and do side missions for an extensive period of time, the music won’t evolve without you. The evolution itself consisted of the orchestra getting gradually larger and the music growing darker and more complex.
Within those banks, the music followed a series of behaviors based around the progression of the fight itself. You can picture it almost like a highway with periodic off-ramps. The system would analyze the fight and, if the player appeared to be safe, a “cool down” would begin. This allowed the fight to re-escalate in case more enemies were triggered (a very common occurrence in AC!), or to properly end if you were indeed safe. For each of these segments, there was a variety of alternate versions to stave off repetitiousness.
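The highway-with-off-ramps flow can be sketched as a small state machine; the state names, segment names, and transition rules are assumptions for illustration only:

```python
# Sketch of the combat-music flow: escalate along the main path, take a
# "cool down" off-ramp when the player looks safe, and from there either
# re-escalate (more enemies!) or resolve to an ending.

import random

SEGMENTS = {
    "intro":    ["intro_a", "intro_b"],
    "combat":   ["combat_a", "combat_b", "combat_c"],
    "cooldown": ["cooldown_a", "cooldown_b"],
    "outro":    ["outro_a"],
}

def next_state(state, player_safe):
    if state in ("intro", "combat"):
        return "cooldown" if player_safe else "combat"
    if state == "cooldown":
        # Off-ramp: new enemies re-escalate, genuine safety ends the cue.
        return "outro" if player_safe else "combat"
    return "outro"

def pick_variation(state):
    """Alternate versions of each segment stave off repetitiousness."""
    return random.choice(SEGMENTS[state])
```

Each tick of the fight analysis advances the state, and a variation is drawn from the current state’s pool, so even a long brawl keeps cycling fresh material.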
On top of it all, the player can choose between the twin protagonists, Jacob and Evie Frye. Playing as Jacob, violin solos dominate the melodic lines of the music; as Evie, cello solos do. Each piece was written and recorded to feature those solos, but such that you would never hear both at once (except on the soundtrack album).
Here below is a glimpse of gameplay Kotaku uploaded that perfectly captures a snippet of this system (note that the player is Jacob, hence the prevalence of violin solos, and that it takes place towards the end of the game, hence the final bank of combat music):
SUNSET
Very likely, no one reading this has heard of this game. It sadly went largely unnoticed but it holds a special place in my heart. The game was made by Tale of Tales, a duo who have for many years challenged the developer community by always making risky, thought-provoking games. Even without wide commercial success, their games have exerted a lot of influence and Sunset was my joyful opportunity to finally collaborate with them.
I mention it here because it’s the only score I’ve done that is, in effect, 100% optional for the player. In Sunset you play a housekeeper in a fictional South American country amidst an early 1970s political revolution, and throughout the apartment are vinyl records that can be played. I had to write a ton of music which all felt like it was by different artists with the full awareness that a given player might hear none of it.
It was actually a fascinating and wonderful challenge and because of its uniquely interactive approach, I simply had to mention it here.
ERICA
And now I conclude with what is by far the most technically challenging score I’ve ever attempted: Flavourworks’ debut game, Erica. I began this article by referencing that oft-asked question about the difference between film and game scoring; well, it turns out that a game which exists at that intersection somehow multiplies their respective difficulties.
Erica is a Full-Motion Video (FMV) game and as such utilizes traditional sets, actors, cinematography, props, etc. In other words, it looks exactly like a proper film, but is very much an interactive game. And not in a superficial way: it has a variety of gameplay mechanics and a deeply branching narrative threaded with a huge range of variables for the system to track and pay off for the player.
And ALL of it had to somehow be dealt with in the score! Every single scene was scored individually (similarly to Journey), but with the knowledge that at every second the player has options which dramatically alter the emotions or psychology.
Think of how dialogue is often handled in games. A good example might be a Bioware RPG like Mass Effect. The player is presented with a dialogue tree and is free to navigate it in their own time. A system like this typically has either no music, or a bed of unintrusive ambience while the scene plays out. Even a game like Telltale’s The Walking Dead, which gives you a time limit on your major decisions, does so with a relatively ambient musical accompaniment.
Now compare that to a film. How often have you seen a suspense / thriller where the music underscores every single line, or even a glance of a character’s eyes? Watch a scene like this one from Paul Verhoeven’s seminal Basic Instinct to see just how much the music can capture moment-by-moment (including purposefully falling silent in the middle):
Erica aspired to something far more like Basic Instinct than Mass Effect. Watching real human beings play out these scenes made it FEEL very much like a movie, and mere beds of ambience would’ve really undercut the dramatic potential of the storytelling.
Because of this, every scene became a web of complex interwoven cues based on every little choice of the player, while themes played out the story arc on the macro-level. Even though the score is barely over an hour of music, its in-game implementation consists of something like 300 total cues, tightly scripted to fit each scene. Rather than a wall of text describing it all, I have here a video showcasing all the endless minor details stitched together to form just a single scene:
Hopefully, as a very basic primer on interactive music in games, this stirred the imagination a bit. One of the primary reasons I love composing in this medium is that I firmly believe our collective best work still lies ahead. Games are still in their infancy, both expressively and technologically, and as more is discovered everything I wrote above will appear more and more rudimentary.
It’s a true honor to be even a small part of the advancement of that frontier.