The Language of Music Storytelling in Games
This material doesn’t require you to have a music composition degree; rather, it offers a top-down overview of the fundamental processes and terminology that drive the creation of a game score. You’ll learn about game music types, music functions within a game, and the building blocks for critical analysis of interactive scores.
What Makes a Video Game Unique?
To build a language to use when describing video game music, it’s important to understand some of the unique characteristics of the medium of video games. This chapter begins by breaking down the anatomy of a video game, comparing and contrasting its elements with those of linear media. These differences affect how the music is conceptualized, composed, and synchronized to work within a game.
There are several key differences between scoring for linear media like film and scoring for video games. A direct comparison between films and games, summarized in Table 1.1, highlights some of these key differences.
Table 1.1 Key Differences between Films and Games
Attribute | Film | Console Game
Type of experience | Passive watching | Active interaction
Average length | 2 hours | 10+ hours
Number of plays | Usually once | Many times
Structure | Linear: one beginning, one middle, one end | Nonlinear: multiple outcomes and evolving storylines
Average amount of music | 1 hour | 2–3 hours
Passive versus Active Interaction
Video games require the player to be actively involved, making decisions based on the action occurring on screen. This active interaction is the most important element that distinguishes the medium. Players are actively involved in determining the outcome of a game, whereas in linear media like film there is no interaction; instead, viewers watch passively.
This interaction between player and story in video games creates a reactive feedback loop, with each one affecting the other. The level of interaction is determined by the rules and mechanics of the game and is usually controlled by the player through a game controller or a keyboard/mouse combination.
This active interaction between the game and the player also affects how the music must change and react to player decisions. The music must be written in such a way that it is adaptable based on the player interaction. Throughout this book you’ll learn about different ways to compose adaptive and interactive music compositions for video games.
Variable Length of Experience
Length of the gameplay experience is one of the most important aspects in determining the amount of music that must be conceived and written for a game. Video games vary greatly in the length of experience compared to film. Furthermore, each game genre has a length that is most suitable for the style of play, whether it’s puzzle solving in a game like Myst (1993) or defeating an alien invasion in a game like Halo (2001).
Casual games (Tetris, 1984; Bejeweled, 2001; Diner Dash, 2004) that are played from beginning to end might be only 2 to 3 hours in length, whereas a massively multiplayer online role-playing game (MMORPG) like World of Warcraft (2004) might have a play experience totaling more than 50 hours. Typically, AAA (pronounced “triple-A”) console titles for Xbox or PlayStation have a play experience that lasts 10 or more hours.
Table 1.2 summarizes the typical length of play and the corresponding amount of music for different game types.
Table 1.2 Length of Music in Games
Game Type | Play Experience | Average Amount of Music
Casual game | 2–3 hours | 15+ minutes
Console game | 10+ hours | 2–3 hours
MMORPG | 50+ hours | 15+ hours
The time it takes to play a game depends on many different factors, including length of the story, game variability, and the experience of the player. These additional factors are discussed throughout the chapter. In some very large games, players sometimes play for as much as 20 to 30 hours per week!
Many games today also have expansion packs that allow the game to grow by extending the player experience with new storylines and additional content. These expansion packs may also increase the amount of music in a game. Popular games with expansion packs include Angry Birds (2009) and BioShock Infinite (2013).
Number of Plays
The play experience in games is significantly longer than the experience with most linear media. Consequently, players often don’t finish games in one session. Instead, it typically takes many sessions for a game player to finish a game.
This has direct implications for the music. How should the music handle the interruptions caused by players stopping and starting? Is there a way to bring the player back into the story more seamlessly, reminding the player where he or she left off?
A composer can use several different approaches to enhance the storytelling in the game between interruptions. For example, composers often use thematic material to tie the story together by representing characters or places in their music. The “Music Conceptualization” section of this chapter discusses this in more detail.
Game Mechanics
In addition to a storyline, video games have specific game mechanics that make them different from film. These mechanics or rules define the play experience and dictate how the player interacts with the game system. For instance, in the early arcade game Space Invaders (1978), the basic gameplay mechanic is to shoot the impending alien march while avoiding getting hit by the enemy’s lasers or having the aliens reach your home world. Put even more simply, the mechanic is about winning or losing a specific game level. The player’s skill level determines whether the game continues or ends. Other examples of game mechanics include solving puzzles, taking turns, racing against a clock, beat-matching, and many more.
Game mechanics are a system of rewards and challenges that a player faces when entering the game. Game music systems need to be aware of game mechanics and, in turn, enhance the play experience by supporting these mechanics.
Pacing, Synchronization, and Flow
Video game players typically drive the storyline at their own pace. Players can move quickly or more slowly through a level, depending on their skill level. Since a composer cannot write a customized score for every individual player, he or she may instead write an adaptive score that takes the player’s skill level and pacing into account. This way the composer supports the same emotional pacing for each player. For example, in an open-world game like World of Warcraft (2004), the player at any given moment may decide to go to places within the world like Elwynn Forest or Ironforge. These decisions affect which music will play and determine the transitions that happen to get us from one piece of music to the next.
Unlike in linear media, where a composer can synchronize the music to a specific frame number, the game storyline is driven by the player. Synchronization in music is achieved by following changes in emotional context. These changes then direct how the music might play, in the same way that a conductor cues the woodwinds in a symphony.
The interactive music system in a game can take into account many different factors besides location, including the player’s health, proximity to enemies, various artificial intelligence (AI) state(s), the length of time the music has been playing, and so on. These variables can help change and adapt the music so it is synchronized to the events that unfold for the player.
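As a rough illustration, the following sketch (in C++, with invented variable and cue names; it is not drawn from any particular game engine or title) shows how such a system might map a handful of game-state variables to the music cue that should currently be playing.

// A minimal sketch of an interactive music system's cue-selection logic.
// All structures and names are illustrative assumptions, not a real engine API.
#include <iostream>
#include <string>

// Hypothetical snapshot of the variables the music system watches.
struct GameState {
    float playerHealth;      // 0.0 (near death) to 1.0 (full health)
    float enemyProximity;    // distance in meters to the nearest enemy
    bool  enemyAlerted;      // simplified AI state
    float secondsInCue;      // how long the current cue has been playing
};

// Choose which cue should be playing for the current state.
std::string SelectMusicCue(const GameState& state) {
    if (state.playerHealth < 0.25f)
        return "music_low_health";          // tense, sparse layer
    if (state.enemyAlerted && state.enemyProximity < 20.0f)
        return "music_combat";              // full combat arrangement
    if (state.secondsInCue > 120.0f)
        return "music_explore_variation";   // rotate material to avoid fatigue
    return "music_explore";                 // default exploration cue
}

int main() {
    GameState state{0.8f, 15.0f, true, 45.0f};
    std::cout << "Now playing: " << SelectMusicCue(state) << "\n";
}

In practice the selection logic lives inside the game's audio engine or middleware and also handles how one cue transitions into the next, but the underlying idea is the same: gameplay variables in, musical decisions out.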
Multiple Story Paths and Repeatability
When you watch a film, the experience is static—it doesn’t change from viewing to viewing. In games, however, the narrative and dramatic arcs are based on real-time choices made by the player. This may mean that there are multiple story outcomes.
Because of this possibility, the music must follow the player’s decisions throughout the game to support the emotional context for the scene or level at any given time. The music must change dynamically based on these decisions and conditions, which requires composers, music editors, and game designers to think differently when approaching the composition of the score. For instance, in the game Mass Effect (2007), the player makes decisions about which characters to support throughout the story. Characters that aren’t supported may actually die during the game. Since these characters have musical themes attached to them, we need to be aware of how these themes are shaped and evolve over time based on the player’s decisions.
When games have multiple outcomes, they can be played through multiple times. This increases the chance that a player will hear the same music many times. Many composers use the interactive music techniques outlined in this book to minimize the repetition. For example, one technique is to play back the segments of a music cue in a different order. A composer might also write multiple introductions to the same piece of music so the player will hear it begin differently each time it plays. More of these techniques are reviewed in later chapters of this book.
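The short sketch below (with hypothetical cue names and no audio engine attached) illustrates these two techniques in miniature: choosing one of several recorded introductions at random, then shuffling the order of interchangeable middle segments before playback.

// A minimal sketch of two repetition-reducing techniques: random intro
// selection and segment reordering. Cue names are invented for illustration.
#include <algorithm>
#include <iostream>
#include <random>
#include <string>
#include <vector>

int main() {
    std::random_device rd;
    std::mt19937 rng(rd());

    // Several introductions written for the same piece of music.
    std::vector<std::string> intros = {"town_intro_a", "town_intro_b", "town_intro_c"};
    // Middle segments composed so they work in any order.
    std::vector<std::string> segments = {"town_loop_1", "town_loop_2", "town_loop_3"};

    // Pick a random intro, then shuffle the segment order for this playthrough.
    std::uniform_int_distribution<size_t> pick(0, intros.size() - 1);
    std::string intro = intros[pick(rng)];
    std::shuffle(segments.begin(), segments.end(), rng);

    std::cout << "Playlist: " << intro;
    for (const auto& seg : segments) std::cout << " -> " << seg;
    std::cout << "\n";
}

For this to work musically, the composer must write the segments so that any of them can follow any other, which is itself a compositional constraint discussed later in the book.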
Technology
Composing for video games is heavily reliant on the underlying technology used to play back music within the game. Hence interactive music systems are tied to advances in this technology. Composers who are interested in creating music for games need to be fearless when it comes to learning new technology, because they are often asked to learn a new music format while they are writing in it.
Mobile and web games typically have tighter memory and voice-count constraints than console games, making composing for these platforms very challenging. Conversely, a game like Batman: Arkham City (2011) uses the audio middleware engine Wwise by Audiokinetic, a very advanced interactive audio and music engine. Even so, when composing for this system, the audio team needs to understand its strengths and limitations to use the system effectively.
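To give a sense of what driving such a middleware system from gameplay code can look like, the hedged sketch below posts a Wwise Event and feeds it a game parameter (RTPC). It assumes the sound engine is already initialized and that the named Event and RTPC exist in the project's loaded SoundBanks; the names are placeholders, not anything taken from Batman: Arkham City or another shipping game.

// A hedged sketch of gameplay code talking to Wwise. Engine initialization
// and SoundBank loading are omitted; "Play_Level_Music" and "Enemy_Proximity"
// are placeholder names assumed to exist in a hypothetical Wwise project.
#include <AK/SoundEngine/Common/AkSoundEngine.h>

void StartLevelMusic(AkGameObjectID listenerObject, float enemyDistanceMeters) {
    // Post the Event that starts the interactive music for this level.
    AK::SoundEngine::PostEvent("Play_Level_Music", listenerObject);

    // Feed a gameplay variable to Wwise; the audio team can map this RTPC to
    // volume, layering, or transition decisions inside the music hierarchy.
    AK::SoundEngine::SetRTPCValue("Enemy_Proximity", enemyDistanceMeters, listenerObject);
}

The important point is the division of labor: gameplay code only reports what is happening, while the adaptive musical response is authored by the audio team inside the middleware.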
Recent technology advances such as cloud storage and remote rendering for games are rapidly changing how games are delivered to consumers. In the future, therefore, game developers may have fewer limitations in terms of technology.
Although it can be a huge benefit for composers to understand the technology and score design that will ultimately integrate their music into the game, it isn't essential knowledge. On large games, an entire team of people may work on creating the music. In these circumstances, getting the right creative fit may be more important than having a composer who knows the technical and adaptive techniques that will be implemented in the final game. Such a team would include interactive music specialists who take the raw materials from the composer and create the adaptive music components. In this scenario, composers may never have to deal with formats beyond handing off their Pro Tools sessions (or similar digital audio workstation [DAW] files).