Types of Music within Games
Composers can use music in many different ways within games to achieve the desired emotional effect. In this section we begin by defining the categories of music used within video games. In the next section you’ll learn about the function of music in games.
Sometimes as composers we're adding music to support the game on an emotional level (extra-diegetic). At other times we're adding music that the player's avatar might actually hear as part of the game universe (diegetic). It's useful to define these different types of music in terms of function.
Extra-Diegetic Music (Underscore)
Extra-diegetic music, or non-diegetic music, refers to music added to a scene purely to enhance the emotions the player should be feeling. This is commonly known as underscore. The musical ensemble or device that plays this music is never established for the player; its existence is not even implied in the game world. Underscore works on a subconscious level, using themes and motifs to tie story elements together and to intensify the emotional context of a scene. It may also inform the player or viewer of something happening off-screen. Extra-diegetic music also helps set the stage by implying a specific time in history or a place within the world. With extra-diegetic music, the viewer does not expect to see the instruments playing on screen.
It is commonly said that the best film scores are not noticed by the audience or viewer. More obvious (and clumsier) scores take the viewer out of the experience of watching a movie by bringing what should be an unconscious element to the forefront for the listener.
A classic film example is John Williams's two-note motif from Jaws (1975). Whenever the shark poses an impending danger, the audience hears this motif. Later in the film, even when the shark is not on screen, the motif builds tension because viewers expect to see the shark soon. Williams, a master film composer, uses the motif to teach the audience that this music equals an impending shark attack. Later he breaks the mechanic by withholding the two-note motif before the shark's entrance; when the shark appears unannounced, it's one of the most terrifying moments in the film. Williams tricks the audience into forming expectations based solely on the music, increasing the horror of the film.
In almost any modern video game, we hear extra-diegetic music enhancing the emotional underpinning of the story. In Red Dead Redemption (2010), for example, we hear an Ennio Morricone–inspired score as we follow the adventures of a former outlaw on the American frontier. The interactive score changes dynamically as the player moves from scene to scene and from plot point to plot point. In BioShock (2007), Garry Schyman scores an underwater city engulfed in chaos, using aleatoric techniques along with solo violin passages to create a terrifying but beautiful collage of themes.
Diegetic Music (Source Music)
Diegetic music is music that a character would hear if he or she were actually in the game world itself. In films, we usually refer to this as "source music." If we see someone on screen playing a violin, we expect to hear the violin. The function of diegetic music is to enhance the player's experience; typically it's used to increase the realism of the simulated world.
In BioShock Infinite (2013), there are moments when we see various musical ensembles, including a barbershop quartet. When we see the barbershop quartet on screen and hear them singing, it's an example of diegetic music.
Another example is from Mass Effect 2 (2010). When the player is standing outside of a nightclub, he or she hears the music coming from inside. This is music that the player would be hearing as part of the world itself.
Games like Grand Theft Auto V (2013) and L.A. Noire (2011) are 3D simulations of another world. Both games revolve around a driving mechanic where players drive different vehicles through this simulated world. These vehicles have radios, and players can change the radio station, so the music changes as they’re driving through this world. This is another example of diegetic music.
One last example of diegetic music within games occurs at the beginning of Assassin’s Creed III (2012), when the player is sneaking around a theater while an opera is being performed. We see the actors and musicians, and the music is coming from the universe itself.
In all of the previous examples, the placement of the music within the speakers is also very important to simulate where the music is coming from. We use real-time panning, equalization, and reverb techniques to simulate the position of the source within the 3D space. As the player moves around this 3D space, the instruments or devices must pan dynamically to reinforce the sense that this is a real place.
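The core of that positioning idea can be sketched in a few lines: derive a stereo pan and a gain from where the source sits relative to the listener. This is a simplified 2D model for illustration only, not the API of any real engine; audio middleware layers distance curves, filtering, and reverb sends on top of the same basic geometry.

```python
import math

def position_source(listener_pos, listener_facing, source_pos, max_distance=50.0):
    """Place a diegetic source in the stereo field from its position
    relative to the listener. Returns (pan, gain): pan spans the stereo
    field from -1 to +1, gain runs from 0 (silent) to 1 (full volume).
    Simplified sketch; max_distance and the linear rolloff are assumptions.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = math.hypot(dx, dy)
    # Attenuate linearly out to max_distance (real engines use tunable curves).
    gain = max(0.0, 1.0 - distance / max_distance)
    # Angle of the source relative to the direction the listener is facing.
    angle = math.atan2(dy, dx) - listener_facing
    # Sources directly ahead sit at pan 0; sources to the side move outward.
    pan = math.sin(angle)
    return pan, gain
```

Called every frame with the player's current position, this keeps a jukebox or street band anchored in place as the player walks past it.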
In many instances, diegetic music is licensed music. Licensed music has been created previously by an artist, and the game publisher must obtain the rights to use this music within a video game. On large games, a music supervisor may coordinate the licensing process by obtaining (and paying for) the relevant rights to the piece of music.
While playing a game, if we see a jukebox or some other music-making device or musician on screen, then we’ll want to hear the music that it creates. The realism of that world or simulation would be broken if we didn’t hear the sound. As composers, it’s critical to be wary of destroying the illusion of a world for the player.
Occasionally the distinction between diegetic music and extra-diegetic music becomes blurred. For example, we might start off listening to diegetic music, but as the story progresses the same theme may return as underscore, binding the emotional elements of the storyline together.
Music as Gameplay
The third classification of music in games is music as gameplay—that is, when the player generates the music in real time as he or she plays the game.
One might consider all interactive music in games to be "music as gameplay," but the difference here is that the music system reacts directly to the actions of the player. In most interactive music systems, the player's actions affect the music only indirectly. For instance, when a player makes choices in a game that change the state of the character (e.g., explore or combat), the music changes based on those indirect choices. This is not an example of music as gameplay.
Games that use the music-as-gameplay paradigm typically operate at a finer level of detail than state changes in the music. If an action by the player triggers a sound in rhythm or creates a sequence of notes, that falls into this classification.
Game developers work with composers to devise an overall music system that complements the gameplay, defining the rules of how the music will play on a note-to-note or phrase-to-phrase level.
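One common note-level rule is quantization: rather than sounding the instant the player acts, a triggered note is delayed to the next point on a rhythmic grid so it always lands in time with the music. The helper below is a hypothetical sketch of such a rule, not code from any particular engine.

```python
import math

def quantize_to_grid(action_time, bpm=120.0, subdivisions_per_beat=4):
    """Snap a player action (in seconds from the start of the track) to the
    next rhythmic grid point. subdivisions_per_beat=4 gives a sixteenth-note
    grid at the given tempo. Tempo and grid density are illustrative values."""
    beat_length = 60.0 / bpm                    # one beat in seconds
    grid = beat_length / subdivisions_per_beat  # spacing between grid points
    return math.ceil(action_time / grid) * grid # schedule on the next slot
```

For example, a jump at 0.3 seconds into a 120 BPM track would sound at 0.375 seconds, the next sixteenth-note slot, so every triggered note reinforces the groove instead of smearing it.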
There are several different gameplay scenarios in this classification, including simulated performance, player-generated music, and rhythm action.
Games like Rock Band (2007) and Guitar Hero (2005) use musical controllers to simulate the effect of being in an actual band performing the music. These games typically use licensed music from popular bands and artists. The music plays back according to how well the player performs; the better the performance, the closer the result is to the original licensed song.
In some games, the player creates music dynamically while playing the game. In PaRappa the Rapper (1996), for example, the player is able to direct the lead character to rap. In what is essentially a rhythm action game, the player presses the controller buttons in rhythm, in a specified order, to make the lead character rap in real time.
Another example of player-generated music is found in the first-person shooter/rhythm action game Rez (2001) from game designer Tetsuya Mizuguchi. In this game, the player creates music dynamically by shooting down geometric shapes in rhythm with the music, triggering musical notes and patterns.
Last, the game Bit.Trip Runner (2010) is a 2D platformer where the avatar runs from left to right at a constant pace. As the player jumps or collects coins in the world, a musical phrase or set of notes begins to play.
Just as we sometimes blur the lines between diegetic and extra-diegetic music, you could make a case that games like Portal 2 (2011), which let you manipulate noise-making physical objects in a 3D world, feature player-generated music.
The last category of music as gameplay is rhythm action games. Dance Dance Revolution (1998), Amplitude (2003), and Space Channel 5 (1999) are all examples of rhythm action games. In these games, players listen to the rhythm of the music and then synchronize their actions by either dancing or hitting buttons on a controller in time with the music to gain rewards.
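At its core, the reward logic in a rhythm action game is a matter of measuring how far each input lands from the beat it targets. A minimal sketch follows; the timing windows are assumptions for illustration, not values from any shipped game.

```python
def judge_hit(input_time, target_time, perfect_window=0.05, good_window=0.12):
    """Grade a button press (in seconds) against the beat time it targets.
    Window widths are illustrative; real games tune them per difficulty."""
    error = abs(input_time - target_time)
    if error <= perfect_window:
        return "perfect"
    if error <= good_window:
        return "good"
    return "miss"
```

Stringing these judgments together into combo multipliers and score bonuses is what turns listening accuracy into gameplay reward.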
Player Customized Music
Players may also choose to import their own music into a game. The original Xbox 360 requirements mandated that every game be made in such a way that the player could turn off the musical underscore and replace it with user-chosen content. If the player wanted to hear John Williams's Superman (1978) theme while playing Halo, for example, the system would allow for that.
To enable users to bring their own music into a game, these systems allowed players to encode music from a CD or flash drive directly to the memory of the console. Then, while playing a game, users could choose to hear their personalized music within the game.
Entire games have been built around customized music, including Audiosurf (2008) and Vib Ribbon (1999). These rhythm action games create dynamic game levels built around the music that the player chooses.
Player-customized music is also popular in racing games. The 2012 release of SSX, a snowboarding game, allows players to import any music that they want into the game. The game then applies various digital signal processing (DSP) effects, including filters, beat-matched delays, and reverbs, remixing the custom music on the fly during gameplay to augment and enhance it.
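A beat-matched delay of the kind mentioned above simply derives its delay time from the track's tempo, so that echoes land on musical subdivisions instead of smearing across the beat. A minimal sketch, assuming the BPM of the imported track is already known from beat analysis:

```python
def delay_time_seconds(bpm, beats=0.75):
    """Length of a tempo-synced delay line. beats=0.75 yields a
    dotted-eighth echo; beats=0.5 an eighth-note echo. The bpm value
    is assumed to come from beat detection on the imported track."""
    return 60.0 / bpm * beats
```

Feeding this value into a standard delay effect is enough to make echoes on a player's own song feel like part of its arrangement rather than an arbitrary smear.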