To build a vocabulary for describing video game music, it’s important to understand some of the unique characteristics of games as a medium. There are many important differences between linear experiences like film and nonlinear experiences like video games, and these differences affect how music is conceptualized, composed, and synchronized. Some of the most prominent differences include the following:
- Type of experience
- Length of experience
- Number of plays
- Game mechanics
- Pacing, synchronization, and flow
- Multiple story paths and repeatability
Video games use music in different ways: purely emotionally, to deepen the player’s connection to the experience, or as part of the game world itself, where the player’s character can hear it. The important classifications of music within games fall into these four categories:
- Diegetic music
- Extra-diegetic (or non-diegetic) music
- Music as gameplay
- Player-customized music
Music can be a compelling and useful device for drawing players into the game and enhancing its storytelling. Music within video games can serve many different functions:
- Setting the scene
- Introducing characters
- Signaling a change in game state
- Forging an emotional connection with the player
- Enhancing narrative and dramatic story arcs
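Of these functions, signaling a change in game state is the one most directly tied to game code. As a minimal sketch of the idea (the state names, cue names, and `MusicManager` class here are hypothetical, not from the text; real middleware such as Wwise or FMOD exposes comparable state-driven mechanisms), the game can map each state to a music cue and swap cues only when the state actually changes:

```python
from enum import Enum, auto

class GameState(Enum):
    EXPLORE = auto()
    COMBAT = auto()
    VICTORY = auto()

# Hypothetical mapping from game state to music cue.
STATE_CUES = {
    GameState.EXPLORE: "explore_theme",
    GameState.COMBAT: "combat_theme",
    GameState.VICTORY: "victory_stinger",
}

class MusicManager:
    """Swaps the active cue whenever the game state changes."""

    def __init__(self):
        self.current_cue = None

    def on_state_change(self, state: GameState) -> str:
        cue = STATE_CUES[state]
        if cue != self.current_cue:
            # In a real game this is where a crossfade or
            # transition segment would be triggered.
            self.current_cue = cue
        return self.current_cue
```

The point of the sketch is the guard against re-triggering: the music signals a *change* in state, so repeated updates within the same state should not restart the cue.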
To create successful music for a game, the game development team works with the composer to conceptualize the music. Conceptualization helps define the stylistic, creative, and functional goals of the music before the actual composing begins. The following steps are designed to help you stay focused as you determine which direction is most effective for your game:
- Gather and assess materials.
- Prioritize primary music objectives.
- Create an asset list.
- Define interactive elements in the score.
- Create a supporting audio style guide.
- Create the audio design document.