Video games and procedural music, a perfect marriage?

Video game music has long depended on the medium's technological progress. Originally quite rudimentary, it punctuated silences with programmed sound loops, limited by the cartridge format of the late 1980s. With the innovations that followed, video game music sought to move closer to film scoring by becoming suggestive: it signals the emotional temperature of a sequence and heightens what the player feels, controller (or mouse) in hand.

The beginnings of adaptive music arrived with the creation of the iMUSE sound engine (Interactive MUsic Streaming Engine), first used in LucasArts' famous point-and-click adventure Monkey Island 2: LeChuck's Revenge in 1991. The goal was for the music to reflect the gameplay and the player's actions on screen. While adaptive music has since become a standard in many contemporary productions, it conceals a longer-standing dream, one that could at first glance pit developers against composers: what if the music were generated entirely by an algorithm, driven by the player's actions? This is what we call procedural music.
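To make the idea concrete, here is a minimal, hypothetical sketch of how adaptive music is typically wired up: pre-composed layers whose volumes follow a single gameplay signal. The layer names and the intensity mapping are invented for illustration; this is not iMUSE's actual design.

```python
# A minimal sketch (not the actual iMUSE engine) of adaptive music:
# the score is split into layers whose volumes follow the game state.
from dataclasses import dataclass

@dataclass
class MusicLayer:
    name: str
    volume: float = 0.0  # 0.0 = silent, 1.0 = full

class AdaptiveScore:
    """Fades musical layers in and out according to a gameplay 'intensity' value."""

    def __init__(self) -> None:
        self.layers = {
            "ambient_pad": MusicLayer("ambient_pad", volume=1.0),
            "percussion": MusicLayer("percussion"),
            "combat_brass": MusicLayer("combat_brass"),
        }

    def update(self, intensity: float) -> None:
        # Map a single gameplay signal (0..1) onto layer volumes.
        self.layers["ambient_pad"].volume = 1.0 - 0.5 * intensity
        self.layers["percussion"].volume = min(1.0, intensity * 1.5)
        self.layers["combat_brass"].volume = max(0.0, intensity - 0.6) / 0.4

score = AdaptiveScore()
score.update(intensity=0.8)  # e.g. enemies have spotted the player
for layer in score.layers.values():
    print(f"{layer.name}: {layer.volume:.2f}")
```

A real engine would also handle crossfade timing and musical transitions, but the principle is the same: the state of the game drives the mix.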


The musical hybridization of No Man’s Sky

As Paul Weir, sound designer on No Man's Sky, reminds us, procedural generation has been part of video game development for several decades, but it has yet to leave its mark on audio, and even less so on music. No Man's Sky lets the player explore mathematically generated galaxies and planets for an experience that is entirely individual. More than simple audio synthesis, the sound design is generated in real time according to the systems the game sets in motion and the events they trigger.

But what about the music? The post-rock band 65daysofstatic composed an entire album for the game, which was then broken down into more than 2,500 elements (drum loops, melodies, pads, guitar riffs) so that a modular system could reassemble the music to fit every situation. While it is not generated procedurally in the strict sense, the music here takes on the immense challenge of adapting to gameplay situations that are themselves generated in real time, and therefore unpredictable.
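As a rough illustration of this kind of stem-based approach, the sketch below tags pre-composed fragments by mood and reassembles one per role at runtime. The stem names and context tags are invented for the example; this is not 65daysofstatic's actual toolchain.

```python
# Hedged sketch: pre-composed stems, tagged by gameplay context,
# are recombined on the fly to build a cue for the current situation.
import random

STEMS = {
    "drums":  {"exploration": ["drums_soft_loop"], "danger": ["drums_driving_loop"]},
    "melody": {"exploration": ["guitar_riff_a", "guitar_riff_b"], "danger": ["synth_arp"]},
    "pad":    {"exploration": ["pad_warm"], "danger": ["pad_dissonant"]},
}

def assemble_cue(context: str, rng: random.Random) -> list[str]:
    """Pick one stem per role for the current gameplay context."""
    return [rng.choice(options[context]) for options in STEMS.values()]

rng = random.Random(42)
print(assemble_cue("exploration", rng))  # e.g. ['drums_soft_loop', 'guitar_riff_b', 'pad_warm']
print(assemble_cue("danger", rng))
```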

Mini Metro, music integrated into game design

In Mini Metro, the player designs a metro network and its stations for an ever-growing city. Despite its minimalist art style, the game has a dizzying systemic depth, to which is added a complex sound engine. Composer Disasterpeace designed a model that marries ambient music with a sound design entirely dependent on in-game events. The procedural composition constantly juggles the musical intention with the systems that generate its variations.

The composer went further, setting up with the game's creators a quantization system that aligns all game events to the beats of a bar (much like Rez, released on PlayStation 2 and Dreamcast in 2001). Notes and tones change, as does the tempo, which can slow down or speed up depending on the gameplay situation.
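The quantization idea itself is simple to sketch: an event that fires mid-beat is held until the next beat so its sound lands in time with the music. The example below assumes a fixed tempo, unlike Mini Metro's, which can drift with the gameplay.

```python
# Hedged sketch of event quantization: delay a game event's sound
# so it plays on the next beat of the bar rather than immediately.
import math

def next_beat_time(event_time: float, bpm: float) -> float:
    """Return the time (in seconds) of the next beat at or after event_time."""
    beat_length = 60.0 / bpm
    return math.ceil(event_time / beat_length) * beat_length

# A train arrives 1.37 s into the piece at 90 BPM: its sound is held back
# until the next beat (2.00 s) so it stays musically in time.
print(f"{next_beat_time(1.37, 90.0):.2f}")  # -> 2.00
```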

While by its very nature this generative system cannot produce a single fixed work, some have tried to capture one: composer Max Duggan gathered the fruit of a multitude of Mini Metro sessions into an EP entitled Reflections of Mini Metro.

The future: OpenAI and its Jukebox algorithm

Like No Man’s Sky and its procedural planets, music could thus offer each player a different, unique sound experience every time. As games grow ever more complex, with so-called “emergent” gameplay and narrative situations born of the interplay between different game systems, music could take part in this singularization of the experience. And while the concept of entirely computer-generated music dates back to the 1950s, advances in machine learning have since allowed researchers to develop artificial intelligences capable of learning and playing their own musical compositions.

Recently, it was OpenAI’s Jukebox that gave us a convincing, though still embryonic, glimpse of what procedural music could become. Jukebox generates music directly as raw audio rather than as a score, drawing its inspiration from existing songs or even from an entire genre, such as country, rock or pop.

It remains to be seen whether this model is even desirable, for a very simple reason: how do you preserve artistic intention when a video game's music is generated in real time by an algorithm? If Mini Metro's system works, it is because it is inherent to the game design; in other words, it is consistent with the experience its creators wanted to build. Procedural music may seem to blend perfectly with the video game medium, but it remains to be seen to what degree composers and designers will embrace it.

Hugo Clery.
