Today I want to explain my process for the music in Life on Mars. How this process came to be is quite the coincidence: I discovered the software I make the music in at the same time as I came up with the idea for this game, and I immediately felt they were a perfect combination.
As I’m not a composer by any stretch and have only a vague understanding of music theory, I needed something that was easy to use and would take care of as much of the behind-the-scenes work as possible. I had previously worked with Reason by Propellerhead, a very complex composing tool that I was barely able to use. But, to get to the point: in 2013 I discovered a small, compact iOS app called Figure, also published by Propellerhead. The interface looked slick, and the touch screen allowed for interesting interactions.
The interface of Figure, showing the three layers.
To give you a quick impression, here’s a little rundown. Every track consists of three layers: one for percussion, one for bass, and one for the lead instrument. For each layer you can select from a variety of instruments. Then you can play around with the layers and get a feel for them by touching the colored rectangles (the output also depends on where you press the rectangle, which can be used for some cool effects). When you’re ready, press the record button and start jamming.
When I was still learning to use the app, I started by just randomly hitting notes or swiping my finger over the screen, which sometimes led to some cool psychedelic tunes like the one below.
I love using Figure, but it has one big limitation: tracks are very short. You have some basic control over how long the final track will be, by changing the number of bars or the speed of the song, but in the end, most of my songs are between 10 and 20 seconds long. This may sound pretty short – it is – but it might not be a problem for the game I intend to make.
Let me explain: instead of a normal soundtrack that plays certain songs in certain situations, I’m aiming for a more ambient feel, a backdrop of sound that enhances the feeling of being in a living world. This will be achieved with so-called 3D audio: each sound or track has a position in the game world, and the player’s distance to that position determines its panning and volume. This technique is often used for sound effects like cars driving by, but I want to use it for all of the audio. As the game takes place in a colony on Mars, there are a multitude of shops begging for your attention by playing music, people walking around and chatting, and – as mentioned before – cars driving by. All of this will spin a web of sounds that will hopefully end up sounding like a living, breathing place.
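To make the idea concrete, here is a minimal sketch of how distance-based volume and panning could be computed. This is not code from the game; the function name, the `max_dist` falloff radius, and the linear falloff curve are all my own illustrative choices (real engines often use inverse or logarithmic attenuation).

```python
import math

def positional_audio(listener, source, max_dist=30.0):
    """Compute (volume, pan) for a sound source relative to a listener.

    listener, source: (x, y) positions in world units.
    max_dist: hypothetical radius beyond which the source is silent.
    Returns volume in [0.0, 1.0] and pan in [-1.0 (left), 1.0 (right)].
    """
    dx = source[0] - listener[0]
    dy = source[1] - listener[1]
    dist = math.hypot(dx, dy)
    # Linear falloff: full volume at the source, silent at max_dist.
    volume = max(0.0, 1.0 - dist / max_dist)
    # Pan by horizontal offset, clamped to the [-1, 1] stereo range.
    pan = max(-1.0, min(1.0, dx / max_dist))
    return volume, pan

# A shop speaker 10 units to the right of the player:
vol, pan = positional_audio((0.0, 0.0), (10.0, 0.0))
```

With something like this, every music source in the colony would be mixed per frame from the player’s position, so walking past a shop naturally fades its tune in and out.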
There’s also a big night club that will be important to the story; a lot of different music will be played there, with people dancing to it, a light show and so on. My main goal with this approach is to give the feeling that the music in the game is not something played over the gameplay, but something inside the game world. To be clear: in most games only the player hears the music, which is played to enhance the situation they are in. In Life on Mars, everyone inhabiting the colony can hear the music, comment on it or be attracted by it. It is meant to belong to this very colony.
My big challenge will be to implement these different tunes in a way that keeps them from becoming annoying after the third repeat. No song should overstay its welcome.
To wrap things up, here is a video of how a track looks inside the app. You can see how I moved my finger across the screen to make it. Below are also some more examples from the 45 tracks I have composed since 2013. I’d love to hear your feedback!