StarForge is a sci-fi survival sandbox. Hunt to eat, dig for resources, craft items, build a fort, and fight enemies in order to survive! Do it alone, or with other players, in a fully infinite procedural world.

Tech Feature 012: Audio Generation

A Thursday Tech Feature article detailing how a sound effect goes from an idea to an in-game manifestation.


Alright, so today we’ll be discussing a critical but often-overlooked component of any game: in-game audio. A lot of developers put enormous emphasis on making a game look visually stunning, but then throw in some last-minute sound effects and don’t do a very good job with ambience (if there’s any at all). How many times have you played a game where you saw an amazingly rendered, powerful-looking gun that commands a presence, and then when the player fires it, it sounds like a potato cannon? It leaves one feeling underwhelmed, and under-immersed. But how exactly does one make those bone-rattling, high-energy, awe-inspiring sounds and atmosphere? What does the process look like? What are some of the technologies used to create these effects? These are some of the questions that we’ll be covering today.

First, we’ll identify and explain some of the tools being used. Most of the work is created within a piece of software called a Digital Audio Workstation, or DAW for short. FL Studio, Ableton and Cubase, for example, are all very popular DAWs today, and all perform the same goal of being the interface between creator and content. DAWs allow all the other supporting software to interact with the project file. They also have a channel mixer, which controls the volume of multiple channels to which the instruments can be routed individually or collectively. This allows the producer to achieve a clean mix, so that the listener can clearly and optimally distinguish the sounds. Now, there is a lot more that can go into cleaning up a mix of sounds, but that is another topic entirely.
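The channel-mixer idea above — instruments routed to channels, each with its own volume, all summed into one mix — can be sketched in a few lines. This is purely an illustration of the concept; it is not how any particular DAW is actually implemented, and the function name is hypothetical.

```python
# Minimal sketch of a DAW-style channel mixer: each instrument is routed
# to a channel with its own gain, and all channels are summed into the mix.
# Illustration only; real DAWs do this with far more sophistication.

def mix_channels(channels):
    """channels: list of (samples, gain) pairs; returns the summed mix."""
    length = max(len(samples) for samples, _ in channels)
    mix = [0.0] * length
    for samples, gain in channels:
        for i, s in enumerate(samples):
            mix[i] += s * gain  # scale by the channel fader, then sum
    return mix

# Two toy "instruments" routed at different volumes:
drums = [0.8, -0.8, 0.5, -0.5]
synth = [0.2, 0.2, -0.2, -0.2]
mix = mix_channels([(drums, 0.9), (synth, 0.5)])
```

Pulling the synth fader down relative to the drums is exactly the kind of adjustment that keeps one layer from masking another in the final mix.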

The software actually used to create the sounds is called Virtual Studio Technology, or VST for short. VSTs come in an endless range of forms, and fit into one of three types: VST Instruments, VST Effects and VST MIDI Effects. VST Instruments generate audio; they are generally synthesizers or samplers, which we’ll cover later in the article. VST Effects process audio rather than generate it, performing functions such as adding reverb or white noise, or applying some kind of filtering to the sound being generated. VST MIDI Effects process MIDI messages and route the data to other VSTs (for more information on what exactly MIDI is, and how it relates, check out this wiki link: ).
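The split between the two main roles — instruments that generate audio and effects that process it — can be illustrated with a toy example. These function names are hypothetical; real VSTs are native plugins hosted by the DAW, not Python functions.

```python
import math

# Toy illustration of the two main VST roles described above.
# Hypothetical sketch; real VSTs are compiled plugins hosted by a DAW.

def sine_instrument(freq, duration, rate=44100):
    """'VST Instrument' role: generates audio (here, a plain sine tone)."""
    n = int(duration * rate)
    return [math.sin(2 * math.pi * freq * i / rate) for i in range(n)]

def lowpass_effect(samples, alpha=0.1):
    """'VST Effect' role: processes audio (a one-pole low-pass filter)."""
    out, prev = [], 0.0
    for s in samples:
        prev = prev + alpha * (s - prev)  # smooth out high frequencies
        out.append(prev)
    return out

tone = sine_instrument(440.0, 0.01)   # generate (instrument)
filtered = lowpass_effect(tone)       # then process (effect)
```

The pattern — generators feeding processors — is the same signal chain a producer builds in a DAW, just stripped down to its bones.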

The DAW known as FL Studio 11, with the VST known as Razor in the foreground

So that covers the basics of sound generation, but how are sound effects made? Let’s say you want the sound of a hammer hitting metal: how do you make it sound good, and not flat and boring? Through the use of a technique called layering. The audio producer would look for a noise that sounds punchy and sits mostly in the mid-range frequencies. Then they would look for a similar noise that reflects much higher frequencies, allowing it to achieve that ‘brightness’. Another noise would be found for the metal itself, and yet another to bring about a certain texture or quality that complements the other noises. When that is complete, the producer needs to be mindful not to simply have a bunch of hammer noises going off at once, but has to find a way to blend all these slightly different sound samples together, and that is done through a process known as compression. Compression brings the decibel values of the individual sounds closer together (or, with expansion, further apart), so that they feel ‘glued’ together and are less distinguishable to the human ear (or more distinguishable, if so desired).
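The layer-then-glue workflow above can be sketched as a few lines of code: sum the component sounds, then run a simple compressor that pulls loud samples back toward a threshold. This is a toy model under assumed names, not production DSP or any particular plugin's algorithm.

```python
# Sketch of layering + compression: component sounds are summed, then a
# simple compressor reduces the dynamic range so the layers "glue".
# Toy illustration; real compressors use envelopes, attack/release, etc.

def layer(*sounds):
    """Sum equal-length component sounds into one layered sample."""
    return [sum(vals) for vals in zip(*sounds)]

def compress(samples, threshold=0.5, ratio=4.0):
    """Above the threshold, amplitude grows at only 1/ratio of the input."""
    out = []
    for s in samples:
        mag = abs(s)
        if mag > threshold:
            mag = threshold + (mag - threshold) / ratio
        out.append(mag if s >= 0 else -mag)
    return out

punch  = [0.6, 0.3, 0.1]   # mid-range body of the hit
bright = [0.3, 0.1, 0.0]   # high-frequency 'brightness' layer
metal  = [0.2, 0.2, 0.1]   # metallic ring
hit = compress(layer(punch, bright, metal))
```

The loudest combined sample (1.1) gets pulled down to 0.65, while quiet samples pass through untouched — the peaks and the tails end up closer together, which is the "glue" the article describes.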

Some post-processing and audio clean-up being shown

Now, without getting too in-depth into all the points and processes of audio generation, we’re going to move on to how one actually gets the audio to function in-game. In the case of StarForge, we use the Unity3D engine, which can be interfaced via a middleware called FMOD. FMOD is an audio engine that imports sounds into the game and helps modify and generate audio in real-time as game conditions change. For instance, if one were standing in a warehouse and fired off a gun, it would sound different than if one fired that gun off outside, due to the reverb and echo of the space. FMOD also helps randomize the pitch of the gun sounds a little bit, to make them more realistic, along with other such subtle effects.

In conclusion, the audio generation process layers several sounds in order to achieve a desired effect. The sounds can be pulled from an audio sample library, modified to taste, and used that way; the sounds required for layering can also be generated entirely via the use of VSTs and affected as such. They’re then put through a post-processing treatment, to adjust volume levels and frequency cut-offs and ensure the clearest sound possible. After all that, the audio gets piped into the game via middleware applications such as FMOD, which can further modify and tailor the sound for specific contexts within the game.
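The runtime pitch randomization mentioned above — each gunshot played at a slightly different pitch so repeated fire doesn't sound identical — can be sketched by varying the playback rate of a stored sample. This is a conceptual sketch with hypothetical function names, not FMOD's actual implementation.

```python
import random

# Sketch of runtime pitch randomization as audio middleware applies it:
# each trigger plays the same sample at a slightly different rate, so
# repeated gunfire never sounds exactly the same. Toy code, hypothetical
# names; linear-interpolation resampling stands in for the real thing.

def resample(samples, rate_factor):
    """Play `samples` back `rate_factor` times faster (raises the pitch)."""
    out, pos = [], 0.0
    while pos < len(samples) - 1:
        i = int(pos)
        frac = pos - i
        # Linear interpolation between neighbouring samples
        out.append(samples[i] * (1 - frac) + samples[i + 1] * frac)
        pos += rate_factor
    return out

def play_gunshot(samples, spread=0.05):
    """Randomize pitch by up to +/-5% on each trigger."""
    factor = 1.0 + random.uniform(-spread, spread)
    return resample(samples, factor)

shot = [0.9, -0.7, 0.4, -0.2, 0.1, 0.0]
variant = play_gunshot(shot)   # slightly different every call
```

Because the variation is only a few percent, the ear registers "another shot from the same gun" rather than a looping recording — the same subtle-realism trick the article credits the middleware with.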

-- Stephen "Hatchling" Smolley
