
//Audio Implementation Greats 1


It's a great time in game audio these days. After steeping in the current console generation, several examples of best practices in audio implementation have surfaced through articles, exposés, and video examples. In an attempt to survey some of the forward-thinking front-runners in the burgeoning field of Technical Sound Design, I've been pulling together examples of inspirational audio in games as a way to highlight innovative techniques and the process behind them.

Crackdown - Realtime Worlds
-Realtime Convolution Reverb through Ray Tracing

One area that has been gaining ground since the early days of EAX on the PC platform, and more recently through its omnipresence in audio middleware toolsets, is Reverb. It has become standard practice to enable Reverb within a single game level and apply a single preset algorithm to a subset of the sound mix. Many developers have taken this a step further and created Reverb regions that call different presets based on the area the player is currently in, allowing the Reverb to change at predetermined locations using predefined settings. Furthermore, these presets have been extended to areas outside of the player's current region, so that a sound coming from a different region uses the settings of the region where it originates in order to get its Reverb information.
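
To make that region-based approach a little more concrete, here's a rough sketch in C++. It's purely illustrative: the struct names, fields, and simple bounding-box test are my own, not any particular engine's or middleware's API.

#include <string>
#include <vector>

// Hypothetical reverb preset: a rough stand-in for the predefined settings
// a middleware tool might expose. Names and fields are illustrative only.
struct ReverbPreset {
    float decayTime;  // seconds
    float wetLevel;   // 0..1 mix of the reverberated signal
    float hfDamping;  // 0..1 high-frequency damping
};

// A reverb region ties a volume of the level to one preset.
struct ReverbRegion {
    std::string name;
    float minX, minY, minZ;  // axis-aligned bounds, purely for illustration
    float maxX, maxY, maxZ;
    ReverbPreset preset;

    bool contains(float x, float y, float z) const {
        return x >= minX && x <= maxX &&
               y >= minY && y <= maxY &&
               z >= minZ && z <= maxZ;
    }
};

// Pick the preset based on where the *sound* originates, falling back to a
// default when no region matches: the "use the settings of the region the
// sound comes from" idea described above.
ReverbPreset presetForEmitter(const std::vector<ReverbRegion>& regions,
                              float x, float y, float z,
                              const ReverbPreset& fallback) {
    for (const auto& region : regions) {
        if (region.contains(x, y, z)) {
            return region.preset;
        }
    }
    return fallback;
}

The key point is that the lookup is keyed off the emitter's position rather than the listener's, which is what lets a gunshot fired in an alley keep the alley's Reverb even when you hear it from the street.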

Each of these scenarios is valid in an industry where you must carefully balance all of your resources, and where features must play to the strengths of your game design. When Realtime Worlds, with the help of Microsoft Game Studios, scoped out the sound design for their 2007 Xbox 360 title Crackdown, they set out to bring the idea of realtime convolution Reverb to the front line.

"When we heard the results of our complex Reverb/Reflections/Convolution or "Audio-Shader" system in Crackdown, we knew that we could make our gunfights sound like that, only in real-time! Because we are simulating true reflections on every 3D voice in the game, with the right content we could immerse the player in a way never before heard."-Raymond Usher

So, what is realtime Reverb using ray tracing and convolution on a per-voice basis?

Here's a quick definition of Ray Tracing as it applies to physics calculation:
"In physics, ray tracing is a method for calculating the path of waves or particles through a system with regions of varying propagation velocity, absorption characteristics, and reflecting surfaces. Under these circumstances, wavefronts may bend, change direction, or reflect off surfaces, complicating analysis. Ray tracing solves the problem by repeatedly advancing idealized narrow beams called rays through the medium by discrete amounts. Simple problems can be analyzed by propagating a few rays using simple mathematics. More detailed analysis can be performed by using a computer to propagate many rays."source

On the other side of the coin you have the concept of convolution:
"In audio signal processing, convolution Reverb is a process for digitally simulating the reverberation of a physical or virtual space. It is based on the mathematical convolution operation, and uses a pre-recorded audio sample of the impulse response of the space being modelled. To apply the reverberation effect, the impulse-response recording is first stored in a digital signal-processing system. This is then convolved with the incoming audio signal to be processed."source

So what you have is a pre-recorded impulse response of a space being modified (or convolved) by the ray-traced calculations of the surrounding physical space. In realtime, this lets each sound communicate a far greater sense of space and dynamics as it is triggered from a point in 3D space.
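
Tying the two halves together for a single voice might look something like the glue code below. To be clear, this reuses the hypothetical traceRay and convolve sketches from above and only illustrates the flow (trace reflections from the emitter, write them into an impulse response, convolve the voice with it); it is not the shipped "Audio-Shader" system.

#include <algorithm>
#include <cstddef>
#include <vector>

// Hypothetical glue, reusing the Vec3/Plane/Reflection types, traceRay() and
// convolve() sketches above: build an impulse response for one voice from its
// ray-traced reflections, then convolve the voice's dry buffer with it.
std::vector<float> renderVoiceWithReflections(const std::vector<float>& dryVoice,
                                              Vec3 emitterPos,
                                              const std::vector<Vec3>& rayDirections,
                                              const std::vector<Plane>& geometry,
                                              int sampleRate) {
    // Gather taps from a fan of rays cast out from the emitter.
    std::vector<Reflection> taps;
    for (const auto& dir : rayDirections) {
        std::vector<Reflection> rayTaps = traceRay(emitterPos, dir, geometry, /*maxBounces=*/4);
        taps.insert(taps.end(), rayTaps.begin(), rayTaps.end());
    }

    // Write each tap into an impulse response buffer at its arrival time.
    float maxDelay = 0.0f;
    for (const auto& tap : taps) maxDelay = std::max(maxDelay, tap.delaySeconds);
    std::vector<float> ir(static_cast<std::size_t>(maxDelay * sampleRate) + 1, 0.0f);
    ir[0] = 1.0f; // the direct (dry) path
    for (const auto& tap : taps) {
        ir[static_cast<std::size_t>(tap.delaySeconds * sampleRate)] += tap.gain;
    }

    // Each voice gets its own position-dependent impulse response.
    return convolve(dryVoice, ir);
}

Because the taps depend on the emitter's position and the geometry around it, every voice effectively carries its own Reverb, which is the "per-voice" part of the pitch.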

Article: The Audio of Crackdown
Podcast: Interview with MGS's Kristofer Mellroth (starts at 12:00)

Bonus points: the ability to change the audio settings for small speakers, which tweaks the distance falloff.
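
As a hypothetical illustration of what that option might boil down to (the enum and rolloff curve here are mine, not Crackdown's):

#include <algorithm>
#include <cmath>

enum class SpeakerSize { Small, Large };

// Hypothetical only: a "small speakers" option that flattens the distance
// falloff curve so distant detail is not lost on small TV speakers.
float distanceGain(float distance, float maxDistance, SpeakerSize size) {
    float rolloff = (size == SpeakerSize::Small) ? 0.5f : 1.0f; // gentler curve for small speakers
    float normalized = std::min(distance / maxDistance, 1.0f);
    return std::pow(1.0f - normalized, rolloff);
}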

See video example of Realtime Reverb Debug @ 4:15

Also see: Prototype - Runtime Reverb and Filtering
