• Giuseppe Palazzo, sound designer, Audiokinetic, Wwise, non-linear sound design, interactive music, adaptive music, middleware, audio engine, Unreal, Unity, game engine

Sound design for video games is non-linear. A game project typically requires two tasks: sounds must be designed in a DAW, and an audio engine (middleware) must be programmed so that those sounds can be incorporated into the game's interactive environment. Modern systems frequently use positional audio, often with hardware acceleration, and real-time audio post-processing that can also be linked to the 3D graphics pipeline. Calculations based on the internal state of the game make realistic sound dampening, echoes, the Doppler effect, HRTF, and more possible.
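To make the idea concrete, here is a minimal sketch of how an engine might derive per-frame audio parameters from game state. The rolloff radii and the distance-to-cutoff curve are illustrative assumptions, not Wwise's actual attenuation model:

```python
import math

# Illustrative constants (assumptions, not middleware defaults).
MIN_DISTANCE = 1.0    # metres: full volume inside this radius
MAX_DISTANCE = 100.0  # metres: silent beyond this radius

def distance_gain(distance):
    """Inverse-distance rolloff, clamped to [0, 1]."""
    d = max(distance, MIN_DISTANCE)
    if d >= MAX_DISTANCE:
        return 0.0
    return MIN_DISTANCE / d

def air_absorption_cutoff(distance):
    """Crude air-absorption model: distant sources lose high frequencies.
    Maps distance to a low-pass cutoff from 20 kHz (near) down to 1 kHz (far)."""
    t = min(max((distance - MIN_DISTANCE) / (MAX_DISTANCE - MIN_DISTANCE), 0.0), 1.0)
    return 20000.0 * math.exp(t * math.log(1000.0 / 20000.0))

# Each frame, the engine would re-evaluate these from current positions:
print(distance_gain(10.0))                 # gain for a source 10 m away
print(round(air_absorption_cutoff(10.0)))  # matching low-pass cutoff in Hz
```

Real middleware exposes the same idea as editable curves (attenuation ShareSets in Wwise), so the sound designer shapes the rolloff instead of coding it.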

In this video I explain how I made the sound effects and foley for the Bash Box game.

Synthesis: granular and subtractive, designed with the Siel Cruise oscillator, digital noise and waves
Foley & field recording: Zoom H1, sE Electronics 2200a, Samson C01
Editing: iZotope RX 5, Logic Pro X
Game scoring: Logic Pro X as the DAW, plus various tools and realistic virtual instruments
Mixing: in the box with Logic Pro X
Audio engine: Wwise
Game engine: Unity

HRTF is an acronym for head-related transfer function: a response that characterizes how an ear receives a sound from a point in space.
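A full HRTF is a measured filter pair per direction, but the core cue it encodes can be sketched very simply: the sound reaches the far ear slightly later and slightly quieter than the near ear. The sketch below approximates only the interaural time difference (using Woodworth's formula) plus a crude level difference; the head radius and the 0.7 attenuation factor are illustrative assumptions:

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s
HEAD_RADIUS = 0.0875     # m, an average adult head (assumption)

def itd_seconds(azimuth_rad):
    """Woodworth's ITD model: (r/c) * (theta + sin(theta))."""
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (azimuth_rad + np.sin(azimuth_rad))

def binauralize(mono, sr, azimuth_rad):
    """Pan a mono signal by delaying and attenuating the far ear.
    Not a real HRTF: no pinna/shoulder filtering, just ITD + crude ILD."""
    delay = int(round(itd_seconds(abs(azimuth_rad)) * sr))
    near = mono
    far = np.concatenate([np.zeros(delay), mono])[: len(mono)] * 0.7
    # Positive azimuth = source on the right, so the right ear is nearer.
    return (far, near) if azimuth_rad > 0 else (near, far)

sr = 44100
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)
left, right = binauralize(tone, sr, np.pi / 2)  # source at 90° to the right
```

At 90° this gives an ITD of roughly 0.66 ms, which matches the commonly cited maximum for a human head.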

Knowing the physical laws that govern the sound of an object moving through space is very important, especially in sound design for video games, where new virtual reality technologies are being developed to make the user's experience more immersive.

I made this test by grabbing a video from a YouTuber and replacing the audio track. The volume, pan, filter, and reverb-reflection automations were designed in Logic Pro X.

Listen with headphones to enjoy the full experience.