So much to do and so little time, but alas I swore to continue with this blog. The video below shows a Unity editor mix system for dynamically controlling the volume of grouped sounds via Wwise RTPCs. It builds on previous plug-in work, but now I have an official game project to use it on! [sinister laugh] The game is still very much in development, but it’s coming along.
Real-time audio mixing with Wwise and Unity3D from Roel_San on Vimeo.
The system makes it easy to adjust sound levels in the Unity editor, but it also acts as an active mix system, receiving method calls in code that change the levels of various sound groups depending on a defined game state. It has definitely helped streamline the audio pipeline for this senior game project.
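As a rough illustration (not the actual project code), a mix-state manager along these lines could drive group volumes through RTPCs via the Wwise Unity integration. The game states and RTPC names ("Volume_Music", "Volume_SFX") below are placeholders, and the values are arbitrary; this is only a sketch of the idea.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical game states; a real project would define its own.
public enum GameState { Exploration, Combat, Menu }

public class MixStateManager : MonoBehaviour
{
    // Target RTPC values per sound group for each game state.
    // RTPC names here are assumed, not taken from the actual Wwise project.
    private static readonly Dictionary<GameState, Dictionary<string, float>> mixStates =
        new Dictionary<GameState, Dictionary<string, float>>
        {
            { GameState.Exploration, new Dictionary<string, float> { { "Volume_Music", 80f },  { "Volume_SFX", 100f } } },
            { GameState.Combat,      new Dictionary<string, float> { { "Volume_Music", 100f }, { "Volume_SFX", 90f  } } },
            { GameState.Menu,        new Dictionary<string, float> { { "Volume_Music", 60f },  { "Volume_SFX", 40f  } } },
        };

    // Called from gameplay code whenever the game state changes.
    public void SetMixState(GameState state)
    {
        foreach (var rtpc in mixStates[state])
        {
            // Global RTPC; Wwise applies whatever curve and smoothing was authored for it.
            AkSoundEngine.SetRTPCValue(rtpc.Key, rtpc.Value);
        }
    }
}

The nice part of routing everything through RTPCs like this is that the actual volume curves stay in the Wwise project, so the sound designer can retune the mix without touching the game code.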
Why the need for such a tool? Fine-tuning and iteration. Audio implementation time is oh-so-sweet but incredibly limited. Coupled with a game design that is constantly changing, having a quick method of mixing becomes essential.
One of the main problems I encountered while implementing sounds for the project was structuring the sounds so that they mix properly. Introducing the Unity engine adds yet another point at which the mix can be manipulated. Even after restructuring sound groups and getting the levels right in Wwise, the mix did not always transfer over to Unity as intended and needed frequent changes. Luckily, this tool has alleviated many of those problems.
Unfortunately, it was only after struggling a bit through the implementation process that I ran into this gem. After giving it a read-through I was reassured that I had taken the right path. Thanks to @Tetley_uk. For those jumping into interactive mixing, check out his presentation (essential reading).
I should do another post on the procedural modal synthesis footstep system soon. Did I mention I love Wwise?
Thursday, January 19, 2012
Unity and The Interactive Music System
Last fall I provided audio for six student game projects, all using the Unity engine. Each game had about a three-to-four-week development period, so needless to say production was rough. As a result, I wanted to create some custom audio scripts to streamline the audio implementation process and provide a more dynamic audio experience. Since the majority of the development teams were targeting mobile platforms, middleware was not a viable option. Painful, I know.
So although Unity is limited in audio features, it still has some neat tricks. It was also nice to see that a fair amount of the HTML5 interactive music system prototype transferred over to these scripts. The scripts were written in C# and can be used on any platform Unity builds to.
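The scripts themselves aren't shown here, but in spirit a minimal layered-music controller in plain Unity C# might look something like the sketch below. The layer setup, fade speed, and intensity mapping are all assumptions for illustration, not the actual project code.

using UnityEngine;

// Minimal layered-music sketch: every stem starts playing in sync,
// and a gameplay "intensity" value fades higher stems in and out.
public class MusicLayerController : MonoBehaviour
{
    [SerializeField] private AudioSource[] layers;    // one looping AudioSource per music stem
    [SerializeField] private float fadeSpeed = 0.5f;  // volume units per second (assumed value)

    private float[] targetVolumes;

    private void Start()
    {
        targetVolumes = new float[layers.Length];

        // Start every layer together so they stay time-aligned; only the base layer is audible.
        for (int i = 0; i < layers.Length; i++)
        {
            layers[i].loop = true;
            layers[i].volume = (i == 0) ? 1f : 0f;
            targetVolumes[i] = layers[i].volume;
            layers[i].Play();
        }
    }

    private void Update()
    {
        // Smoothly move each layer toward its target volume instead of snapping.
        for (int i = 0; i < layers.Length; i++)
        {
            layers[i].volume = Mathf.MoveTowards(layers[i].volume, targetVolumes[i], fadeSpeed * Time.deltaTime);
        }
    }

    // Called by gameplay code; intensity in [0,1] decides how many stems are audible.
    public void SetIntensity(float intensity)
    {
        int audible = Mathf.Clamp(Mathf.CeilToInt(intensity * layers.Length), 1, layers.Length);
        for (int i = 0; i < layers.Length; i++)
        {
            targetVolumes[i] = (i < audible) ? 1f : 0f;
        }
    }
}

Keeping all the stems playing and simply fading volumes keeps everything beat-aligned without any scheduling logic, which matters on mobile where fancier sample-accurate scheduling isn't always practical.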
For more information on Dark Hack and its designers: