sound for games?


kooki_sf


  • Members

anyone here familiar with current tools used for designing sound for computer games? or is everything just made in a DAW and then sampled these days? i know of some simple libraries, but I'm wondering if there is anything more integrated, similar to working with the blender/OGRE pipeline.


  • Members

i suspect that DAWs are commonly used for designing sound for videogames, much like they are for movie soundtracks. i'm not sure of specific programs that are used. it probably varies from company to company.

 

i do know that sampling is often used for sound effects. this video shows the devs who made dark sector and some of their techniques. it's a good example of creative sampling. btw, mangling animal sounds can be quite effective too.

 

[embedded youtube video - the dark sector devs on their sound design]


  • Members

That depends on whether you have straight soundtracks (like with racing games - 1 song per track) or audio that has to adapt/blend with the scenery (like in Mario 64).

 

The latter's still sampled, but you're basically fading in tracks or transitioning from one loop to another so you get a fluid transition.
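
Conceptually it's something like this. Just a sketch; the AudioBuffer type and the names are invented for illustration, not from any particular engine:

[CODE]
#include <vector>
#include <cstddef>

using AudioBuffer = std::vector<float>;  // mono samples, -1..+1

// Mix loopA out and loopB in over fadeSamples samples (both loops assumed non-empty).
void crossfade(const AudioBuffer& loopA, const AudioBuffer& loopB,
               std::size_t fadeSamples, AudioBuffer& out)
{
    out.resize(fadeSamples);
    for (std::size_t i = 0; i < fadeSamples; ++i) {
        float t = static_cast<float>(i) / static_cast<float>(fadeSamples); // 0 -> 1 over the fade
        float a = loopA[i % loopA.size()];  // outgoing loop, wrapped so it keeps looping
        float b = loopB[i % loopB.size()];  // incoming loop, wrapped
        out[i] = (1.0f - t) * a + t * b;    // simple linear crossfade
    }
}
[/CODE]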


  • Members

i'm still not quite sure if this answers your question.

 

my understanding is that audio for a videogame depends on the type of game and how the game engine itself processes sound. it would be different for every game type and game engine.

 

as yoozer mentioned, with a racing game they would use a jukebox-type system that would likely play standard mp3s. if it's a game like tron 2.0 or, more recently, dead space, they use a very different system. in a game like this, the sound designers create numerous sequences that range from very quiet, calm music to music that's very intense and action packed. the game engine chooses which sequences to play based on how much action it perceives to be happening and on the player's proximity to sound emitters. sound emitters are objects within the game world that trigger sounds. these are just two styles of how a game can handle sound; there are many other models in use too.
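
to make that concrete, here's roughly how i picture the selection logic. everything in it (Emitter, gameIntensity, pickSequence, the thresholds) is invented for illustration, not taken from any actual engine:

[CODE]
#include <cmath>
#include <string>
#include <vector>

struct Emitter {        // a sound emitter placed in the level
    float x, y, z;      // world position
    float intensity;    // how threatening/loud this emitter is
};

// score the current moment: nearby, intense emitters push the score up.
float gameIntensity(const std::vector<Emitter>& emitters,
                    float px, float py, float pz)
{
    float score = 0.0f;
    for (const auto& e : emitters) {
        float dx = e.x - px, dy = e.y - py, dz = e.z - pz;
        float dist = std::sqrt(dx * dx + dy * dy + dz * dz);
        score += e.intensity / (1.0f + dist);  // closer emitters weigh more
    }
    return score;
}

// map the score onto one of the pre-authored music sequences.
std::string pickSequence(float score) {
    if (score < 1.0f) return "ambient_calm";
    if (score < 5.0f) return "tension_build";
    return "combat_intense";
}
[/CODE]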

 

i do know that sound design for videogames has become increasingly complex and that there seems to be a lot more programming involved in it. i have no idea what programming languages would be used; again, i'm sure it would depend on the game engine itself and the system the game is being designed for. if there are industry-standard programs, i'm not familiar with them, but then, i've only ever done modding as a hobbyist for my own amusement.

 

if you don't find your answer here, this is a resource you might be interested in, if you're not already familiar with it. it's a site where professionals, independents, and home modders network and share information on videogame development. hopefully, though, someone who frequents these forums can give you better answers.

 

http://www.gamasutra.com/

 

edit - i almost forgot to mention that space is at a premium and sound can be a resource hog. compressed file formats are common for the sound files in a videogame. usually the sound department is given a specific budget for how much space they're allowed to use on a game (e.g. 2GB). i know that mp3 would certainly be one file type that gets used.
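
to put rough numbers on that: one minute of uncompressed 16-bit/44.1kHz stereo PCM is about 44100 x 2 bytes x 2 channels x 60 ≈ 10MB, so a 2GB budget would only hold a bit over three hours of raw audio. the same minute at 128kbps mp3 is roughly 1MB, which is why compressed formats are the norm.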


  • Members

 

Awesome vid, thanks for sharing!

 

 

hey, no problem. i know that i enjoyed it and i figured that others might too. you know, i see something like that and i'm reminded how much i miss having a sampler in my setup. my next piece of gear really should be a proper sampler and a good field recorder so that i can make fun sounds like that again.


  • Members

the sound engine is samples played through coded filters. the filters are for spatial positioning, not low-pass/high-pass types of filters. there is also some reverb or delay applied depending on the level designer's choice of perceived acoustics.
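
a stripped-down example of what i mean by a coded positioning filter - the helper is made up and real engines do fancier 3d panning, but the idea is distance attenuation plus a pan derived from where the emitter sits relative to the player:

[CODE]
#include <algorithm>
#include <cmath>

struct StereoGain { float left, right; };

// dx:   emitter's sideways offset from the player (negative = to the left)
// dist: straight-line distance from player to emitter
StereoGain spatialize(float dx, float dist, float maxDist)
{
    float atten = std::max(0.0f, 1.0f - dist / maxDist);        // fade out with distance
    float pan   = std::clamp(dx / maxDist, -1.0f, 1.0f);        // -1 hard left .. +1 hard right
    float angle = (pan + 1.0f) * 0.25f * 3.14159265f;           // map pan to 0..pi/2
    return { atten * std::cos(angle), atten * std::sin(angle) }; // constant-power pan
}
[/CODE]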

 

basically you have an audio event that has a "trigger" associated with it. the trigger is either proximity to an area/item or an event that occurs. these audio events have tags like volume, panning, and minimum/maximum playback length, and the engine keeps track of the sample position of the currently playing audio event, the triggers, and whether the event should still be playing at all. each audio event is fed into a filter tagged with the event's position relative to the player; the filter handles volume, positioning, delay, and reverb.
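
the data for one of those audio events might look something like this - the field names are just my guesses based on the tags i described, not any engine's actual format:

[CODE]
#include <string>

enum class TriggerType { Proximity, GameEvent };

struct AudioEvent {
    std::string sampleFile;             // straight-shot or looped sample to play
    bool        looped        = false;
    float       volume        = 1.0f;   // volume tag
    float       pan           = 0.0f;   // pan tag, -1..+1
    float       minLengthSec  = 0.0f;   // minimum playback length
    float       maxLengthSec  = 0.0f;   // maximum playback length (0 = play to end)
    TriggerType trigger       = TriggerType::Proximity;
    float       triggerRadius = 10.0f;  // only meaningful for proximity triggers
};
[/CODE]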

 

the audio events are nothing but straight-shot samples or looped sample material. hope that helps you ...


Archived

This topic is now archived and is closed to further replies.
