

Archive 2016


9:00 - 9:15 Welcome and Introduction

Brian Schmidt

Overview of GameSoundCon, Schedule updates, changes and other info

9:25 - 10:25 Introduction to Game Audio: How Games are Different from Anything You've Worked on Before

Brian Schmidt

ESSENTIALS TRACK: This session provides an introduction and exploration into the many ways in which video game music and sound design are fundamentally different from linear media such as television or film. It also covers what to expect when working on a game, and how being part of a game team is very different from being hired to score or do sound design for more traditional media.

9:25 - 10:25 The Music and Sound of Gathering Sky

Dren McDonald

PROFESSIONAL TRACK: A game with no words, no voice, and no tutorial needs to rely on something besides visuals to tell its story, and in Gathering Sky the audio was responsible for providing this narrative. Using FMOD Studio (and Java), Dren created a dynamic music system with loops, transitions, stingers, and musical sounds to help guide players through this emotional experience. In this talk he will demonstrate many of the techniques he used, both in the middleware and in the music preparation and recording sessions, to achieve a multi-award-winning soundscape on an incredibly indie budget.

9:25 - 10:25 VR: VR Audio Workflow

Chris Hegstrom, Jesse Holt

VIRTUAL REALITY TRACK: Everyone is excited about VR audio & all of the potential new opportunities, but what do we need to do to get there? What will the VR audio workflow look like in 5 years' time, when we're more confident & comfortable with this new medium? Wwise & FMOD are the standards for game audio today, yet they weren't created in one day. What elements from games, film, live & broadcast can we cherry-pick for VR audio, & what from those fields will we need to relearn?

VR audio requires a new point of view for recording, designing, integrating, monitoring & mixing 3D sounds. Being able to envision & discuss this workflow will help us arrive at it sooner & start creating & experiencing sonic VR content the way it was meant to be.

9:25 - 10:25 Interactive Audio Prototyping with FMOD Studio and Unity

FMOD Session

(Liam de Koster Kjaer) Introduction to the FMOD Studio creative workflow. From asset discovery to creating Events and game testing in a Unity-based sandbox, Liam takes you step by step through the process and shows just how easy it is to bring your game audio ideas to life.

9:25 - 10:25 Interactive Music (Part 1) – Understanding the Process From Composition to Game Play

Wwise Session

(Robert Brock) Interactive music is a buzz phrase that everyone has heard, but many are unclear about the workflow necessary to create a truly interactive score. To understand the process, you’ll see how a conventional DAW and Audiokinetic’s Wwise software can take a game that is musically void, to one where the player’s actions control the musical soundscape.

Through the process you’ll see how interactive music impacts...

                • compositional process

                • song structure and organization within a DAW

                • mixing considerations

                • file delivery

                • integrating music assets into the game

                • game play

Seeing this demonstration will help you better understand a game studio’s needs and help you make better decisions during the earliest phases of the creative process.


10:35 - 11:35 Essential Game Audio Tech I

Brian Schmidt

ESSENTIALS TRACK: The technology behind game sound, both its capabilities and limitations, can have a profound impact on how game sound is created and put into a video game. Knowing these gives the composer or sound designer the ability to set a high bar without promising more than they can deliver. This session will cover the essential issues in game audio technology and how they affect what is and isn't possible when creating game sound and music. Digital audio, game sound compression, inside a game console, and how console technology affects game music and sound design are some of the topics covered.

10:35 - 11:35 Runtime Sound Design is Funtime Sound Design

Jaclyn Shumate

PROFESSIONAL TRACK: This talk will walk you through the process of using a combination of synthesis, envelopes, LFOs, filters, tone generators, DSP, and a small set of .wav assets to create run-time audio. We will look at the way sound was created on Peggle Blast and Bejeweled Stars, walk through the workflow, and share tips, tricks, and lessons learned. We will also discuss how creating assets this way leads to easily incorporating procedural game audio into your sound design process, and what is coming in the future for the exciting world of procedural run-time sound design.

10:35 - 11:35 VR: Beyond 360: Advanced Audio Techniques for Virtual Reality

Scott Gershin, Viktor Phoenix

VIRTUAL REALITY TRACK: The consumer VR market exploded this year and, with that growth, brought a renewed focus on audio. Developers and storytellers working in VR are now looking to sound designers to help them realize their visions with 3D audio. Making the transition from traditional 360 positional sound to more advanced uses of 3D sound brings exciting potential with it, but can create some creative and technical challenges as well. While there is plenty of available information on 3D audio, it is mostly academic with very little practical insight. Technicolor's Scott Martin Gershin and Viktor Phoenix draw on their experience to share creative techniques, discuss technical tools and review best practices for designing, implementing & mixing 3D audio for VR. The information will be presented in an approachable, yet thorough, presentation that will inspire and inform.

10:35 - 11:35 Mixing and Debugging the Audio without the Game

FMOD Session

(Liam de Koster Kjaer) Liam demonstrates powerful features of FMOD Studio that empower distributed development teams. Learn how API Capture can be used to debug, fix and remix game audio captured from a gameplay session, without needing to be connected to the game.

10:35 - 11:35 Interactive Music (Part 2) - Understanding the Process from Composition to Game Play

Wwise Session

(Simon Ashby) This session invites composers and sound designers for an in-depth look into the tool chest of compositional techniques by examining a variety of Wwise projects and videos from shipped games.

Topics will cover:

                • differences between dynamic and interactive music

                • utilizing pre-rendered audio stems, MIDI and generative music

                • classic interactive music structures

                • methods to reduce repetition (and get the most out of the few megs)

By observing these integration techniques, you’ll be inspired to create your own unique approach for how players are provided musical feedback and rewards, core to the mechanics of any game. 


12:00- 1:00 Composing Interactive Music (I)

Paul Lipson, Formosa

ESSENTIALS TRACK: This session will follow the creation and implementation of an interactive musical score for a major console game. Beginning with how to work with the designer to map game levels to music, the session will focus on the interaction between what's technically possible and the composer's vision for the game, through the planning, recording and production phases.

12:00 - 1:00 Video Game VO: Evolving Techniques and Practices

William "Chip" Beaman (Formosa), Bonnie Bogovich (Schell Games), Kevin McMullan (Line In Audio), Morla Gorrondona (Independent), Michael Csurics (Brightskull)

PROFESSIONAL TRACK: This panel will introduce and discuss topics currently relevant to the VO side of game audio development, with leading industry experts from GVAC, the Game Audio Network Guild's (GANG) professional voice wing.
Planned topics include: modern recording and performance techniques (ensemble, p-cap, etc.), working with independent studios, localization, how VR impacts VO, and more.

12:00 - 12:30 Creating Music for the VR-driven “Edge of Nowhere”

Michael Bross

VIRTUAL REALITY TRACK: Composer Michael Bross will discuss his work on Insomniac Games' "Edge of Nowhere", a third-person action-adventure game published on the Oculus Rift and due for release in June 2016. Michael will walk through his process of composing for this special title and give a post-mortem along the way, discussing the 6-month timeline from early music sketches to the final hours of production. Covering intense music "design" and a 55-piece orchestra for the 2-hour score, he will present his creative approach, insightful music examples, challenges encountered with the title and with VR, and a behind-the-scenes view of the Wwise implementation.

12:35 - 1:05 Lessons from the Audio of "Job Simulator"

Daniel Perry

VIRTUAL REALITY TRACK: Through the development of Job Simulator, a premier VR launch title for the HTC Vive as well as Oculus Touch and PlayStation VR, I stumbled upon a lot of situations that have not been seen in any other form of interactive gaming.

I will talk about how the concept of ambience in VR has changed, and how it affected the decision-making of audio placement in Job Simulator.

I will also talk about hand interaction with objects and the importance of being able to interact with every single object in VR, offer insight into how a specialized audio impact tool was created within Unity, and cover how we approached the unique challenge of draggable objects such as drawers and sliding doors.

12:10 - 1:10 FMOD Advanced Sound Design Techniques

FMOD Session

(Kevin Regamey) Learn about advanced techniques in FMOD Studio that you may not have known about, including event nesting, modulation and automation, scatterer sounds, AHDSRs and more.

12:10 - 1:10 Introduction to Wwise (Part 1) - Hands-on Quick Start to Game Sound Integration

Wwise Session

(Robert Brock) In this fast paced session, starting from a completely blank Wwise project, you'll quickly learn the interface, discover the most important elements of the software and see how creative you can get with sound effects integration. Most importantly, you'll build your assets into a game and be able to hear your sonic masterpiece in actual game play. Bring your Windows or Mac (OS X Mountain Lion v10.8 and up) computer and headphones and we'll provide the software and project files.


1:00 - 2:45 Break and Roundtable

1:45 - 2:45 ROUNDTABLE: Women in Game Audio

Roundtable discussion of Women in Game Audio. Discussion facilitated by Karen Collins, Becky Allen, Sally Kellaway.

Topics will explore the personal and professional experiences of the panellists and the challenges of being a minority in the game audio industry. This includes discussion of the challenges facing freelancers and other competitive professionals, the impact of internal and external barriers, and the mentorship and training that can be delivered to overcome those barriers. A focus on generating mentorship strategies and delivering advice to attendees will ensure that this panel benefits a wide array of professionals, including permanently employed practitioners, employers and educators of all genders and backgrounds.

The roundtable will begin at approximately 1:45 to give attendees a chance to get lunch.

2:45 - 3:45 Composing Interactive Music (II)

Paul Lipson, Formosa

ESSENTIALS TRACK: In this session, Paul will continue his discussion of how to create an interactive game soundtrack. Paul will discuss how composing within the framework of an interactive music engine affects how you think, compose and implement your music.  Using examples from games, he will demonstrate different video game music implementation techniques.

2:45 - 3:45 Sound Design of League of Legends

Adam Swanson, Brad Beaumont, RIOT Games

PROFESSIONAL TRACK: Our talk will cover how we go about designing champion sound effects for League of Legends. As League of Legends has such a large selection of different character themes, creating iconic sound design that supports gameplay can be quite a challenge. We'll start off by walking you through our process from the start of exploration through to final polish. We'll talk about how we approach the feedback process, and some specific elements we feel help us continue to push the quality bar. In the second half of our talk, we will dig into a few specific champions and tear down their spells to show you how we created the sounds. We will discuss everything from the tools we used and how we get our sounds, to our individual approaches and even some examples of what worked well and what didn't.

2:45 - 3:45 Creating Immersive & Aesthetic Auditory Spaces for Virtual and Augmented Reality

Chanel Summers, Syndicate 17

VIRTUAL REALITY TRACK: Every year more complex interfaces and completely new forms of experiences emerge that we audio designers are called upon to master, each with its own brand new complications when it comes to creating immersive, compelling audio designs. But with wearable interfaces and head-mounted displays becoming more commonplace, it will be important to cultivate the skills to create quality audio that not only meshes seamlessly with the world and experience being created, but also does so in an artistic way that furthers the design goals of the game. This presentation will discuss the challenges and specific solutions for creating audio for interactive virtual and augmented reality experiences. It will reveal audio techniques that can be used today to advance storytelling and gameplay in virtual environments while creating a cohesive sense of place. And it will demonstrate processes and techniques used in shipping products to construct the audio for everything from immersive mixed reality experiences to multiparticipant, multi-site location-based games.

2:45 - 3:45 Mixing and Mastering your Game in Real-time with FMOD Studio

FMOD Session

(Kevin Regamey) Kevin focuses on the techniques used in the latest games, covering advanced topics such as VCAs, mixer snapshots and sidechaining.

2:45 - 3:45 Introduction to Wwise (Part 2) - Game Play Simulation and Mixing/Performance Monitoring and Optimization

Wwise Session

(Robert Brock) For this hands-on session, you'll begin with a Wwise project that's near completion. You'll discover powerful features that let you easily manage and mix projects of any size. You'll see how to simulate game play long before a playable game is available and, once a playable version of the game is obtained, how to connect to it for real-time mixing and performance monitoring. You'll then learn how to optimize your game so that it fits within memory and CPU budgets. Attending Introduction to Wwise (Part 1) is preferred but not mandatory. Bring your Windows or Mac (OS X Mountain Lion v10.8 and up) computer and headphones and we'll provide the software and project files.


4:00 - 5:00 Crossing the Streams

Scott Selfon, Microsoft Advanced Technology Group

ESSENTIALS TRACK: In the hectic world of game audio production, programming, and overall game development, it's easy for the audio creator's head to get buried in capturing and implementing the daunting spreadsheet of cues right in front of them. This talk challenges us to break free of audio as part of a "post-production" world. We'll highlight some of the unique and innovative audio developments of the past decade where the audio has turned around the equation, and instead been a key "pre-production" component, driving the implementation rather than the other way around. While some of the illustrations will be well known, others include tricks and tips not seen since their initial demonstrations, challenging us to continue to push on that fourth wall and provide sound that not only matches the visuals, but provides surprises to the user.

4:00 - 5:00 Panel: YouTube, Music, Video Games and ...Money

Noah Becker (CEO AdRev), Jim Charne (Media Attorney), Jody Friedman (CEO HD Music Now), Jeremy Lim (Composer)


(Brian Schmidt, Moderator) The advent of user videos on platforms like YouTube present a complex maze of rights, payments and copyright issues. This panel will attempt to shed some light on what the issues—both legal and practical—are regarding music in games on YouTube, from fan-generated covers to Let’s Play videos. Is there a way for a game composer to make money from YouTube?

4:00 - 5:00 I’ve been working in VR Audio all year and all I’ve got to show for it is Headset Hair

Sally Kellaway (Zero Latency), Bonnie Bogovich (Schell Games), Daniel Perry (Owlchemy Labs), Varun Nair (Facebook)


In this self-moderated panel, a collection of professionals from across the VR/AR/MR scene will explore the experiences they have had working in the field to date. With a particular focus on the pathways and adaptations that have occurred in the industry in the last year, we will discuss the experience of developing in a pre-consumer environment versus the current consumer-ready environment, and how this shift has impacted VR audio. As our panellists work in a range of VR/MR mediums, developments across Seated, Room Scale, and Free Roam delivery will be explored and contrasted. Attendees will gain a sense of the challenges they may face, and the successes they may achieve, when working on VR audio. This panel will provide attendees a balanced view of working in VR audio and should form a great starting point and a series of thoughtful topics for those who already work within these mediums.

4:00 - 5:00 Adaptive music with Gordon Ramsay Dash

FMOD Session

(Dren McDonald) Dren gives a real world demonstration of interactive music using a variety of techniques including transition timelines. 

4:00 - 5:00 Introduction to Wwise (Part 3): Interactive Music - Hands On from Composition to Gameplay

Wwise Session

(Robert Brock) Interactive music is a buzz phrase that everyone has heard, but many are unclear about the workflow necessary to create a truly interactive score. To understand the process, you’ll be hands-on as you discover how to take a musical score from a conventional DAW into Audiokinetic’s Wwise software. There you’ll take a game that's void of any music, to one where the player’s actions control the musical soundscape.

During the process you'll learn how to:

  • Organize a composition in a conventional DAW so that it can be implemented into a game.

  • Create a dynamic score that can react to the player's actions and to changing circumstances in the game.

  • Implement custom-created music while acquiring fundamental concepts such as re-sequencing and re-orchestration techniques, switching and transition systems.

  • Test and play the result of what you’ve learned using the game Cube.

For composers, being actively involved in the entire process from composition to gameplay will help you better understand a game studio’s needs so that you make better decisions during the earliest phases of the creative process. For integrators, the introduction to Wwise's interactive music features will serve as a launchpad for creating truly reactive and dynamic music within a game.

*Introduction to Wwise Part 1 and Part 2 recommended but not required.

5:30 - 6:30 Featured Talk: The Music of Star Wars: Battlefront

Gordy Haab

ALL Tracks

John Williams' legendary soundtracks for the Star Wars films are among the most iconic and instantly recognizable pieces of film music ever composed.

Composer Gordy Haab was part of the DICE audio team that was fortunate enough to work with and build upon this legacy, as he was charged with creating the score for Star Wars: Battlefront.

In this talk, Gordy will detail how he isolated John Williams' motifs and then complemented them with his own work to create extended themes for the game that are a seamless blend of old and new.

6:30 - 9:00 Networking/Drinks/Mixer

Networking Mixer Event (Gold Ballroom)



Sessions: Tuesday, Sept. 27


Sessions: Wednesday, Sept. 28

9:30 - 10:30 Essential Game Audio Tech II

Brian Schmidt

ESSENTIALS TRACK: Continuation of Essential Game Audio Tech I.

9:30 - 10:30 The Music of Killer Instinct: Season 3

Tom Salta, Klayton (aka Celldweller)

PROFESSIONAL TRACK: This talk will be a traditional postmortem discussion walking the audience through the music of Killer Instinct: Season 3. What makes this panel and soundtrack unique is that it brings together two established artists/producers: Tom Salta (aka Atlas Plug) and Klayton (aka Celldweller). The audience will definitely be engaged by these two dynamic speakers. (We may also have the audio director, Zach Quarles, on the panel.) KI Season 3 features highly interactive music (using Wwise) for eight distinctly different characters in a wide variety of styles, including the Arbiter (from Halo), Kim Wu (from the original KI), Tusk (with a live choir), Mira (with a full live orchestra and soloists) and others. We have already assembled a highly entertaining set of behind-the-scenes videos so you can get a sense of the content we plan to discuss. We will also get into the more technical aspects of how the music was assembled and integrated into Wwise.

9:30 - 10:00 Scalable Acceleration of Real-time Audio Processing Using Hardware-Partitioned GPU Compute Units

Carl Wakeland, AMD


10:05 - 10:35 3ME – A 3D Musical Experience

Andrea F. Genovese, Charles J.P. Craig Jr., Juan Simon Calle, Agnieszka Roginska


9:30 - 10:30 Mixing and Debugging the Audio without the Game

FMOD Session

(Liam de Koster Kjaer) Liam demonstrates powerful features of FMOD Studio that empower distributed development teams. Learn how API Capture can be used to debug, fix and remix game audio captured from a gameplay session, without needing to be connected to the game.

9:30 - 10:30 Introduction to Wwise (Part 1) - Hands-on Quick Start to Game Sound Integration

Wwise Session

See Description: Tuesday


11:00 - 12:00 Integrating Game Audio with Unity

Steve Horowitz & Scott Looney


Ever been interested in best practices for preparing your audio so that the trip from DAW to game is as smooth as possible? Ever wonder how complex it might be to put your sounds in a game yourself? Wonder no more!
In this session, we'll take you through a step-by-step, hands-on approach to integrating sound into the Unity 3D game engine. Our goal is to pull back the curtain, expose the workflow, and take the mystery/magic out of the process. We'll cover a wide range of options, from Unity's internal audio setup to working with audio middleware solutions like FMOD, Wwise, Fabric, and Master Audio.
So join us as we open up the hood and show off the inner workings of sound implementation in Unity. Audio integration can be daunting to sound designers and composers, but we've found that with a little bit of context and hands-on training, just about anyone can begin to develop fairly complex audio behaviors in a short time.

11:00 - 12:00 Beyond Wild Hunt: Musical Evolution of The Witcher Series

Marcin Przybyłowicz


The music of The Witcher 3: Wild Hunt is one of the most praised video game soundtracks of 2015, thanks to its original musical approach and its mixture of Slavic folk with contemporary elements. This talk will take a closer look at how The Witcher's musical style was further developed in Wild Hunt's expansion packs Hearts of Stone and Blood and Wine, as well as in other games from the Witcher universe. The lecture will cover analysis of both the artistic and technical sides of the creative process behind The Witcher soundtracks.

Through practical examples and anecdotes, attendees will be guided through all the steps, missteps and team effort that went into refreshing the musical design of The Witcher series, and will get to know what kind of challenges may await them while working on music for an established franchise.

11:00 - 11:30 Investigating the Impact of Source Spectra on Spatialized Audio Content

Sally Kellaway


11:35 - 12:05 A FullStack Prototype for Designing, Performing and Integrating Expressive Generative Audio for Continuous Interactions

Christian Heinrichs, Andrew McPherson


11:00 - 12:00 Advanced Sound Design Techniques with FMOD Studio

FMOD Session

(Kevin Regamey) Learn about advanced techniques in FMOD Studio that you may not have known about, including event nesting, modulation and automation, scatterer sounds, AHDSRs and more.

11:00 - 12:00 Introduction to Wwise (Part 2) - Game Play Simulation and Mixing/Performance Monitoring and Optimization

Wwise Session

See description Tuesday


12:00 - 1:45 Break and Roundtable

12:45 - 1:45 ROUNDTABLE: Game Audio Education

Roundtable discussion of issues affecting Game Audio Education. Discussion facilitated by Matt Donner

Round-table discussion amongst notable members of the educational community about the challenges and opportunities facing us. Among these are:
- Technology: yesterday, today and tomorrow.
- Approaches to adopting and developing curriculum around "fresh tech".
- "Whom do you serve?" Do we serve students / studios / ourselves? What is the REAL goal of education, and who can really benefit from a school?
- Vocational vs. degree - which one is right for me?
- The role of coding in sound. When is a sound designer a programmer?
- Certifications - a potential marriage between manufacturers and schools?
- The role of quality. Making great sound won't matter on a mobile game that people play with the sound turned down... or does it?
- Challenges in approaching "career converters". How to go from DJ to RTPC?

1:45 - 2:45 Game Audio Business Essentials

ESSENTIALS TRACK: Game music and sound design presents interesting and specific business challenges. This session will cover game audio contracts, developers, income streams, as well as how to create a bid and price your services.

1:45 - 2:45 From Seed to Superhero: A Plants vs Zombies: Heroes Audio Post Mortem

Becky Allen, Luca Fusi (PopCap/EA)


Plants vs Zombies: Heroes is a collectible card game for iOS and Android developed by EA's PopCap studio. Its audio is the storied end product of two years' work to evolve an aesthetic, breach a genre and, hopefully, bring a series' core fanbase into a brave new sonic world that still feels familiar.

In this post-mortem, two of the PvZ: Heroes audio team will speak to lessons learned over the course of the game's development: reinforcing player choice with custom music themes, sound and vocalizations for each Hero and all 300+ collectible cards in the deck; matching mix to player headspace through "emotionally mapped" parameters; sound design and composition techniques discovered while defining the new sound of PvZ; gathering intra-team support for ambitious audio in a mobile dev environment; and doing it all in a title that'd fit comfortably onto the app store.

1:45 - 2:15 Audio Visual Alchemy

Paul Hembree


2:20 - 2:50 Some Possibilities for Totalistic Cellular Automata in Generative Game Music

Isaac Schankler


1:45 - 2:45 Mixing and Mastering your Game in Real-time using FMOD Studio

FMOD Session

(Kevin Regamey) Kevin focuses on the techniques used in the latest games, covering advanced topics such as VCAs, mixer snapshots and sidechaining.

1:45 - 2:45 Dynamic Mixing Tools and Techniques Using Wwise

Wwise Session

(Simon Ashby) The non-linearity of games brings many mixing challenges that require specific tools and techniques which don't exist for other types of media. Today's game platforms provide enough processing power to support advanced audio pipelines incorporating real-time dynamic mixing functionality. Using practical in-game audio examples such as mix snapshots, side-chaining, and HDR audio, this session will demonstrate the many positive benefits that dynamic audio mixing can have on modern sound design.


3:15 - 4:15 How to Think Like a Game Designer

Scott Looney: Soundsmith, Author, Researcher


As we are all well aware, game audio is a different beast compared to audio for linear media. Beyond having lots of assets to keep track of and trigger, and likely working under additional resource limitations, there is the added issue that games can be structured in nearly endless ways, which will affect your approach to the sound design. Understanding this structure is vital to understanding how to design implementation for a particular game. As Richard Stevens, author of The Game Audio Tutorial, says: 'Game audio forces you to think like a game designer.'

In this talk, author, educator, and game audio guy Scott Looney will take you through a tour of different dilemmas and issues encountered in game audio which can be aided by understanding the demarcation and division between what he calls game logic and sound logic within the structure of the game itself. Solutions will be discussed and compared with a focus toward applying them generically using any available audio middleware, and understanding both the needs of the game design as well as the needs of the implementation designer in finding appropriate and efficient methods of triggering and controlling audio events. Being able to work well within the limitations of the game’s design will be an increasingly essential and valuable skill to have in the field.

3:15 - 4:15 Staying Power: How to Have a Long, Thriving Career in Game Audio

MODERATOR: Emily Reese; Michael Bross, Caron Weidner, Adam Gubman, Stephan Schutze



In this session, we seek to explore topics relevant to contract audio developers (composers and sound designers, VO, and contract audio directors) in an effort to demystify the process of professional development when you're on your own. We will highlight the differences between working in and out of house, while offering suggestions, strategies, and anecdotes from our own work experience that will plant the seeds for attendees to cultivate their own booming entrepreneurial selves. Topics will be divided into several sides of the 'audio entrepreneur self'.

Social entrepreneur: we will demonstrate a few different networking styles, discussing what has and hasn't worked for us, and why it can take so long to build a steady client base. The social side will also include how to trace trends in social networks that might lead to business opportunities.

Professional entrepreneur: we will discuss the differences between the career ladder in and out of house, how your current project will influence your next move, and how to understand market needs vs. what you have to offer. We will also touch on how to track financial trends in games in order to determine where to look for your next gig.

Life entrepreneur: this will cover some regular 'life lessons' that touch on self-development, on both the creative and technical sides of audio, while nurturing your uniqueness in order to help you find your own voice and marketing angle. We will also discuss family balance, friends and familial sacrifice, and how to avoid burnout.

3:15 - 3:45 Wait, Where’d It Go? Realistic Representation of Moving Sound Sources in a Virtual Environment

Sam Hughes, PhD Candidate, University of York; The Sound Architect


With new technologies such as Virtual Reality (VR), Augmented Reality (AR) and Mixed Reality (MR) in development and on the market, only a limited amount of research has been conducted into the impact of content delivered on these mediums, especially with regard to audio. This research aims to identify the reasoning behind using binaural representation of audio when trying to recreate realistic auditory environments, as opposed to other delivery formats. Comparing binaural playback over headphones to multichannel audio delivery methods, this presentation explores the differences in localising moving sound sources.

The focus of this presentation is to explore how localisation performance is affected by different playback mediums and how to realistically emulate moving sound sources in a virtual environment. With VR, AR and other new technologies becoming so easily available, it is important we understand the potential effects of our practice. This presentation will make use of comparative research from other screen mediums, other VR research, and experimentation conducted by the presenter. Attendees will gain an understanding of introductory research into the use of moving sound sources in virtual environments, and how this applies to their practice as audio professionals in the game audio industry. This understanding should allow attendees to make informed creative decisions to heighten or control the quality of their work appropriately, and with knowledge of its impact.

3:50 - 4:20 Spatial Audio Modelling Used to Provide Artificially Intelligent Characters with Realistic Sound Perception

Brent Cowan, Bill Kapralos, University of Ontario Institute of Technology


Despite the advancements in artificial intelligence (AI) for video games and virtual environments, non-player characters (NPCs) do not perceive the world in a realistic way. NPCs are often all-knowing (omniscient), meaning that they know where the player is at all times without having to perceive them using one or more of the five senses (hearing, sight, touch, smell, and taste). However, it is not possible for an NPC to realistically simulate human behaviour without limiting the NPC’s knowledge to what it could have perceived by way of its senses. This includes prior knowledge, current sensory input, as well as information received through some form of communication (another NPC, video surveillance, an alarm ringing, etc.).

Many video games feature some form of combat as their core gameplay mechanic (player interaction with the game world), with the NPCs taking on the role of enemy units attempting to converge on the player’s position. An NPC’s visual perception is often simulated by checking for line-of-sight between the NPC and the player’s avatar. Line-of-sight is calculated by casting a ray through the scene starting from the NPC’s head and pointing toward the player’s avatar. If the line between the NPC and player is unobstructed by obstacles in the environment, then the enemy unit has line-of-sight, and the player has been detected. The NPC’s sense of hearing, by contrast, is often ignored completely, or is simply distance-based, without taking the environment into account. To simulate the NPC’s sense of sight, it is important to test for visual occlusion (blocking caused by objects in the environment). Similarly, acoustical occlusion must also be approximated in order to simulate the NPC’s sense of hearing.
Inspired by our previous work, which saw the development of an acoustical occlusion method used to approximate occlusion/diffraction effects for dynamic and interactive virtual environments and games (Cowan and Kapralos, 2015), here we apply this method to NPCs, giving them the ability to “perceive” sounds and therefore behave in a more natural and realistic manner.
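The two perception checks the abstract describes — a line-of-sight ray test for vision, and a hearing test that attenuates sound when the path to the listener is occluded — can be sketched in a few lines. This is a minimal 2D illustration only, not the Cowan and Kapralos method: obstacles are modelled as circles, and the names and numbers (`occluded_penalty`, `threshold`) are illustrative assumptions.

```python
import math

def segment_blocked_by_circle(p0, p1, center, radius):
    """Return True if the segment p0->p1 passes through a circular obstacle."""
    (x0, y0), (x1, y1), (cx, cy) = p0, p1, center
    dx, dy = x1 - x0, y1 - y0
    seg_len2 = dx * dx + dy * dy
    if seg_len2 == 0:
        return math.hypot(cx - x0, cy - y0) <= radius
    # Closest point on the segment to the circle's center.
    t = max(0.0, min(1.0, ((cx - x0) * dx + (cy - y0) * dy) / seg_len2))
    nx, ny = x0 + t * dx, y0 + t * dy
    return math.hypot(cx - nx, cy - ny) <= radius

def has_line_of_sight(npc, target, obstacles):
    """Ray/segment test from the NPC toward the target, as in the abstract."""
    return not any(segment_blocked_by_circle(npc, target, c, r)
                   for c, r in obstacles)

def can_hear(npc, source, loudness, obstacles,
             occluded_penalty=0.5, threshold=0.1):
    """Distance-based hearing, crudely attenuated when the path is occluded."""
    dist = math.hypot(source[0] - npc[0], source[1] - npc[1])
    perceived = loudness / max(dist, 1.0)          # simple 1/d falloff
    if not has_line_of_sight(npc, source, obstacles):
        perceived *= occluded_penalty              # obstacle muffles the sound
    return perceived >= threshold

# A wall (circle at (5, 0)) between NPC and player blocks sight,
# and muffles the sound below the hearing threshold.
wall = [((5.0, 0.0), 1.0)]
print(has_line_of_sight((0, 0), (10, 0), wall))        # False
print(can_hear((0, 0), (10, 0), 1.5, []))              # True  (0.15 >= 0.1)
print(can_hear((0, 0), (10, 0), 1.5, wall))            # False (0.075 < 0.1)
```

In a real engine the segment test would be a physics raycast and the attenuation an occlusion/diffraction model, but the structure — gate the NPC's knowledge behind an explicit perception check — is the same.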

3:15 - 4:15 Adaptive Music with Gordon Ramsay Dash

FMOD Session

(Dren McDonald) Dren gives a real-world demonstration of interactive music using a variety of techniques, including transition timelines.

3:15 - 4:15 3D Audio Implementation with Wwise

Wwise Session

(Simon Ashby)  This session will showcase 3D audio implementation techniques and approaches using Wwise and an R&D project called “Wwise Audio Lab”: a series of game environments designed to help interactive audio designers experiment with various 3D audio methods for VR and desktop.

In this session, you’ll see in Wwise and Unreal how to: 

  • Set up binaural technology plug-ins such as Auro3D Headphones and Oculus.

  • Maintain mix compatibility between “on screen” (over speakers or headphones) and VR media.

  • Compare ambisonics and quad audio files for ambiences, and as an intermediate spatial representation format.

  • Use good practices related to LOD (level of detail) for VR audio.

Finally, an alpha version of Audiokinetic’s “Geometry Informed Reverberation” R&D project will also be presented, in which dynamic early reflections will be compared against standard “static” reverberation systems.


4:15 - 5:00 All Meet in Emerald: Break & Product Giveaways

Product giveaways from GameSoundCon Sponsors: Emerald

5:00 - 6:00 Audio Directors and Composers Panel

MODERATOR: Emily Reese

ALL TRACKS: Paul Lipson, Russell Brower, Becky Allen, Sarah Schachner. Moderated by Emily Reese
