Planet Zoo features a dynamic (fast-moving) player-controlled camera and user-generated content: terrain, objects and environments can be freely manipulated and changed. To emphasize those changes, early reflection, occlusion and filtering of environmental audio and weather are informed by the position of the camera and by ray-casting against the surrounding environment in real time. It is a performant system (as opposed to one that fully simulates propagation), designed to make the player's changes audible.
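A minimal sketch of that idea, with an illustrative toy raycast rather than the engine's actual query: a fan of rays cast from the camera measures how enclosed the current position is, and that single value drives early-reflection and occlusion parameters.

```cpp
#include <cmath>
#include <cstdio>
#include <functional>
#include <optional>
#include <vector>

// Illustrative only: a hypothetical engine raycast is passed in as a
// callback returning the hit distance, or nothing on a miss.
struct Vec3 { float x, y, z; };
using Raycast = std::function<std::optional<float>(Vec3 origin, Vec3 dir, float maxDist)>;

struct EnvAudioParams {
    float enclosure;              // 0 = open field, 1 = fully boxed in
    float earlyReflectionDelayMs; // longer pre-delay when surfaces are far
    float occlusionLpCutoffHz;    // darker filtering when surrounded
};

EnvAudioParams probeEnvironment(Vec3 camPos, const std::vector<Vec3>& dirs,
                                const Raycast& raycast, float maxDist = 50.0f)
{
    float sumDist = 0.0f;
    int hits = 0;
    for (Vec3 dir : dirs) {
        if (auto d = raycast(camPos, dir, maxDist)) { sumDist += *d; ++hits; }
    }
    EnvAudioParams p;
    p.enclosure = dirs.empty() ? 0.0f : hits / float(dirs.size());
    float avgDist = hits ? sumDist / hits : maxDist;
    p.earlyReflectionDelayMs = (2.0f * avgDist / 343.0f) * 1000.0f; // out and back at the speed of sound
    p.occlusionLpCutoffHz    = 20000.0f - 18000.0f * p.enclosure;
    return p;
}

int main()
{
    // Toy scene: nothing but a flat ground plane at y = 0 to hit.
    Raycast groundOnly = [](Vec3 o, Vec3 d, float maxDist) -> std::optional<float> {
        if (d.y >= 0.0f) return std::nullopt; // ray points up or sideways: no hit
        float t = -o.y / d.y;                 // distance to the plane
        if (t > maxDist) return std::nullopt;
        return t;
    };
    std::vector<Vec3> fan = {{1,0,0}, {-1,0,0}, {0,0,1}, {0,0,-1}, {0,-1,0}, {0,1,0}};
    EnvAudioParams p = probeEnvironment({0, 10, 0}, fan, groundOnly);
    printf("enclosure %.2f, pre-delay %.1f ms, LP cutoff %.0f Hz\n",
           p.enclosure, p.earlyReflectionDelayMs, p.occlusionLpCutoffHz);
}
```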
This audio show-reel highlights the work of the audio team. It has not been endorsed by Frontier Developments or Universal Pictures.
It is all of equal importance, but it doesn't always align with the expectations of modern game audio. For example, where the game requires variations to avoid listener fatigue, nostalgia leans towards familiar, repetitive sounds.
For the audio team this was a blast of a puzzle to solve: figuring out what to add, what to leave alone and what to update. Senior audio designer Duncan MacKinnon and I spent the first three months on Jurassic World Evolution trying to answer those questions. We experimented with substitution, re-creation, layering and morphing to find a combination that could add behaviours not seen in the films, yet sound like a natural extension of the existing sonic signature. At first we added layers of new recordings to achieve this, but we ended up dropping most of them: updating too much detracted from the nostalgia requirement. Instead we relied on techniques that take the original sound as a starting point, adding sweeteners, effects, mastering and creative mixing techniques such as morphing.
On JW:E the audio direction ended up outlining a diegetic framework. Location is intrinsically linked to the user-controlled camera, so defining what the camera represents was a logical first step. We could have approached the camera as a "god-like" figure that requires presence, or as a detached strategic viewpoint that requires exaggerated audio cues for feedback.
To build a cohesive soundscape, we needed an answer that could anchor to the flow of gameplay, so it helps to explain how the game-loop works in our games.
In the strategy and management games that Frontier Developments builds, the context switches between observing, managing feedback and building accordingly. Sometimes that happens while interacting with the diegetic game-world, and sometimes through non-diegetic full-screen menu overlays.
To define the camera, I took guidance from the full-screen overlays, as UI ties most of the gameplay flow together anyway. Unlike the camera, the diegetic crowds, dinosaurs and weather can't be directly controlled, but they can be read as a "living" diegetic UI.
The audio story thus goes that UI overlays are an observation made from a physical command room while looking at screens and readouts. This translates into an audio direction where, while menu screens are active, a hint of command-room ambience is present, becoming encompassing (with bespoke music) when entering full-screen overlays. In the command-room ambience, the beeping of readouts and distant chatter give those menus an air of presence. It's not necessarily visualised to the player as such, but the impressionistic audio contextualizes the gradual transitions from diegetic and half-covering UIs to full-screen overlays.
Depending on how much of the screen a menu occupies (from subtle side-bar to full-screen overlay), more "room reverb" is added to UI sounds. This is most pronounced when transitioning from diegetic visuals with small menu interfaces to full-screen overlays.
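As a sketch (the response curve and names below are my assumptions, not the shipped implementation), the reverb send can be driven directly by the fraction of the screen the menu covers:

```cpp
#include <algorithm>
#include <cstdio>

// Illustrative mapping: more screen coverage -> more "command room" reverb
// sent with UI sounds.
float uiReverbSend(float screenCoverage) // 0 = tiny side-bar, 1 = full overlay
{
    float t = std::clamp(screenCoverage, 0.0f, 1.0f);
    // Ease-in so small panels stay almost dry and the step up to a full
    // overlay is the most audible transition.
    return t * t;
}

int main()
{
    printf("side-bar  : %.2f\n", uiReverbSend(0.15f));
    printf("half panel: %.2f\n", uiReverbSend(0.5f));
    printf("full menu : %.2f\n", uiReverbSend(1.0f));
}
```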
When it rains, a muffled weather layer is heard while browsing those full-screen menus, so the storm is also present where the "command room" is. Even though that room isn't accessible to the player in the diegetic world, audio and UI screens link the experiences together.
A similar diegetic approach solved the challenge of skippable cinematics that play when releasing a new dinosaur. The cinematic presentation features bespoke music for each dinosaur, and if the player skips the sequence, the camera returns to observing the dinosaur from the previous "command room" perspective. Using the diegetic framework, I suggested switching the music from a non-diegetic stereo presentation to playing from the building, as if the music had been diegetic all along. The camera can then "move away" from the music, fading it out.
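A sketch of that handover under a simple linear distance-attenuation assumption (the fade time and roll-off values are illustrative): on skip, the stereo bed crossfades to a 3D emitter at the building, after which camera distance alone controls the music level.

```cpp
#include <algorithm>
#include <cmath>
#include <cstdio>

struct Vec3 { float x, y, z; };

static float dist(Vec3 a, Vec3 b)
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}

// Gain of the now-diegetic music versus camera distance (linear roll-off;
// the reference and max distances are placeholder values).
float diegeticGain(Vec3 cam, Vec3 emitter, float refDist = 10.0f, float maxDist = 120.0f)
{
    float d = dist(cam, emitter);
    return 1.0f - std::clamp((d - refDist) / (maxDist - refDist), 0.0f, 1.0f);
}

struct MusicGains { float stereo; float emitter; };

// Handover at skip time: blend from the 2D stereo bed to the 3D emitter
// over fadeSecs, then the emitter alone carries the music.
MusicGains musicGains(float timeSinceSkip, Vec3 cam, Vec3 emitter, float fadeSecs = 1.0f)
{
    float t = std::clamp(timeSinceSkip / fadeSecs, 0.0f, 1.0f);
    return { 1.0f - t, t * diegeticGain(cam, emitter) };
}

int main()
{
    Vec3 building{0, 0, 0};
    MusicGains mid  = musicGains(0.5f, {30, 0, 0},  building); // mid-crossfade, camera near
    MusicGains away = musicGains(2.0f, {100, 0, 30}, building); // fade done, camera leaving
    printf("mid-fade: stereo %.2f emitter %.2f\n", mid.stereo,  mid.emitter);
    printf("far away: stereo %.2f emitter %.2f\n", away.stereo, away.emitter);
}
```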
Vehicles came mid-way through production, and I wanted to explore music playing from their location. As vehicles drive from fields and forests into built-up areas, reverbs and reflections change accordingly. A vehicle with music playing from it highlights those transitions and emphasizes the player-created environment through music propagation.
The team ran with it: Pablo Cañas Llorente, Janesta Boudreau and James Stant managed to license a surprisingly large and varied playlist. From those efforts a radio station was created, and Pablo and James wrote and recorded a Spanish-speaking DJ broadcasting from the faraway shore of Costa Rica. This is another example of using "location" to inform audio direction. To emphasize the distance of the islands from the continent and make the broadcast feel physically real, I used game parameters related to storms to filter the radio music. Before a storm hits the island the radio begins to crackle and break up, and once the storm fully engulfs it, only static is left. The storm is part of a larger world through which the radio signal has to travel.
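One possible mapping, sketched below (the parameter names are mine, not the game's): a single storm-intensity value closes a low-pass filter on the music, raises a crackle layer that peaks while the storm approaches, and crossfades to pure static once the island is engulfed.

```cpp
#include <algorithm>
#include <cstdio>

// Hypothetical mix parameters derived from the storm system.
struct RadioMix {
    float musicLpCutoffHz; // signal dulls as the storm builds
    float crackleLevel;    // breakup peaks on approach, then drowns out
    float staticLevel;     // all that's left once the storm engulfs the island
};

RadioMix radioFromStorm(float stormIntensity) // 0 = clear skies, 1 = engulfed
{
    float t = std::clamp(stormIntensity, 0.0f, 1.0f);
    RadioMix m;
    m.musicLpCutoffHz = 16000.0f - 15200.0f * t; // 16 kHz down to 800 Hz
    m.crackleLevel    = 4.0f * t * (1.0f - t);   // parabola peaking mid-approach
    m.staticLevel     = t * t;                   // dominates at full intensity
    return m;
}

int main()
{
    for (float t : {0.0f, 0.5f, 1.0f}) {
        RadioMix m = radioFromStorm(t);
        printf("storm %.1f: LP %.0f Hz, crackle %.2f, static %.2f\n",
               t, m.musicLpCutoffHz, m.crackleLevel, m.staticLevel);
    }
}
```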
In part because of this diegetically informed audio direction, our work underpins the setting and adds to the world-building of Jurassic World Evolution.
Jurassic World Evolution reaching Steam global top-seller status on the day of release.