By Joaquin De Losada

Talk Title: Simulating Tropical Weather in ‘Far Cry 6’ Pt.2

Effective talk: Weather simulation in games Part 2

Year of Talk: 2022


This second part exists because a second presenter joined to cover other aspects of building the weather in Far Cry 6. Some of the procedures and materials mentioned here are explained in the previous talk, as well as in the other blog post, “One type of lighting system in games.”


Rendering Features:


Constraints for ‘Far Cry 6’

The game would be released on 9 different platforms.

Xbox One, Xbox One X, Xbox Series S, Xbox Series X, PS4, PS5, PC, Stadia and Luna.

The requirement was 60 FPS on the next-gen consoles and 30 FPS on the last-gen consoles.

A 10 km² open-world area.

Day-Night cycle.

Indoor/outdoor environments

Etc

The engine for Far Cry primarily uses physically based calculations.


Most of the lighting is done through a multi-scattering diffuse model and a bidirectional reflectance distribution function (BRDF) for specular reflections. It is recommended that you also read the blog titled “One type of lighting system in games,” which explains how light can be rendered in the sky, something used extensively in Far Cry.

BRDFs are stored in lookup tables held in 3D textures.

Area lights with linearly transformed cosines also use lookup tables.

A fallback was to use GGX specular and Lambertian diffuse.


Some materials support translucency, specifically vegetation, canopies, and curtains. This was done by wrapping the object with a sub-surface scattering system and then adding another layer that contains a diffuse lobe.


Steve McAuley's talk from 2019 contains more detailed explanations of these improvements.


Global illumination is done with light probes placed by artists to properly light the map. The data for how the map is affected by global illumination is captured and spread across 13 different frames.


11 of these contain variations depending on the time of day, another frame determines local lights and how they affect the local area, and a final frame calculates the sky occlusion.


Setting up a lookup table for humidity and turbidity in the water and sky makes it easier to reference the correct type of sky and lighting for any given occasion.


The game also runs with real-time volumetric clouds which allows for more realistic weather changes over time.


This was also built on work done for previous titles. GDC talks that go more in-depth are the following:

Data setup: [Schneider 2015]

Ray marching: [Schneider 2015][Hillaire 2016]

Checkerboard render [Bauer 2019]

Cloud lighting model:


A certain amount of light is lost when passing through a medium, either through absorption or reflection. This attenuation is described by the Beer-Lambert law. The idea is the same as the multi-scattering diffuse discussed in the other blog post. The talk presents the equation used to calculate how the scattering of light is attenuated in the clouds.
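The Beer-Lambert falloff can be sketched in a few lines of Python (the function name and parameters are illustrative, not from the talk):

```python
import math

def transmittance(density: float, distance: float, extinction: float = 1.0) -> float:
    """Beer-Lambert law: the fraction of light surviving a path through a
    medium, T = exp(-sigma * rho * d), with extinction coefficient sigma,
    particle density rho, and distance travelled d."""
    return math.exp(-extinction * density * distance)

# Denser clouds or longer paths let exponentially less light through.
print(transmittance(0.0, 10.0))  # empty air -> 1.0, nothing absorbed
print(transmittance(0.5, 2.0))   # exp(-1), about 0.37
```

The exponential shape is why even moderately dense clouds quickly become opaque to sunlight.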


Cloud data:

Initially, it is calculated using an in-house tool.


It combines frequency and noise type to help determine the particle density. The engine then samples the texture and blends it to produce better cloud shapes.

Then another weather map is created to help interpolate between different cloud levels.

Using a curl noise texture allows for more randomness to be introduced in the cloud edges.

To calculate how the player sees the light that passes through the clouds, a ray-marching technique is used from the player's point of view. Similar to ray tracing, a ray is stepped toward a cloud and then bounced toward the light.


This can still be quite costly to run every frame, so it is normally done at half resolution.
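A CPU-side sketch of the march, assuming a caller-supplied `density_at` sampler (all names are illustrative; the real engine also runs a secondary march toward the sun per sample):

```python
import math

def march_cloud_ray(ro, rd, density_at, steps=32, step_len=0.5):
    """Step a ray (origin ro, direction rd) through the cloud volume,
    accumulating scattered light and attenuating with Beer-Lambert."""
    transmittance = 1.0
    radiance = 0.0
    for i in range(steps):
        p = [ro[k] + rd[k] * step_len * i for k in range(3)]
        rho = density_at(p)
        if rho <= 0.0:
            continue  # skip empty space
        # Light reaching this sample (secondary sun march collapsed to one tap).
        light = math.exp(-rho * step_len)
        radiance += transmittance * rho * light * step_len
        transmittance *= math.exp(-rho * step_len)
        if transmittance < 1e-3:
            break  # early out once the cloud is effectively opaque
    return radiance, transmittance
```

The early-out once transmittance collapses is one reason half-resolution marching stays affordable.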


This is then rendered using a checkerboard system, where each frame varies which pixels are ray-marched. All of this information is then gathered together and adjusted by various factors to smooth the result. The next part of creating cloud cover was casting shadows on the ground. The solution was an orthographic projection, which allows configurable shadows from clouds as well as volumetric fog.


Volumetric Fog:

Uses a Frustum-aligned volume grid which was previously explained in two talks given by Wronski in 2014 and Hillaire in 2015.

Fog can be set up distinctly depending on local and global fog settings, as well as differences between regions.


The following graphs show how the volumetric fog is rendered.




The pipeline starts by setting up the visuals represented in the first three steps. Fill cells and forward are the principal stages where the volumetric fog is calculated and rendered. During the fill pass, each cell is filled with radiance using the local density and fog particles. Any light sources are then taken into account for how they affect the fog's appearance.


On the last-gen consoles, this is calculated using temporal filtering, which requires more memory but less computational power.

The next step is the sum-cells pass, which integrates the entire fog volume front to back from the camera's perspective.
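The sum-cells pass can be sketched like this, assuming each cell stores a scattering and extinction value plus its depth extent (the tuple layout is an assumption; the engine's real grid is a 3D frustum-aligned volume):

```python
import math

def integrate_fog_cells(cells):
    """Walk frustum-aligned fog cells front to back from the camera,
    accumulating in-scattered light and transmittance per slice.
    Each cell is a (scattering, extinction, depth) tuple."""
    transmittance = 1.0
    in_scatter = 0.0
    out = []
    for scattering, extinction, depth in cells:
        step_t = math.exp(-extinction * depth)
        # Analytic integral of scattering across the cell.
        integ = scattering * (1.0 - step_t) / extinction if extinction > 0 else scattering * depth
        in_scatter += transmittance * integ
        transmittance *= step_t
        out.append((in_scatter, transmittance))  # stored per slice for lookup at shade time
    return out
```

Storing the running totals per slice is what lets any opaque surface later look up "how much fog is in front of me" with a single fetch.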


On the next-gen consoles and PCs, a bilateral blur is also implemented, which helps with read and write speeds while preserving the original texture.

A problem that can occur is that an incorrect sample causes banding in a staircase pattern. The solution found for Far Cry 6 was to blur the shadow map to smooth out the light; combined with downsampling, this smooths over any staircase artifacts.


At the same time, the game can upsample and apply both the fog and clouds simultaneously. There are still certain angles and distances that cause problems due to the engine and system, especially since many variables change across different parts of the map. The chosen fallback is cube maps.


The cube maps store different material data: albedo, normals, and depth. One face of the cube is relit per frame. After this, rough surfaces that need blurry reflections are filtered by sampling the GGX lobe into different MIP levels.


A problem that would occur during overcast weather was that the lighting might not be bright enough and the contrast between cloud and non-cloud areas would be off. According to the presenter, this can happen for a few reasons. Authored areas may contain too much or too little cloud cover, which affects how much of the rest of the sky the player can see. At the same time, if the fog is too dense or the cube maps are incorrectly processed, the screen can look washed out.


Rain (GPU Particles):

A rain system was implemented using a combination of systems that had been previously mentioned in part one (1) of the blog as well as leveraging GPU particle effects to create realistic rain when needed.


The following graphs show how the GPU processes the rain simulation and particles.




To sort all of the particles they use the bitonic sort algorithm, which compares multiple pairs of values in a list at the same time. Depending on which half of the list a pair falls in, the values are arranged in ascending or descending order. This lets many pairs be compared in parallel.


The algorithm runs in O(log² n) parallel passes (O(n log² n) comparisons in total), and its cost is the same in the average and worst case. That fixed cost is somewhat higher than a typical comparison sort's average, but the absence of an expensive worst case lowers the risk of frame-time spikes.
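A CPU sketch of the compare-and-swap network (on the GPU, every pair within a pass is handled by its own thread in parallel):

```python
def bitonic_sort(data):
    """Bitonic sort over a power-of-two-length list. The two outer loops
    are the network's passes; the inner loop's iterations are independent
    of each other and map directly to parallel GPU threads."""
    a = list(data)
    n = len(a)
    assert n & (n - 1) == 0, "bitonic sort needs a power-of-two length"
    k = 2
    while k <= n:          # size of the bitonic sequences being merged
        j = k // 2
        while j > 0:       # compare distance within each sequence
            for i in range(n):
                partner = i ^ j
                if partner > i:
                    ascending = (i & k) == 0
                    if (a[i] > a[partner]) == ascending:
                        a[i], a[partner] = a[partner], a[i]
            j //= 2
        k *= 2
    return a

print(bitonic_sort([3, 7, 4, 8, 6, 2, 1, 5]))  # [1, 2, 3, 4, 5, 6, 7, 8]
```

Note that the sequence of comparisons never depends on the data, which is exactly why the cost is identical in every case.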


Once everything has been sorted, the values are scanned to make it easier to determine when to render certain particles. A prefix-sum algorithm was used for this, since it parallelizes well in a GPU environment. The algorithm repeatedly combines pairs of values: each element adds in the element a fixed offset behind it, producing running totals across the array.
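A sequential sketch of that scan (Hillis-Steele form, which I am assuming here since the talk does not name the exact variant; on the GPU each pass updates all elements at once):

```python
def inclusive_prefix_sum(values):
    """Each pass, element i adds in the element `offset` slots behind it,
    and the offset doubles until it spans the array, yielding running totals."""
    a = list(values)
    offset = 1
    while offset < len(a):
        prev = list(a)  # double-buffer, as a GPU scan would
        for i in range(offset, len(a)):
            a[i] = prev[i] + prev[i - offset]
        offset *= 2
    return a

print(inclusive_prefix_sum([1, 2, 3, 4]))  # [1, 3, 6, 10]
```

With per-bucket particle counts as input, the running totals give each particle its output slot, which is what makes the render-ordering decision cheap.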


Rain Streaks

Initially, the team tested blur, refraction, and reflections, but settled on transparent textures to represent rain streaks. For efficiency, particles are reused around the player, and a 3D noise texture adds randomness to the rain's movement. The weather system is also an important factor in determining the rain's direction. A rain shadow map allows rain droplets to be included or excluded depending on location, since many indoor areas should not be affected by rain.


A color atlas map also served as a lookup table, making it easier to determine where rain could fall and allowing the effect to occur more often.


Rain Splashes

Initially, only an event system existed for when rain hit an object. The effect didn't occur often enough, so a later version spawned more particles near the player to increase the visible splashes.


Rain lighting

Setting up proper per-pixel lighting for the rain would be extremely costly. Instead, a few points in the rain carry a spherical-harmonic light probe that calculates how they are affected by sun, point, and spot lights. This also allows textures to be used to improve the lighting effect.
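Evaluating such a probe is cheap. A sketch of a first-order (four-coefficient) spherical-harmonic evaluation follows; the coefficient layout uses the standard SH basis constants, since the engine's exact storage is not specified in the talk:

```python
def eval_sh_l1(coeffs, direction):
    """Evaluate a first-order SH light probe in a unit direction (x, y, z).
    coeffs = [c00, c1m1, c10, c11]; the constants are the values of the
    standard SH basis functions Y00 and the three Y1 terms."""
    x, y, z = direction
    return (coeffs[0] * 0.282095 +
            coeffs[1] * 0.488603 * y +
            coeffs[2] * 0.488603 * z +
            coeffs[3] * 0.488603 * x)
```

Per rain point this reduces "light from every direction" to one small dot product per color channel, which is why it scales to thousands of streaks.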


Lightning

A linked trail system of particles represents the lightning bolts, with turbulence added down the trail for more variation in each bolt. A new light source was implemented in the clouds to allow for more realistic thunderclouds, and scattering diffuse was added to the clouds and lights to give the effects more shape.


Ocean

The main updates from previous games revolved around supporting more theme types, especially tropical ones. This meant improving the effect that wind direction and Beaufort level (wind speed) have on the ocean. The team had difficulties with tiling the water, and the ocean did not look as good at a distance, so they built an improved tessellation system. The ocean mesh is subdivided more aggressively the closer it is to the camera, and a buffer helps transition between groupings as triangles are subdivided or merged. The different tessellation levels also had to properly integrate the fresh water coming from rivers with the ocean water.


Wave simulations

When calculating waves that appear close to the player, they used world-space fractional Brownian motion (FBM). FBM layers several octaves of noise, each at a higher frequency and lower amplitude than the last, to build up natural-looking detail. Applied to the water surface, it approximates how many overlapping waves of different sizes combine and break up across an area.
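A minimal 1D sketch of FBM (the noise function here is a cheap illustrative stand-in, not the engine's):

```python
import math

def value_noise_1d(x: float) -> float:
    """Toy smooth noise: interpolate pseudo-random values at integer points."""
    def hash01(i: int) -> float:
        return (math.sin(i * 127.1) * 43758.5453) % 1.0
    i, f = math.floor(x), x - math.floor(x)
    t = f * f * (3.0 - 2.0 * f)  # smoothstep weight
    return hash01(i) * (1.0 - t) + hash01(i + 1) * t

def fbm(x: float, octaves: int = 5, lacunarity: float = 2.0, gain: float = 0.5) -> float:
    """Fractional Brownian motion: sum noise octaves, each at double the
    frequency and half the amplitude of the previous one."""
    amplitude, frequency, total = 1.0, 1.0, 0.0
    for _ in range(octaves):
        total += amplitude * value_noise_1d(x * frequency)
        amplitude *= gain
        frequency *= lacunarity
    return total
```

Sampling it in world space, as the talk describes, keeps the octaves anchored to the ocean surface instead of swimming with the camera.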


Meanwhile, waves further away used the Fast Fourier Transform, which produces quicker, more pronounced waves that the player can see more easily. Whitecaps were added to mark the tops of waves and improve the look, which also helped make the ocean appear responsive to wind direction and speed.


Shoreline Wave

All the shoreline waves were made procedurally using the Gerstner wave function, which lets them bend and move convincingly as they reach the shore.
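One Gerstner wave can be sketched like this (parameter names and defaults are illustrative). The key difference from a plain sine wave is the horizontal displacement, which sharpens crests and lets waves lean toward the shore:

```python
import math

def gerstner_wave(pos, t, amplitude=0.5, wavelength=8.0, steepness=0.6,
                  direction=(1.0, 0.0), speed=2.0):
    """Displace a flat grid point (x, y) into a 3D surface point. Points
    move in circles: horizontally by steepness*amplitude*cos(phase),
    vertically by amplitude*sin(phase)."""
    x, y = pos
    k = 2.0 * math.pi / wavelength        # wave number
    dx, dy = direction                    # normalized travel direction
    phase = k * (dx * x + dy * y) - speed * t
    qa = steepness * amplitude
    return (x + qa * dx * math.cos(phase),
            y + qa * dy * math.cos(phase),
            amplitude * math.sin(phase))
```

Summing several such waves with different directions and wavelengths gives the shoreline its shape; the steepness term is what makes a wave curl rather than roll.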


Tree bending

The game implemented more of the trees' ability to bend according to the wind strength in the area. Tools were set up that give artists extra control over how much trees can bend, among other factors.


At times, if a tree bent outside its bounding box it would be culled, as it could become too expensive to render and calculate.

