

Department of Computer Science and Engineering
CHALMERS UNIVERSITY OF TECHNOLOGY
UNIVERSITY OF GOTHENBURG
Gothenburg, Sweden 2016

Convincing Cloud Rendering
An Implementation of Real-Time Dynamic Volumetric Clouds in Frostbite
Master's thesis in Computer Science – Computer Systems and Networks

RURIK HÖGFELDT


Convincing Cloud Rendering – An Implementation of Real-Time Dynamic Volumetric Clouds in Frostbite
Rurik Högfeldt
© Rurik Högfeldt 2016

Supervisor: Erik Sintorn, Department of Computer Science and Engineering
Examiner: Ulf Assarsson, Department of Computer Science and Engineering

Computer Science and Engineering
Chalmers University of Technology
SE-412 96 Gothenburg
Sweden
Telephone +46 (0)31 772 1000

Department of Computer Science and Engineering
Gothenburg, Sweden, 2016


Abstract

This thesis describes how real-time, realistic and convincing clouds were implemented in the game engine Frostbite. The implementation focuses on rendering dense clouds close to the viewer while still supporting the old system for high-altitude clouds. The new technique uses ray marching and a combination of Perlin and Worley noises to render dynamic volumetric clouds. The clouds are projected into a dome to simulate the shape of a planet's atmosphere. The technique can render from different viewpoints, and the clouds can be viewed from below, inside and above the atmosphere. The final solution is able to render realistic skies with many different cloud shapes at different altitudes in real-time, with completely dynamic lighting.


Acknowledgements

I would like to thank the entire Frostbite team for making me feel welcome and giving me the opportunity to work with them. I would especially like to thank my two supervisors at Frostbite, Sebastien Hillaire and Per Einarsson, for their help and guidance throughout the project.

I also want to thank Marc-Andre Loyer at BioWare for his help and support with the development and implementation.

Finally, I would also like to thank my examiner Ulf Assarsson and supervisor Erik Sintorn at Chalmers University of Technology for their help and support.


Contents

1 Introduction
  1.1 Motivation
  1.2 Goal
    1.2.1 Approach
  1.3 Report Structure

2 Cloud Physics
  2.1 Types
  2.2 Behavior
  2.3 Lighting
    2.3.1 Absorption
    2.3.2 Scattering
    2.3.3 Extinction
    2.3.4 Transmittance
    2.3.5 Emission
    2.3.6 Radiative Transfer Equation

3 Related Work

4 Implementation
  4.1 Integration into Frostbite Sky Module
    4.1.1 Cloud Module
    4.1.2 Cloud Shader
  4.2 Noise Generation
    4.2.1 Perlin Noise
    4.2.2 Worley Noise
  4.3 Cloud Textures
  4.4 Weather Texture


    4.4.1 Precipitation
    4.4.2 Wind
  4.5 Shape Definition
    4.5.1 Density Modifiers
    4.5.2 Detailed Edges

5 Rendering
  5.1 Ray Marching
  5.2 Render From Inside Atmosphere and Space
  5.3 Dome Projection
  5.4 Height Signal
  5.5 Lighting
    5.5.1 Phase Function
    5.5.2 Shadows
    5.5.3 Ambient

6 Optimization
  6.1 Temporal Upsampling
  6.2 Early Exit
    6.2.1 Low Transmittance
    6.2.2 High Transmittance

7 Results
  7.1 Resources
  7.2 Performance
  7.3 Visual Results

8 Discussion and Conclusion
  8.1 Limitations
  8.2 Ethical Aspects

9 Future work

10 References

A Additional Visual Results

B Artist Controls


1 Introduction

Real-time rendering of realistic and convincing cloud scenes has always been a desired feature in computer graphics. Realistic and convincing cloud scenes are not only the result of light scattering in participating media but also the result of dynamic clouds that can evolve over time, cast shadows and interact with their environment. Many cloud rendering techniques have been developed over the years and are still being researched. Rendering realistic and convincing cloud scenes remains a difficult task, as clouds are not only volumetric and dynamic but also require a complex light transport model.

The cloud system presented in this thesis is inspired by a recent technique developed by Schneider and Vos [AN15]. The main additions and changes to this technique are a different, unified height signal and control. The amount of resources required has also been reduced.

1.1 Motivation

As computing power increases, so does the possibility of using techniques in real-time that were previously only suited for offline rendering. The area of cloud lighting and rendering is well researched in computer graphics. Many different techniques for rendering realistic clouds have been developed over the years, but they are often not scalable enough to be used for real-time rendering in a game context. One common solution is to use impostors or panoramic textures that are applied to the sky. This can produce realistic high-definition clouds, but it is a very static solution. Using it within a framework featuring dynamic time of day and animations would be very difficult.


These two-dimensional solutions are also not well suited for clouds close to the viewer, as such clouds will appear flat and not give a realistic representation. However, these techniques might be suitable for thin, layered clouds far away from the viewer.

1.2 Goal

The goal of this thesis was to investigate and implement a technique for rendering realistic and convincing clouds in real-time. The rendering technique should be able to produce a vast amount of clouds that differ in shape, type and density, while still being dynamic and evolving over time, although the focus of the technique is on dense clouds close to the viewer. It should be easy for an artist to control via a weather system that can model the clouds' shape and behavior. The clouds should also be rendered into a spherical atmosphere, which allows them to bend over the horizon. It is also important that clouds can receive shadows from their environment, both from the planet and from other clouds.

1.2.1 Approach

The approach was to investigate a different solution that can render more interesting and complex cloud scenes. This was done by studying recent advances in volumetric rendering of clouds and seeing if and how they could be improved and applied in a game context. The solution was evaluated both by the time it takes to render and by the amount of resources required. In order to evaluate how realistic the clouds are, actual photographs are used for comparison.

1.3 Report Structure

This report has the following structure. In Section 2 different cloud types, their lighting, their behavior and how they appear are described. Section 3 introduces the reader to previous research on rendering clouds by other authors. In Section 4 the implementation and modeling of these clouds are covered. In Section 5 rendering is described. Section 6 covers optimizations needed for achieving real-time performance. Section 7 presents results from this rendering technique with and without optimizations; this section also offers a visual comparison between in-game renders and photographs. Section 8 contains a discussion and conclusion of the results and the limitations of this implementation. Section 9 covers proposed future work. Finally, Appendix A contains additional visual results and Appendix B describes the exposed controls.


2 Cloud Physics

Water vapor is invisible to the human eye, and it is not until water condenses in air that clouds appear. This section provides a short introduction to different cloud types and how they are named. It also covers behavior, physical properties and the simplifications made when modeling clouds. Finally, it offers a section about lighting and radiative transfer in participating media.

2.1 Types

Clouds can appear in many different shapes and variations. Most cloud types are named after a combination of their attributes. Attributes that refer to altitude are Cirrus for high altitude and Alto for mid altitude. The attribute Cumulus is used for clouds that have a puffy appearance. Clouds that have the attribute Stratus appear as layers. The last common attribute is Nimbus, which is used for clouds with precipitation. In Figure 2.1 common cloud types and their names are displayed.

2.2 Behavior

When modeling clouds it is important to follow their physical behavior for them to look realistic. Rendering clouds without taking physical behavior into account will not yield convincing and realistic results [BNM+08]. Temperature and pressure are key components of how clouds form and behave. As water vapor rises with heat into the atmosphere, where it is colder, the water condenses and forms clouds. Air temperature decreases with altitude, and since saturation vapor pressure strongly decreases with temperature, dense clouds are generally found at lower altitudes.


Figure 2.1 – Common cloud types and their names (image credit: Valentin de Bruyn)

Rain clouds appear darker than others, which is the result of larger droplet sizes: larger droplets absorb more and scatter less light [Mh11].

Wind is another force that drives clouds and is caused by differences in pressure between different parts of the atmosphere. Clouds can therefore have different wind directions at different altitudes. Since our focus is low-altitude clouds close to the viewer, we assume that all of these clouds move in the same wind direction. Cloud behavior is quite complex, so a few simplifications were made. Some of these simplifications are shown in the list below.

• Cloud density: The density inside a cloud increases with altitude but is independent of where in the atmosphere the cloud appears.

• Wind direction: Since the focus is on low-altitude clouds, we only take one wind direction into account.

• Droplet size: In our cloud model we assume that clouds always have the same droplet size and that only the density varies.

• Precipitating clouds: Instead of modeling precipitating clouds with different droplet sizes, we increase the absorption coefficient of these clouds.

• Atmosphere shape: We assume that the atmosphere can be treated as a perfect sphere instead of an oblate spheroid.


• Direction to sun: We assume that the direction to the sun can be treated as parallel within the atmosphere.

2.3 Lighting

This section covers the behavior of light traveling through participating media. Real-world clouds do not have a surface that reflects light. Instead, light travels through them, and photons interact with particles that may absorb or scatter them, which causes a change in radiance. There are four different ways radiance may change in participating media, illustrated in Figure 2.2: absorption, in-scattering, out-scattering and emission [Jar08].

Figure 2.2 – The four ways light can interact with participating media. From left to right: Absorption, Out-scatter, In-scatter and Emission

2.3.1 Absorption

The absorption coefficient σa is the probability that a photon is absorbed when traveling through participating media. When a photon is absorbed it causes a change in radiance by transforming light into heat. The reduced radiance due to absorption at position x, when a light ray of radiance L travels along ~ω, is given by Equation 2.1.

e^{-\sigma_a(x)\,dt}\,L(x, \vec{\omega})    (2.1)

Rain clouds are generally darker because they absorb more light. This is because rain clouds have a higher presence of larger water droplets, which are more effective at absorbing light.


2.3.2 Scattering

Radiance may increase due to in-scattering or decrease due to out-scattering. The coefficient σs is the probability that a photon will scatter when traveling through participating media. The increased radiance due to in-scattering is shown in Equation 2.2. In this equation P(x, ~ω) is a phase function, which determines the out-scatter direction relative to the light direction ~ω. Many different phase functions exist and are suitable for different types of participating media. A phase function can scatter light uniformly in all directions, as the isotropic phase function does, or scatter light differently in the forward and backward directions. A phase function for clouds can be very complex, as seen in Figure 2.3, which shows the one used in [BNM+08].

\sigma_s(x)\,L_i(x, \vec{\omega}), \qquad L_i(x, \vec{\omega}) = \int_{\Omega_{4\pi}} P(x, \vec{\omega}')\,L(x, \vec{\omega}')\,d\vec{\omega}'    (2.2)

Figure 2.3 – Plot of a Mie phase function used for clouds in [BNM+08]

Clouds are generally white because they scatter light independently of wavelength, as opposed to atmospheric scattering, which scatters blue wavelengths more than others.

2.3.3 Extinction

The extinction coefficient σt is the probability that a photon traveling through a participating medium interacts with it. The probability that a photon interacts, and therefore causes a reduction in radiance, is the sum of the probabilities of the photon either being absorbed or out-scattered, as described in Equation 2.3.

σt = σa + σs (2.3)


2.3.4 Transmittance

Transmittance Tr is the fraction of photons that travel unobstructed between two points along a straight line. The transmittance can be calculated using the Beer–Lambert law as described in Equation 2.4.

T_r(x_0, x_1) = e^{-\int_{x_0}^{x_1} \sigma_t(x)\,dx}    (2.4)

2.3.5 Emission

Emission is the process of increased radiance due to other forms of energy being transformed into light. The increased radiance because of emission at a point x along a ray ~ω is denoted by Le(x, ~ω). Clouds do not emit light unless a light source is placed inside them.

2.3.6 Radiative Transfer Equation

By combining the equations for the four ways light can interact with participating media, it is possible to derive the radiative transfer equation through the law of conservation of energy. The radiative transfer equation is shown in Equation 2.5 and describes the radiance at position x along a ray with direction ~ω within a participating medium [Cha60].

L(x, \vec{\omega}) = T_r(x, x_s)\,L(x_s, -\vec{\omega}) + \int_0^s T_r(x, x_t)\,L_e(x_t, -\vec{\omega})\,dt + \int_0^s T_r(x, x_t)\,\sigma_s(x_t)\,L_i(x_t, -\vec{\omega})\,dt    (2.5)


3 Related Work

Rendering clouds is a well-researched area with many recent breakthroughs and ideas for how to render realistic clouds. Hufnagel et al. presented in [HH12] a comprehensive survey of research and development in rendering and lighting of clouds. In this survey the authors compare different techniques and weigh them against each other. Some of the techniques covered are billboards, splatting, volume slicing and ray marching. A table matching each technique with suitable cloud types is presented, making it easy for the reader to compare.

One recent technique is a cloud system developed by Schneider and Vos [AN15] that is going to be used in the game Horizon Zero Dawn. By using a combination of Worley and Perlin noise together with ray marching, the authors manage to render very realistic cumulus-shaped clouds in real-time under dynamic lighting conditions.

Another recent cloud rendering technique was developed by Egor Yusov [Yus14]. This technique uses pre-computed lighting and particles to render realistic cumulus clouds. However, it depends on a feature called Pixel Synchronization to provide volume-aware blending, which is only available on Intel HD graphics hardware [Sal13]. Volume-aware blending could be implemented using rasterizer ordered views, a new feature in DirectX 12, but since our implementation must work on all three platforms, PC, PlayStation and Xbox, this technique is not suitable for our use case.


Bouthors et al. propose in [BNL06] a technique for real-time realistic illumination and shading of stratiform clouds. By using an advanced lighting model, where they account for all light paths and preserve anisotropic behavior, as well as a physically based phase function, they manage to render realistic clouds at 18-40 fps. This technique is limited to a few cloud types and is mostly suitable for stratiform clouds.

In [BNM+08], Bouthors et al. propose a technique for simulating interactive multiple anisotropic scattering in clouds. With this technique, using a similar approach to [BNL06], they also manage to render very realistic lighting of detailed cumulus-type clouds, at 2-10 fps.


4 Implementation

This section covers how our new technique was implemented and added to the system. It also covers the resources required, how they are used and how they are generated.

4.1 Integration into Frostbite Sky Module

Frostbite has several different sky modules that can be used for rendering skies. Our implementation is focused on a sky module called Sky Physical [BS14]. This section provides an overview of the new technique and how it is implemented. First we describe how our cloud module was added to the system. The cloud module is responsible for maintaining results from our shader and providing input to it. Then a short overview of how the shader works follows.

4.1.1 Cloud Module

A system diagram of the technique used for rendering clouds before this implementation is shown in Figure 4.1. The clouds were rendered by applying static textures, called panoramic and cloud layer textures, to the sky. This technique can produce convincing skies, but it is limited to distant clouds because clouds close to the viewer will appear flat.

In order to produce realistic clouds close to the viewer, a dynamic cloud rendering technique was needed. The static cloud texture solution was only suitable for distant clouds under static lighting conditions. The new solution should not only produce dynamic clouds but also improve the previous cloud lighting model by handling dynamic lighting conditions.


Therefore a cloud module was developed, as shown in Figure 4.2. This module can be added by artists if needed for a scene. The cloud module renders to a texture which is then provided to the sky module. This texture is rendered once every frame, providing a dynamic solution for rendering clouds. Since the new implementation is focused on rendering clouds close to the viewer, the old cloud layer system can still be used for adding high-altitude clouds.

Figure 4.1 – Diagram showing how clouds were rendered before: cloud layer textures and a panoramic texture are applied to the sky by the sky module

Figure 4.2 – Diagram showing how the cloud module was added: the cloud module renders a volumetric cloud texture which, together with the cloud layer textures and the panoramic texture, is applied to the sky by the sky module

4.1.2 Cloud Shader

We also created a single shader to use with our cloud module. This shader is dispatched from the cloud module and uses ray marching together with different noise textures to render volumetric clouds. The following Section 4.2 covers these noises, and the shader is covered in detail in Section 5.


4.2 Noise Generation

This section covers the different noises that are used to create the cloud shapes and how they are generated. A combination of Perlin and Worley noises is used to create the cloud shapes. We pre-generate these noises into two different three-dimensional textures on the CPU and then use them in the shader.

4.2.1 Perlin Noise

In 1985 Ken Perlin presented a technique for generating natural-looking noise [Per85]. Since then this technique has been widely used to generate noise for many natural phenomena, including clouds. We generated a tiling three-dimensional texture using a program developed by Stefan Gustavson [Gus05]. Figure 4.3 shows a three-dimensional tiling texture with this noise mapped onto a cube.

Figure 4.3 – Tiling 3D Perlin noise applied to a cube

4.2.2 Worley Noise

Steven Worley described a technique for generating cellular noise in [Wor96]. Figure 4.4 shows a cube with three-dimensional tiling Worley noise. We use this noise type to create both wispy and billowy cloud shapes. By inverting the Worley noise it is possible to switch the appearance between wispy and billowy and vice versa. The algorithm for creating a texture with this noise can be quite simple. A naive approach would be to generate a set of points called feature points and then shade every texel by its distance to the closest feature point. Generating the noise this way would be very slow, especially in three dimensions. Therefore the naive approach was optimized as described in the steps below; a code sketch follows the list.



1. Subdivide a cuboid into equally sized cells.

2. For each cell, randomly place a feature point inside it. There must be exactly one feature point per cell.

3. For each point inside the cuboid, color it by the Euclidean distance to the closest feature point. This distance is found by evaluating the feature point inside each of the 26 surrounding cells and the feature point inside the current cell. By wrapping cells at the edges, the texture becomes tileable in all three dimensions.
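To make the procedure concrete, the following is a minimal Python sketch of tileable 3D Worley noise following these three steps. It is only an illustration under stated assumptions (pure standard library, brute-force loops, hypothetical function and parameter names); it is not the code used in the Frostbite implementation.

import math
import random

def tiling_worley_3d(texture_size=32, cells=4, seed=0):
    rng = random.Random(seed)
    cell_size = texture_size // cells
    # Step 2: exactly one random feature point per cell (in texel coordinates).
    points = {}
    for i in range(cells):
        for j in range(cells):
            for k in range(cells):
                points[(i, j, k)] = ((i + rng.random()) * cell_size,
                                     (j + rng.random()) * cell_size,
                                     (k + rng.random()) * cell_size)
    tex = [[[0.0] * texture_size for _ in range(texture_size)] for _ in range(texture_size)]
    max_d = 0.0
    for x in range(texture_size):
        for y in range(texture_size):
            for z in range(texture_size):
                cx, cy, cz = x // cell_size, y // cell_size, z // cell_size
                best = float("inf")
                # Step 3: check the current cell and its 26 neighbours,
                # wrapping the cell indices at the edges so the result tiles.
                for dx in (-1, 0, 1):
                    for dy in (-1, 0, 1):
                        for dz in (-1, 0, 1):
                            ni, nj, nk = cx + dx, cy + dy, cz + dz
                            px, py, pz = points[(ni % cells, nj % cells, nk % cells)]
                            # Translate the wrapped feature point next to the texel.
                            px += (ni - ni % cells) * cell_size
                            py += (nj - nj % cells) * cell_size
                            pz += (nk - nk % cells) * cell_size
                            best = min(best, math.dist((x, y, z), (px, py, pz)))
                tex[x][y][z] = best
                max_d = max(max_d, best)
    # Normalise to [0, 1]; inverting (1 - v) gives the billowy variant.
    return [[[v / max_d for v in row] for row in plane] for plane in tex]

Different octaves, as discussed below, can be produced simply by calling the function with different cell counts (for example 2, 4, 8 and 16).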

Figure 4.4 – Tiling 3D Worley noise applied to a cube

Since we place exactly one feature point inside each cell, there will not be any large dark areas, which could be the case if the feature points were positioned completely at random. In Figure 4.4 a cube of size 128³ and a cell size of 16 is used, which yields exactly 8 feature points in any direction through the cube. The result is cellular noise that looks random without having any large dark areas due to feature points being too far apart.

This algorithm can generate different octaves of Worley noise by changing the cell size. An octave is half or double the current frequency and appears at intervals of 2^n. For example, generating Worley noise with 4 octaves and a cell size of 2 as the starting frequency makes the following three octaves 4, 8 and 16.


4.3 Cloud Textures

The noises are pre-generated and stored in two different three-dimensional tiling textures as described in Table 4.1. Figure 4.5 shows an example of both the shape and the detail texture. The first three-dimensional texture is used to create the base shape of the clouds. It has four channels: one with Perlin noise and three with different octaves of Worley noise. Our clouds repeat less along the y axis, and the size in this axis is therefore smaller in order to reduce texture size. The second three-dimensional texture is used for adding details and has three channels with different octaves of Worley noise.

Table 4.1 – Noise textures used to create the cloud shapes

Texture | Size       | R      | G      | B      | A
Shape   | 128x32x128 | Perlin | Worley | Worley | Worley
Detail  | 32x32x32   | Worley | Worley | Worley | -

Figure 4.5 – The two cloud textures with channels as described in Table 4.1. Left: Shape. Right: Detail

4.4 Weather Texture

Clouds are controlled by a repeating two-dimensional texture with three channels called the weather texture. This texture is repeated over the entire scene and is also scaled to avoid noticeable patterns in cloud presence and shape. The coverage, which is also used as density, is controlled by the red channel of this texture. It is also possible to manipulate the coverage, for example by modifying it based on the distance to the viewer, which makes it possible to either increase or decrease cloud presence at the horizon. The green channel controls the height of the clouds.


A value of 0 will give the cloud a height of 0, making it invisible, and a value of 1 gives the cloud the maximum height. The blue channel controls the altitude at which clouds appear. A value of 0 yields a start altitude equal to the start of the atmosphere, and a value of 1 makes the cloud appear at the maximum cloud layer altitude. These two maximum values are exposed as settings in our module and can easily be controlled at run-time. Table 4.2 describes the different channels of the weather texture.
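As a hedged illustration of how these channels could be decoded, a minimal sketch is shown below. The names max_cloud_height and max_cloud_altitude stand in for the exposed module settings mentioned above and are not the actual Frostbite parameter names.

def decode_weather(weather_rgb, atmosphere_start, max_cloud_height, max_cloud_altitude):
    # weather_rgb is an (r, g, b) tuple sampled from the weather texture, each in [0, 1].
    coverage, height_frac, altitude_frac = weather_rgb
    height = height_frac * max_cloud_height                  # g = 0: no cloud, g = 1: maximum height
    altitude = atmosphere_start + altitude_frac * (max_cloud_altitude - atmosphere_start)
    return coverage, height, altitude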

Figure 4.6 shows an example of a weather texture used during the implementation. The overall yellow tone is due to the coverage and height channels having similar values. The blue channel, which controls altitude, is set to zero except for four different clouds. Clouds using this weather texture will therefore appear at five different altitudes.

Table 4.2 – Channel description of the weather texture

R        | G      | B        | A
Coverage | Height | Altitude | -

Figure 4.6 – Example of a weather texture and the resulting clouds. Left: Weather texture. Right: In-game render using this weather texture.

4.4.1 Precipitation

Larger water droplets absorb more light and have a higher presence at the base of clouds. Since we assume a uniform droplet size, it is not possible to achieve this by only changing the absorption coefficient. This is even more complicated since density also increases with altitude, which makes the bottom part of a cloud absorb less than the top, the opposite of what is expected.


Therefore our rain clouds are darkened by a gradient which is multiplied with the light contribution. With this simplification all clouds can have the same absorption coefficient while the technique is still able to produce both rain and non-rain clouds.

4.4.2 Wind

In the real world, clouds can move in different directions at different altitudes. Our technique is limited to a single wind direction, applied as an offset to the weather texture. However, since we still support the old cloud layer system, each layer can have wind applied in a different direction. It is therefore possible to have high-altitude clouds move in a different direction than low-altitude clouds.

4.5 Shape Definition

This section covers how cloud shapes are defined from the noises described in Section 4.2. The base cloud shapes are defined by the first three-dimensional texture: the Worley noise channels are summed and multiplied with the Perlin noise channel.

4.5.1 Density Modifiers

All clouds are modeled using the same approach, and the density is independent of where in the atmosphere the cloud appears. First the coverage value from the weather texture described in Section 4.4 is used as the initial density. Then the height signal covered in Section 5.4 is applied; this height signal lowers the density at both the top and the bottom of the cloud. Then the two three-dimensional textures are used to erode density from the height signal. Thereafter a height gradient is applied which lowers density at the bottom; the height gradient is implemented as a linear increase over altitude. The final density, before any lighting calculations are done, must be in the range (0, 1]. Densities that are zero or less after the erosion can be discarded since they will not contribute to any cloud shape. Densities larger than one are clamped to one in order to increase robustness and make the lighting more balanced.

4.5.2 Detailed Edges

Defining a cloud shape from only the shape texture is not enough to get detailed clouds. Therefore the smaller detail texture is used to erode the clouds at their edges. The erosion is performed by subtracting the Worley noise of the detail texture.


By inverting the Worley noise it is possible to change between wispy and billowy edges, or to use a combination of both.

The amount of erosion is determined by the density: lower densities are eroded more. The strength of the erosion is implemented as a threshold value, so that only densities lower than this value are eroded. Algorithm 4.1 describes the order in which the density is calculated.

Algorithm 4.1 – How density is calculated

weather  = getWeather(position.xz)
density  = weather.r
density *= HeightSignal(weather.gb, position)
density *= getCloudShape(position)
density -= getCloudDetail(position)
density *= HeightGradient(weather.gb, position)


5 Rendering

This section describes how clouds are rendered and lit according to the density distribution presented in the previous section. First we describe how clouds are rendered using ray marching from viewpoints below the atmosphere. We then describe how this can easily be extended to rendering from viewpoints inside the atmosphere and from space. This section also covers how a spherical atmosphere was achieved. It then describes how clouds can be rendered with any height at any altitude using our improved height signal. Finally we cover how lighting is applied and how shadows are added.

5.1 Ray Marching

Ray marching is the volumetric rendering technique that we use for rendering clouds. A scene is rendered with this technique by, for each pixel, marching along a ray and evaluating density, lighting and shadowing at each sample point along that ray. More sample points yield better results but are in turn more expensive. It is therefore necessary to weigh the step count and step length against each other in order to get the desired result.

In our solution we use a step size which is the draw distance divided by the number of steps. The draw distance is the minimum of the atmosphere depth and an artist-controlled cut-off distance. The number of steps is determined both by a quality option and by the depth of the atmosphere. This makes sure that an appropriate step count is always used, even if the atmosphere is scaled to be very small or very large. As seen in Figure 5.1 the atmosphere depth is larger near the horizon, but the step count is kept constant. This has the effect that clouds close to the viewer have a smaller step size, which yields better quality. Algorithm 5.1 shows how ray marching is performed when viewed from the planet surface.



Algorithm 5.1 – Ray marching through atmosphere

entry      = RaySphereIntersect(atmosphereStart, directionFromEye)
exit       = RaySphereIntersect(atmosphereEnd, directionFromEye)
stepLength = Distance(entry, exit) / steps
stepOffset = random[0, 1]
t = stepLength * stepOffset
for step = 0; step < steps; ++step
    position = entry + directionFromEye * t
    density  = GetDensity(position)
    EvaluateLight(density)
    t += stepLength

Figure 5.1 – How atmosphere depth depends on view direction

5.2 Render From Inside Atmosphere and Space

Algorithm 5.1 can easily be extended to render clouds from viewpoints within the atmosphere and from space. When rendering from a space view we use the exit point in Figure 5.1 as the origin and march into the atmosphere. In this case the atmosphere depth cannot be calculated as the distance between the entry and exit points, because from a space view not all rays intersect both the inner and the outer bound of the atmosphere. This is solved by using a fixed atmosphere depth for rays that only intersect the outer bound of the atmosphere.

When rendering from within the atmosphere we use the eye position as the origin instead of the entry point. The atmosphere depth is calculated as the distance from the eye to the exit point, capped at the cut-off distance. The switch between ray marching variants is handled automatically by the cloud module.
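The choice of ray origin and march depth for the three viewpoints can be summarized in a small sketch. This is a simplified illustration of the rules described in this section; the parameter names and the use of None for a ray that misses the inner shell are assumptions, not the actual implementation.

import math

def choose_march_segment(eye, eye_radius, entry, exit_point,
                         atmosphere_start_radius, atmosphere_end_radius,
                         cutoff_distance, fixed_space_depth):
    if eye_radius < atmosphere_start_radius:
        # Below the cloud layer: march from the inner-shell entry point.
        return entry, min(math.dist(entry, exit_point), cutoff_distance)
    if eye_radius < atmosphere_end_radius:
        # Inside the atmosphere: march from the eye towards the exit point.
        return eye, min(math.dist(eye, exit_point), cutoff_distance)
    # In space: start at the outer-shell hit and march inwards; use a fixed
    # depth when the ray misses the inner shell (entry is None).
    if entry is None:
        return exit_point, fixed_space_depth
    return exit_point, math.dist(exit_point, entry)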


5.3 Dome Projection

Dome projection is implemented using ray-sphere intersection as described in Algorithm 5.2. This algorithm calculates the intersection points of a ray and a perfect sphere. Earth's atmosphere is shaped as an oblate spheroid, i.e. the radius is larger at the equator than at the poles, but in order to simplify calculations we treat the atmosphere as a perfect sphere. By using the position returned by this algorithm as the entry point, clouds naturally bend over the horizon.

Algorithm 5.2 – Ray sphere intersection

a = dot(direction, direction) * 2
b = dot(direction, startPosition) * 2
c = dot(startPosition, startPosition)
discriminant = b * b - 2 * a * (c - atmosphereRadius * atmosphereRadius)
t = max(0, (-b + sqrt(discriminant)) / a)
intersection = startPosition + direction * t

5.4 Height Signal

The height signal of the clouds is implemented as a parabola, as described in Equation 5.1, where x is the altitude in the atmosphere, a is the cloud's starting altitude and h is the cloud height. This function controls both the altitude at which the cloud appears and its actual height. The height signal has two roots, one at the starting altitude of the cloud and the other at the starting altitude plus the cloud height. The global maximum of the height signal is scaled to always be one. By using this parabola, clouds naturally get less density at their top and bottom edges.

\underbrace{(x - a)}_{\text{1st root}} \cdot \underbrace{(x - a - h)}_{\text{2nd root}} \cdot \underbrace{\frac{-4}{h^2}}_{\text{Scale}}    (5.1)

Figure 5.2 shows a graph of how the density varies with altitude. In this graph the starting altitude a is 1 and the height h is 2, resulting in a density larger than 0 for altitudes between 1 and 3. Algorithm 5.3 shows how this height signal is implemented in the shader.
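As a quick sanity check of Equation 5.1 with these values (a = 1, h = 2): at the midpoint x = 2 the signal is (2 - 1)(2 - 1 - 2) \cdot \frac{-4}{2^2} = 1 \cdot (-1) \cdot (-1) = 1, the scaled maximum, while x = 1 and x = 3 both give 0.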


Figure 5.2 – Height signal with a = 1 and h = 2 (density plotted against altitude)

Algorithm 5.3 – Height signal implementation

oneOverHeight = 1 / height
altitudeDiff  = altitude - altitudeStart
heightSignal  = altitudeDiff * (altitudeDiff - height)
                * oneOverHeight * oneOverHeight * -4

5.5 Lighting

Lighting is evaluated for every sample that returns a density larger than zero when ray marching through the atmosphere. Our lighting model is a simplification of the radiative transfer equation, since we do not take emission into account. We also do not account for surfaces behind the clouds. The remaining part of the radiative transfer equation that needs to be solved is shown in Equation 5.2.

L(x, \vec{\omega}) = \underbrace{\int_0^s T_r(x, x_t)\,\sigma_s(x_t)\,L_i(x_t, -\vec{\omega})\,dt}_{S}    (5.2)

Transmittance follows the properties of the Beer–Lambert law as described in Equation 5.3. By using this property, the scattered light and transmittance can be calculated using analytical integration as shown in Algorithm 5.4. This uses the scattering integration method proposed by Hillaire [Se15], which is more stable for high scattering values than previous integration techniques.

T_r(x_0, x_1) = e^{-\int_{x_0}^{x_1} \sigma_t(x)\,dx}, \qquad T_r(x_0, x_2) = T_r(x_0, x_1) \cdot T_r(x_1, x_2)    (5.3)


Algorithm 5.4 – Analytical integration

sampleSigmaS = sigmaScattering * density
sampleSigmaE = sigmaExtinction * density
ambient = gradient * precipitation * globalAmbientColor
S  = (evaluateLight(directionToSun, position) * phase + ambient) * sampleSigmaS
Tr = exp(-sampleSigmaE * stepSize)

/* Analytical integration of light/transmittance between the steps */
Sint = (S - S * Tr) / sampleSigmaE

scatteredLight += transmittance * Sint
transmittance  *= Tr

5.5.1 Phase Function

A phase function describes the angular distribution of scattered light. The phase function is responsible for cloud lighting effects such as silver lining, fogbow and glory. In our implementation we use the Henyey-Greenstein [HG41] phase function described in Equation 5.4. The Henyey-Greenstein phase function is well suited for describing the angular distribution of scattered light in clouds as it offers a very strong forward-scattering peak, which is the strongest optical phenomenon found in clouds. Another phase function we considered was the one presented by Cornette and Shanks [CS92], described in Equation 5.5. This phase function is also well suited for clouds but is more time consuming to calculate [FZZZ14].

Neither the Henyey-Greenstein nor the Cornette-Shanks phase function produces the back-scattered optical phenomena that the Mie phase function does. A comparison between the Henyey-Greenstein and the Mie phase function is shown in Figure 5.3. We did not use the Mie phase function due to its oscillating behavior, which is undesired because it could cause banding artifacts.

p_{HG}(\theta) = \frac{1 - g^2}{4\pi \cdot (1 + g^2 - 2g\cos\theta)^{1.5}}    (5.4)

p_{CS}(\theta) = \frac{3}{2} \cdot \frac{1 - g^2}{2 + g^2} \cdot \frac{1 + \cos^2\theta}{(1 + g^2 - 2g\cos\theta)^{1.5}}    (5.5)
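For illustration, Equation 5.4 translates directly into code. The following is a sketch, not the shader code used in the implementation:

import math

def henyey_greenstein(cos_theta, g):
    # Henyey-Greenstein phase function (Equation 5.4).
    # cos_theta is the cosine of the angle between the light and view directions;
    # g in (-1, 1) controls the strength of the forward-scattering peak.
    g2 = g * g
    return (1.0 - g2) / (4.0 * math.pi * (1.0 + g2 - 2.0 * g * cos_theta) ** 1.5)

With g close to 1 (the plot in Figure 5.3 uses g = 0.99) the function is sharply peaked around cos_theta = 1, which is what produces the silver lining when looking towards the sun.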


Figure 5.3 – Plot of phase functions for light scattering in clouds [BNM+08]. Blue: Mie, Green: Henyey-Greenstein with g = 0.99

5.5.2 Shadows

Self- and inter-cloud shadowing is implemented by, for each step inside a cloud, also stepping towards the sun as shown in Figure 5.4. The self- and inter-cloud shadowing effect is expensive but necessary to convey a sense of depth. Marching towards the sun has a large impact on performance, since an extra number of steps has to be taken for every primary step. In our implementation, four steps towards the sun are taken with exponentially increasing step size. This is because we want most of the contribution to come from the cloud itself while still receiving shadows from other clouds. Figure 5.5 shows an in-game rendered image of clouds with both self- and inter-cloud shadowing.
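A minimal sketch of such a sun-ward march is shown below. The density callback, the extinction coefficient and the base step size are illustrative assumptions; only the four exponentially growing steps are taken from the description above.

import math

def sun_transmittance(position, direction_to_sun, sample_density,
                      sigma_extinction, base_step, steps=4):
    # Accumulate optical depth along a short march towards the sun and
    # convert it to a transmittance with the Beer-Lambert law.
    optical_depth = 0.0
    t = 0.0
    for i in range(steps):
        step = base_step * (2.0 ** i)      # exponentially increasing step size
        t += step
        sample = tuple(p + d * t for p, d in zip(position, direction_to_sun))
        optical_depth += sample_density(sample) * sigma_extinction * step
    return math.exp(-optical_depth)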

Figure 5.4 – Exponentially increasing step size taken towards the sun


Figure 5.5 – In-game render showing inter-cloud shadowing

5.5.3 Ambient

Ambient light contribution is added using a linear gradient from the cloud's bottom to its top, increasing with altitude. The strength as well as the color of the ambient contribution is controlled by the cloud module. How the ambient contribution is added is described in Algorithm 5.4.


6 Optimization

Several optimizations have been implemented in order for this cloud rendering technique to achieve real-time performance. This section covers most of these optimizations.

Render Target Resolution
Decreasing the render target resolution to half of the original resolution is one optimization that was necessary for the rendering technique to achieve real-time performance. Using an even lower resolution may provide an even greater performance boost at the cost of a blurrier result.

Shape Texture Resolution
The first implementation had the same resolution in all axes for the shape texture. Since this texture is repeated more in x and z than in y, we reduce the amount of resources required for it by using a quarter of the resolution in the y axis.

Temporal Upsampling
By using temporal upsampling it is possible to take fewer steps when marching without reducing the visual quality of the clouds. How temporal upsampling was implemented is covered later in this section.

Early Exit
It is possible to exit early in the shader if the transmittance is either high or low, as described later in this section. In the rendered images with visual results in Section 7.3, marching was terminated when the transmittance was less than 0.01.


6.1 Temporal Upsampling

This section describes how temporal upsampling was implemented and how the technique improved the visual quality. Figure 6.1 shows a scene without temporal upsampling and Figure 6.2 shows the same scene rendered with temporal upsampling. Both figures are rendered using the same step count and step length; the only difference is the temporal upsampling. Notice how the banding artifacts disappear when temporal upsampling is enabled. This is because multiple frames have been blended together, each with a different ray-marching start offset. The different steps of the temporal upsampling are described below.

1. Offset the starting step in the ray marching algorithm by multiplying the step length with a value in the range [0, 1], as described in Algorithm 5.1. This value is provided by the cloud module and is calculated as a Van der Corput sequence (a small sketch follows this list).

2. Calculate the world position of P from its clip space position using the current inverse view projection matrix.

   P_wp = P_cs · (VP)^{-1}

3. Calculate P'_cs, the position of P in the previous frame. This is done by multiplying P_wp with the view projection matrix from the previous frame.

   P'_cs = P_wp · (VP)

4. Sample the previous frame at position P'.

   C_{P'} = texture(P')

5. Blend the current frame with the previous frame using a blending factor α (TemporalAlpha in Appendix B). Low values of α result in less banding but take more time to converge. We use 5% as the default blending value. If the sample is outside of the screen we set α to 1, which results in only the current frame being used. In order to hide banding when we cannot blend with the previous frame, we increase the step count for this pixel.

   C_P · α + C_{P'} · (1 − α)
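The per-frame offset mentioned in step 1 can be produced with a Van der Corput sequence. A small base-2 sketch follows (illustrative only; the base actually used by the cloud module is not specified here):

def van_der_corput(index, base=2):
    # Returns the index-th element of the Van der Corput sequence in the given
    # base: a low-discrepancy value in [0, 1) used as the ray-march start offset.
    result, denom = 0.0, 1.0
    while index > 0:
        denom *= base
        index, remainder = divmod(index, base)
        result += remainder / denom
    return result

# The first four offsets in base 2: 0.5, 0.25, 0.75, 0.125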


Algorithm 6.1 – Temporal upsampling

clipSpacePos  = computeClipPos(screenPos)
camToWorldPos = (clipSpacePos, 1) * invViewProjectionMatrix
camToWorldPos /= camToWorldPos.w
pPrime = camToWorldPos * prevViewProjectionMatrix
pPrime /= pPrime.w
screenPos = computeScreenPos(pPrime)
isOut = any(abs(screenPos.xy - 0.5) > 0.5)
current = scatteredLight, transmittance.x
result  = isOut ? current : current * alpha + previous * (1 - alpha)

Figure 6.1 – Without temporal upsampling

6.2 Early Exit

Performing as few calculations as possible is key to performance. This section describes how we use early exit points in our shader to improve performance.

6.2.1 Low Transmittance

While marching through the atmosphere, every sample with a non-zero density reduces the transmittance, i.e. causes a loss of energy. Therefore, as the transmittance gets lower, new samples contribute less to the final result. We stop marching when the transmittance is less than 0.01. This provides a large performance increase without noticeable visual artifacts.


Figure 6.2 – With temporal upsampling

Figure 6.3 – Red parts showing when early exit was done due to low transmittance

6.2.2 High Transmittance

When the transmittance is high it means that few samples along the ray had a density larger than zero. By using the position calculated for temporal upsampling we can sample the transmittance from the previous result and check whether there was a cloud in this direction. If the transmittance is less than 1.0 there is a cloud in this direction, but due to wind we cannot assume that a direction with a transmittance of 1.0 has no clouds. Always exiting on high transmittance would miss clouds that move into a region where the previous frame had high transmittance. Instead, we exit early, before any marching is done, if the transmittance in the previous frame is 1.0, and then force this pixel to be marched in the following frame.


This could instead be done automatically using a checkered tile pattern to avoid one frame being much more expensive than the other.

Figure 6.4 – Red parts showing when early exit was done due to high transmittance


7 Results

This section presents our results from this implementation in detail: the resources required, the run-time performance and finally a visual comparison. The visual results focus on clouds rendered from the planet's surface. Additional visual results from inside the atmosphere and from space are presented in Appendix A.

7.1 Resources

Table 7.1 shows the amount of resources that this technique requires. The resources required for the render targets are based on rendering at half of a 1920x1080 resolution.

Table 7.1 – Resources required

Texture         | Format | Size       | Memory
Detail          | RGB8   | 32x32x32   | 0.1 MB
Weather         | RGB8   | 1024x1024  | 3.1 MB
Shape           | RGBA8  | 128x32x128 | 2.1 MB
RenderTarget A  | RGBA16 | 960x540    | 4.1 MB
RenderTarget B  | RGBA16 | 960x540    | 4.1 MB
Total           |        |            | 13.5 MB


7.2 Performance

This section describes run-time performance measured on a PC using a built-in performance diagnostics tool. The hardware configuration used during the performance evaluation is shown in Table 7.2.

Table 7.2 – Hardware configuration used when evaluating performance

Hardware           | Release year
XFX Radeon HD7870  | 2012
Intel Xeon E5-1650 | 2012

Most computations are performed in the pixel shader on the GPU, and execution times are measured on the GPU using a built-in diagnostics tool. The environment used when measuring was an empty level containing only clouds. Figure 7.1 shows the execution time when the maximum draw distance has a constant value of 20 000 m and the view direction is upwards. Figure 7.2 shows the execution time when the maximum draw distance is varied, while the atmosphere depth has a constant value of 2 000 m and the view direction is towards the horizon, for a comparison of how rendering distance affects run-time performance. When coverage is high, almost the entire sky is covered by clouds, and when coverage is low, only a small part of the screen has clouds in it. As expected, low coverage is much faster, mostly due to the early exit on high transmittance.


Figure 7.1 – Run-time performance as a function of atmosphere depth.

Figure 7.2 – Run-time performance as a function of max render distance.


7.3 Visual Results

In this section we present visual results from our cloud rendering technique together with photographs for comparison.

Figure 7.3 – Yellow sunset photograph (image: Wikimedia Commons)

Figure 7.4 – Yellow sunset in-game render


Figure 7.5 – Cumulus clouds in-game render

Figure 7.6 – Cumulus clouds photograph (image: Wikimedia Commons)


8 Discussion and Conclusion

The cloud rendering technique we have presented can produce a large variety of clouds under fully dynamic lighting conditions. The technique achieves real-time performance through several optimizations and can be used in games, as seen in Figure A.6, where it was added to Battlefield 4. However, the technique is not yet production ready, as it is missing important features such as god rays and reflection views, and the clouds do not cast shadows on the environment. In order to implement these features the technique will most likely need additional optimizations. During the implementation we introduced a distance field alongside the weather texture that stored the distance to the closest cloud. This distance field was pre-generated using the midpoint circle algorithm to find the closest red (coverage) pixel in the weather texture, and the result was stored in a separate texture. The distance field increased run-time performance in some cases but decreased it in others, and was therefore removed.

One optimization that seems very promising but has not been implemented yet is to pre-compute shadows. Since we assume that the direction to the sun is parallel, shadows could be pre-calculated and stored in a look-up table. This look-up table would only need to be updated when the sun has moved. Since we take four additional samples towards the sun for every sample, this would greatly reduce the number of samples.

The weather texture shown in Figure 4.6 has an overall yellow tone due to the red and green channels having similar values. One optimization to reduce the size of this texture could be to remove one of these channels and use the same channel for both coverage and height. This would not allow scenes to have both low and high clouds with high density, but it might increase run-time performance.

The Henyey-Greenstein phase function provides a strong forward peak, which is necessary for a silver lining effect, even though it is only an approximation. A Mie phase function for cumulus clouds was generated using MiePlot and stored in a texture for use in the shader, but our results showed an oscillating behaviour from back-scattered light as well as reduced run-time performance. We therefore use Henyey-Greenstein instead, although a Mie phase function would be a better solution for a more physically accurate result.

The Mie phase function was tested by generating it with MiePlot [Lav45] and storing it in a look-up table which was then used in the shader. This gave promising results, but banding occurred due to the Mie phase function's oscillating behaviour. The banding was most visible when viewing along the sun direction.
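For reference, the Henyey-Greenstein phase function [HG41] that we fall back to has a simple closed form. A minimal C++ version is shown below; g corresponds to the Phase parameter listed in Appendix B.

#include <cmath>

// Henyey-Greenstein phase function [HG41].
// cosTheta: cosine of the angle between the view direction and the light direction.
// g: asymmetry parameter in (-1, 1); g > 0 gives the strong forward peak
//    needed for the silver lining effect.
static float henyeyGreenstein(float cosTheta, float g)
{
    const float pi    = 3.14159265358979f;
    const float g2    = g * g;
    const float denom = 1.0f + g2 - 2.0f * g * cosTheta;
    return (1.0f - g2) / (4.0f * pi * std::pow(denom, 1.5f));
}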

8.1 Limitations

This cloud rendering technique does not support clouds that overlap in altitude, because the weather texture only stores coverage, height and starting altitude per location. The limitation could be resolved by using multiple weather textures, but that would reduce run-time performance.

The phase function we use produces a strong forward peak but does not reproduce back-scattering effects such as glories and fogbows. We also only take sun light and ambient light into account, and not other light sources.

8.2 Ethical Aspects

This cloud rendering technique can produce clouds for many different weather situations, and in a game context changing the weather could affect both the mood and the behavior of the players. A more general ethical aspect related to computer graphics is photo and video manipulation and falsification.


9 Future Work

In future work we will optimize the algorithm and implement the currently missing features, such as rendering into reflection views and casting shadows on the environment. The clouds should also be affected by aerial perspective and fog to allow for a smoother transition towards the horizon. Another part that is left for future work is to implement the cloud rendering technique on other platforms such as PlayStation and Xbox.


References

[AN15] Andrew Schneider and Nathan Vos. The real-time volumetric cloudscapes of Horizon Zero Dawn. In SIGGRAPH, August 2015.

[BNL06] Antoine Bouthors, Fabrice Neyret, and Sylvain Lefebvre. Real-time realistic illumination and shading of stratiform clouds. In Eurographics Workshop on Natural Phenomena, September 2006.

[BNM+08] Antoine Bouthors, Fabrice Neyret, Nelson Max, Eric Bruneton, and Cyril Crassin. Interactive multiple anisotropic scattering in clouds. In Proceedings of the 2008 Symposium on Interactive 3D Graphics and Games, I3D '08, pages 173–182, New York, NY, USA, 2008. ACM.

[BS14] Gustav Bodare and Edvard Sandberg. Efficient and Dynamic Atmospheric Scattering. Chalmers University of Technology, http://publications.lib.chalmers.se/records/fulltext/203057/203057.pdf, 2014.

[Cha60] Subrahmanyan Chandrasekhar. Radiative Transfer. Dover Publications Inc, 1960.

[CS92] William M. Cornette and Joseph G. Shanks. Physically reasonable analytic expression for the single-scattering phase function. Appl. Opt., 31(16):3152–3160, June 1992.

[FZZZ14] Xiao-Lei Fan, Li-Min Zhang, Bing-Qiang Zhang, and Yuan Zha. Real-time rendering of dynamic clouds. Journal of Computers, 25(3), 2014.

[Gus05] Stefan Gustavson. Simplex noise demystified. Linköping University, http://webstaff.itn.liu.se/~stegu/simplexnoise/simplexnoise.pdf, 2005.

[HG41] L. G. Henyey and J. L. Greenstein. Diffuse radiation in the Galaxy. Astrophysical Journal, 93:70–83, January 1941.


[HH12] Roland Hufnagel and Martin Held. A survey of cloud lighting and rendering techniques. In Journal of WSCG 20, 3, pages 205–216, 2012.

[Jar08] Wojciech Jarosz. Efficient Monte Carlo Methods for Light Transport in Scattering Media. PhD thesis, UC San Diego, September 2008.

[Lav45] Philip Laven. MiePlot: A computer program for scattering of light from a sphere using Mie theory & the Debye series, http://www.philiplaven.com/mieplot.htm, v4.5.

[Mh11] Marco Kok Mang-hin. Colours of Clouds. Hong Kong Observatory, http://www.weather.gov.hk/education/edu06nature/ele cloudcolours e.htm, 2011.

[Per85] Ken Perlin. An image synthesizer. In ACM SIGGRAPH Computer Graphics, 19(3), July 1985.

[Sal13] Marco Salvi. Pixel Synchronization. Advanced Rendering Technology, Intel, San Francisco, http://advances.realtimerendering.com/s2013/2013-07-23-SIGGRAPH-PixelSync.pdf, 2013.

[Se15] Sebastien Hillaire. Physically-based & unified volumetric rendering in Frostbite. In SIGGRAPH, August 2015.

[Wor96] Steven Worley. A cellular texture basis function. In Proceedings of the 23rd Annual Conference on Computer Graphics and Interactive Techniques, pages 291–294, 1996.

[Yus14] Egor Yusov. High-performance rendering of realistic cumulus clouds using pre-computed lighting. In Proceedings of High-Performance Graphics, 2014.


A Additional Visual Results

Figure A.1 – Sunset with very high start multiplier


Figure A.2 – Clouds viewed from space

Figure A.3 – Clouds viewed from inside atmosphere at day


Figure A.4 – Sunset viewed from inside atmosphere

Figure A.5 – Sunset viewed from inside atmosphere


Figure A.6 – Cumulus clouds in Battlefield 4


B Artist Controls

This appendix lists the controls, with their names and descriptions, that were added to our new cloud module for modifying clouds. The controls are grouped into sections by what they control. All of these controls can be changed dynamically during run-time. Controls for wind and sky are not listed since they are not part of this module. Figure B.1 shows the editor FrostEd with different modules and the exposed controls for the cloud module.


Figure B.1 – Editor showing the selected Cloud module with exposed controls


Erode

• BaseAndTopMultiplier Modifies the strength of a parabola function used as height signal. This is the extremum value of that function.

• BaseToTopMultiplier Modifies strength of the linear height gradient.

• EdgeDetailThreshold Threshold specifying at which density details should be added.

• ThicknessMultiplier Increases density of the cloud shape before erosion.

General

• Enable Enables the cloud module.

• ShapeTexture 3D Shape texture.

• DetailTexture 3D Detail texture.

• WeatherTexture 2D Weather texture.

• CloudQuality A quality option that allows for a trade-off between performance and visual appearance. It adjusts how many samples are taken when rendering.

Height

• HeightMultiplier Specifies the maximum cloud height in meters.

• StartMultiplier Specifies the maximum starting altitude of a cloud in meters.

• PlanetRadius Radius of the planet in meters.

• AtmosphereStart Starting distance of the atmosphere from the planet’s surface.

Horizon

• CutOffDistance Maximum distance from the camera at which clouds are rendered.

Lighting

• Absorption Color and strength of absorption coefficient.

• Scattering Color and strength of scattering coefficient.

• Phase Parameter g in the Henyey-Greenstein phase function.

• AmbientMulitplicator Strength of ambient contribution.

• AmbientDesaturate Modifies the saturation of ambient color.

Scale

• ShapeScale Scaling value for shape texture.

• DetailScale Scaling value for detail texture.

• WeatherScale Scaling value for weather texture.

Temporal

• TemporalAlpha Blending value for temporal upsampling between the current and previous frame (see the sketch after this list).

Weather

• Precipitation Adds a precipitation effect to clouds.

• Offset Offsets the weather texture.
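As an illustration of how a blending value like TemporalAlpha is typically applied, the sketch below mixes the current frame's cloud result with the reprojected result from the previous frame. The reprojection itself and the history buffering used in Frostbite are not shown; this function is only a hedged stand-in for the per-pixel blend.

#include <algorithm>

// Per-pixel temporal blend: a value of 1 uses only the current frame, while
// smaller values keep more of the reprojected history, smoothing the noise
// from the low sample count at the cost of some ghosting.
static float temporalBlend(float history, float current, float temporalAlpha)
{
    const float a = std::max(0.0f, std::min(1.0f, temporalAlpha));
    return history + (current - history) * a; // lerp(history, current, a)
}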
