
Min H. Kim KAIST CS580 Computer Graphics

CS580: Computer Graphics

Min H. Kim, KAIST School of Computing


Elements of Computer Graphics

[Figure: rendering combines geometry, light, and a material model; "virtual photography"]


PATH TRACING: Introduction


Heckbert Notation
• Paths are written as regular expressions
• Symbols: Light (L), Diffuse (D), Specular (S), and Eye (E)
• Operators:
  – | stands for "or"
  – * represents any number of reflections on surfaces of the specified type
  – + stands for at least one reflection
  – ? stands for at most one reflection
• Examples:
  – Ray tracing: LD[S*]E
  – Radiosity: LD*E
• Complete global illumination solution: L(D|S)*E


Heckbert Notation Examples


Ray Tracing
• The light paths expand in tree form
  – Each intersection of a ray with a surface can spawn two further rays (reflected and refracted), each of which can spawn two further rays, and so on.
• Deep levels of the tree may contribute very little to the final image, yet they still account for much of the computation time.


Ray Casting
• Cast a ray from the eye through each pixel


Ray Casting
• Cast a ray from the eye through each pixel
• Trace secondary rays (light, reflection, refraction)


Monte-Carlo Ray Casting
• Cast a ray from the eye through each pixel
• Cast random rays from the visible point
  – Accumulate the radiance contributions


Monte-Carlo Ray Casting
• Cast a ray from the eye through each pixel
• Cast random rays from the visible point
• Recurse


Monte-Carlo Ray Casting
• Systematically sample the primary light


Results


Path Tracing
• We generate paths of the form L(D|S)*E; each new ray is chosen stochastically, according to the material properties (BRDF) of the surface from which it is generated.


Monte-Carlo Path Tracing
• Trace only one secondary ray per recursion
• But send many primary rays per pixel (this performs antialiasing as well)
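The control flow above (many primary rays per pixel, one stochastically weighted secondary ray per recursion) can be sketched in Python. This toy is not from the slides: there is no scene here, just a fake surface that emits a constant and reflects one randomly weighted bounce, so the expected pixel value is the geometric series Σ_k 0.5^k.

```python
import random

def radiance(depth, rng, emit=1.0, albedo=0.5, max_depth=8):
    # Toy stand-in for shading: each "hit" emits a constant and adds
    # the light from ONE secondary ray, weighted by a random factor
    # with mean `albedo` (mimicking stochastic BRDF sampling).
    if depth >= max_depth:
        return 0.0
    weight = albedo * 2.0 * rng.random()  # E[weight] = albedo
    return emit + weight * radiance(depth + 1, rng, emit, albedo, max_depth)

def render_pixel(n_primary, rng):
    # Many primary rays per pixel, averaged (this also antialiases).
    return sum(radiance(0, rng) for _ in range(n_primary)) / n_primary
```

With albedo 0.5 the expected value is Σ_{k<8} 0.5^k ≈ 1.99; averaging a few thousand primary rays gets close to it.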


Results (10 paths/pixel)

Think about it: we compute an infinite-dimensional integral with 10 samples!


Results: Glossy (10 paths/pixel)


Results: Glossy (100 paths/pixel)


MONTE CARLO INTEGRATION
(Advanced Global Illumination, Chapter 3)


What can we integrate?
• At each bounce, we recursively estimate irradiance by sampling radiance arriving from different directions.
• The same machinery also handles depth of field, motion blur, indirect lighting, soft shadows, glossy reflection, etc.

[Figure: depth of field, motion blur, indirect lighting, soft shadows, glossy reflection]


How to calculate π
• Circle area: A = πr²
• Enclose the circle A (radius r) in a square Q (side 2r), so area(Q) = 4r²
• Without knowing the area, we pick random samples inside Q and test whether they belong to the circle
• p = A/Q = πr²/(4r²) = π/4, so
  π = 4 · (area of A)/(area of Q) ≈ 4 · (samples in A)/(samples in Q)
• With n(blue) = 11 samples inside the circle and n(red) = 3 outside:
  π ≈ 4 × 11/14 = 3.1428
• Increasing N toward infinity gives a more accurate estimate
→ Monte Carlo Technique
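A minimal sketch of the sampling procedure above (the function name is mine, not from the slides), using the quarter circle inside the unit square so that π ≈ 4 · hits/N:

```python
import random

def estimate_pi(n, rng):
    # Sample points uniformly in the unit square; the fraction that
    # lands inside the quarter circle of radius 1 estimates pi/4.
    hits = 0
    for _ in range(n):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            hits += 1
    return 4.0 * hits / n

print(estimate_pi(100_000, random.Random(0)))  # close to 3.14
```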


Monte Carlo Technique
• Pros:
  – It's conceptually simple.
  – Once an appropriate random variable is found, the computation consists of:
    1. Sampling the random variable
    2. Averaging the estimates obtained from the samples
  – Applicable to a wide range of problems.
• Cons:
  – Relatively slow convergence rate: the error decreases only as 1/√N (four times more samples are required to halve the error of the Monte Carlo computation).
  – Variance reduction techniques are therefore required.


Probability Theory
• The probability p_i of an event lies between 0 and 1: 0 ≤ p_i ≤ 1
  – If an outcome never occurs, p_i = 0; if an event always occurs, p_i = 1.
• The probability that either of two different events occurs:
  Pr(Event1 or Event2) ≤ Pr(Event1) + Pr(Event2)
  – with equality for mutually exclusive events:
    Pr(Event1 or Event2) = Pr(Event1) + Pr(Event2)
  – Two events are mutually exclusive if and only if the occurrence of one of the events implies the other event cannot possibly occur.
• Over the set of all possible (mutually exclusive) events: Σ_i p_i = 1


Probabilities and Expected Values
• Given a discrete random variable x_i with probabilities p_i
• Expected value: E[x] = Σ_{i=1}^{n} x_i p_i
• E.g., for a die: x ∈ {1, 2, 3, 4, 5, 6}, p_i = 1/6
  – Expected value: (1 + 2 + 3 + 4 + 5 + 6) × 1/6 = 3.5
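The die computation above, checked directly (plain Python with exact fractions; not part of the slides):

```python
from fractions import Fraction

# E[x] = sum_i x_i * p_i for a fair die, where p_i = 1/6 for each face
faces = [1, 2, 3, 4, 5, 6]
p = Fraction(1, 6)
expected = sum(x * p for x in faces)
print(float(expected))  # 3.5
```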


Variance and Standard Deviation
• The variance σ² is a measure of the deviation of the outcomes from the expected value of the random variable:
  – the expected value of the squared difference between the outcome of the experiment and its expected value.
  σ² = E[(x − E[x])²] = Σ_i (x_i − E[x])² p_i
  σ² = E[x²] − (E[x])² = Σ_i x_i² p_i − (Σ_i x_i p_i)²
• The standard deviation σ is the square root of the variance.
• For the die:
  σ²_die = 1/6 × [(1−3.5)² + (2−3.5)² + (3−3.5)² + (4−3.5)² + (5−3.5)² + (6−3.5)²] ≈ 2.91
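Both variance formulas above give the same number for the die (a direct check, not from the slides; the exact value is 17.5/6 ≈ 2.9167, which the slide rounds to 2.91):

```python
# Variance two ways: E[(x - E[x])^2] and E[x^2] - (E[x])^2
faces = [1, 2, 3, 4, 5, 6]
p = 1 / 6
mean = sum(x * p for x in faces)                     # 3.5
var_def = sum((x - mean) ** 2 * p for x in faces)    # definition form
var_alt = sum(x * x * p for x in faces) - mean ** 2  # shortcut form
print(round(var_def, 4), round(var_alt, 4))  # 2.9167 2.9167
```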


Probability Density Function
• For a real-valued (continuous) random variable x, a probability density function (PDF) p(x).
• The probability that the variable takes a value in the interval [x, x+dx] equals p(x)dx.
• A cumulative distribution function (CDF):
  P(y) = Pr(x ≤ y) = ∫_{−∞}^{y} p(x) dx
  – The probability that an event occurs with an outcome whose value is less than or equal to y.
  – Non-decreasing and non-negative.


Probability Density Function
• PDF properties:
  ∀x: p(x) ≥ 0  (note that p(x) can be larger than 1.0)
  ∫_{−∞}^{∞} p(x) dx = 1
  p(x) = dP(x)/dx
  ∫_a^b p(z) dz = Pr(a ≤ x ≤ b) = Pr(x ≤ b) − Pr(x ≤ a) = P(b) − P(a)
  (∵ CDF: P(x) = ∫_{−∞}^{x} p(t) dt)
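A numeric check of these properties for p(x) = 2x on [0, 1] (my choice of example, not the slides'): this density exceeds 1.0 near x = 1 yet still integrates to 1, and a Riemann sum over [a, b] matches P(b) − P(a).

```python
def p(x): return 2.0 * x   # PDF on [0, 1]; p(x) > 1 near x = 1
def P(x): return x * x     # its CDF, the integral of p

n = 100_000
dx = 1.0 / n
# midpoint-rule integral of p over [0, 1]: should be 1
total = sum(p((i + 0.5) * dx) * dx for i in range(n))
a, b = 0.25, 0.75
w = (b - a) / n
# midpoint-rule integral of p over [a, b]: should equal P(b) - P(a)
prob = sum(p(a + (i + 0.5) * w) * w for i in range(n))
print(round(total, 6), round(prob, 6), P(b) - P(a))  # 1.0 0.5 0.5
```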


Probabilities and Expected Values
• Given a continuous random variable x with probability density function p(x)
• Expected value: E[x] = ∫ x p(x) dx
• Now, given a function f(x) of a random variable x with PDF p(x):
  – Expected value: E[f(x)] = ∫ f(x) p(x) dx
  – Variance: σ² = E[(f(x) − E[f(x)])²] = ∫ (f(x) − E[f(x)])² p(x) dx
  – Discretized: E[f(x)] ≈ (1/N) Σ_i f(x_i)
    (works for independent, identically distributed samples x_i drawn from p(x))
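The discretized form is just a sample mean. A quick check with f(x) = x² and x uniform on [0, 1] (the example and function names are mine; the exact answer is 1/3):

```python
import random

def expectation(f, sample, n, rng):
    # E[f(x)] ≈ (1/N) Σ f(x_i), with x_i drawn i.i.d. via `sample`
    return sum(f(sample(rng)) for _ in range(n)) / n

rng = random.Random(1)
est = expectation(lambda x: x * x, lambda r: r.random(), 100_000, rng)
print(est)  # close to 1/3
```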


Uniform Probability Distribution
• A uniform PDF on [a, b]:
  p_u(x) = 1/(b − a)
• The probability that x ∈ [a′, b′]:
  Pr(x ∈ [a′, b′]) = ∫_{a′}^{b′} 1/(b − a) dx = (b′ − a′)/(b − a)
• CDF:
  Pr(x ≤ y) = P(y) = ∫_{−∞}^{y} p_u(x) dx = (y − a)/(b − a)
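An empirical check of Pr(x ∈ [a′, b′]) = (b′ − a′)/(b − a), with interval endpoints I picked arbitrarily:

```python
import random

a, b = 2.0, 10.0    # x uniform on [a, b]
lo, hi = 3.0, 5.0   # sub-interval [a', b']
rng = random.Random(7)
n = 200_000
hits = sum(1 for _ in range(n) if lo <= rng.uniform(a, b) <= hi)
# analytic value (hi - lo)/(b - a) = 0.25 vs. empirical fraction
print((hi - lo) / (b - a), hits / n)
```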


Conditional & Marginal Probability
• Joint probability density function p(x, y)
• Marginal density function of x:
  p(x) = ∫ p(x, y) dy  (discrete case: p_i = Σ_j p_ij)
• Conditional density function p(y|x), the probability density of y given some x:
  p(y|x) = p(x, y)/p(x) = p(x, y) / ∫ p(x, y) dy
• Conditional expectation of a random function g(x, y):
  E[g|x] = ∫ g(x, y) p(y|x) dy = ∫ g(x, y) p(x, y) dy / ∫ p(x, y) dy
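The same relations in the discrete case (p_i = Σ_j p_ij, and so on), on a tiny joint table with made-up values:

```python
# Joint distribution over (x, y) in {0,1} x {0,1}; values are arbitrary
joint = {(0, 0): 0.1, (0, 1): 0.3, (1, 0): 0.2, (1, 1): 0.4}

def marginal(i):        # p(x=i) = sum_j p(i, j)
    return sum(p for (x, _), p in joint.items() if x == i)

def conditional(j, i):  # p(y=j | x=i) = p(i, j) / p(i)
    return joint[(i, j)] / marginal(i)

def cond_expect(g, i):  # E[g | x=i] = sum_j g(i, j) p(j | i)
    return sum(g(i, j) * conditional(j, i) for j in (0, 1))

print(round(marginal(0), 3), round(conditional(1, 0), 3))  # 0.4 0.75
```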


Estimator: G(x) = E[g(x)]
• The weighted sum of N independent random variables g_j(x):
  G(x) = Σ_{j=1}^{N} w_j g_j(x)
• When the weights w_j are all equal to 1/N:
  G(x) = (1/N) Σ_{j=1}^{N} g_j(x)
• By the linearity property, the expected value of G, the estimator, is
  E[G(x)] = Σ_j w_j E[g_j(x)] = (1/N) Σ_{j=1}^{N} E[g_j(x)] = (1/N) · N · E[g(x)] = E[g(x)]


Variance of Estimator
• Variance of G:
  σ²[G(x)] = σ²[ Σ_{i=1}^{N} g_i(x)/N ]
• In general, variance follows:
  σ²[x + y] = σ²[x] + σ²[y] + 2 Cov[x, y], where Cov[x, y] = E[xy] − E[x]·E[y]
• In the case of independent random variables, the covariance is 0:
  σ²[x + y] = σ²[x] + σ²[y]
  σ²[ax] = a² σ²[x]
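The variance identity above holds exactly on any data set when population variance is paired with the matching covariance. A check on deliberately correlated samples (setup and names are mine):

```python
import random
import statistics

rng = random.Random(3)
xs = [rng.random() for _ in range(10_000)]
ys = [x + rng.random() for x in xs]   # correlated with xs on purpose

def mean(v): return statistics.fmean(v)

# Cov[x, y] = E[xy] - E[x] * E[y]
cov = mean([x * y for x, y in zip(xs, ys)]) - mean(xs) * mean(ys)
lhs = statistics.pvariance([x + y for x, y in zip(xs, ys)])
rhs = statistics.pvariance(xs) + statistics.pvariance(ys) + 2 * cov
print(round(lhs, 6) == round(rhs, 6))  # True
```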


Variance of Estimator
• Variance of G (for independent g_i):
  σ²[G(x)] = σ²[ Σ_{i=1}^{N} g_i(x)/N ] = Σ_{i=1}^{N} σ²[g_i(x)]/N²
• Therefore,
  σ²[G(x)] = N σ²[g(x)]/N² = σ²[g(x)]/N
• As N increases, the variance of G decreases linearly with N, making G an increasingly good estimator of E[g(x)].
• Thus, the standard deviation (error) σ decreases as 1/√N.
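The 1/N behavior is easy to see empirically: quadrupling N should roughly halve the standard deviation of the estimator. An example of my own, with g(x) = x² and x uniform on [0, 1]:

```python
import random

def estimator(N, rng):
    # G = (1/N) sum of g(x_i), with g(x) = x^2 and x ~ U[0, 1]
    return sum(rng.random() ** 2 for _ in range(N)) / N

def std_of(N, trials, rng):
    # empirical standard deviation of the estimator over many trials
    vals = [estimator(N, rng) for _ in range(trials)]
    m = sum(vals) / trials
    return (sum((v - m) ** 2 for v in vals) / trials) ** 0.5

rng = random.Random(5)
ratio = std_of(100, 2000, rng) / std_of(400, 2000, rng)
print(round(ratio, 1))  # should be roughly 2.0
```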


Monte Carlo Integration
• A simple technique for the numerical evaluation of integrals
• Suppose:
  I = ∫_a^b f(x) dx, x ∈ [a, b]
• Monte Carlo converts this to an expected-value computation, where p(x) is some probability density function:
  I = ∫ (f(x)/p(x)) p(x) dx = E[ f(x)/p(x) ]
  (e.g., hemispherical irradiance: I = ∫_Ω f(ω) dω)


Monte Carlo Integration
• The integral can be estimated by taking N samples x_1, x_2, …, x_N drawn from the PDF p(x):
  I = E[ f(x)/p(x) ] ≈ ⟨I⟩ = (1/N) Σ_{i=1}^{N} f(x_i)/p(x_i)
• The variance is proportional to 1/N
• The error (= standard deviation σ) decreases as 1/√N
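A sketch of the estimator on a toy example of my own: I = ∫₀¹ x dx = 1/2, drawing samples from p(x) = 2x by inverse-CDF sampling (P(x) = x², so x = √u for u ~ U[0, 1]). Because p is proportional to f here, every sample contributes exactly f/p = 1/2, so the estimator has zero variance, a preview of importance sampling:

```python
import random

def mc_integrate(f, p, sample, N, rng):
    # <I> = (1/N) sum of f(x_i)/p(x_i), with x_i drawn from p
    return sum(f(x) / p(x) for x in (sample(rng) for _ in range(N))) / N

est = mc_integrate(lambda x: x,                  # f
                   lambda x: 2.0 * x,            # p, proportional to f
                   lambda r: r.random() ** 0.5,  # inverse-CDF sample of p
                   50_000, random.Random(2))
print(est)  # 0.5: f/p is constant, so the variance is zero
```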


Monte Carlo Integration
• The expected value of the estimator ⟨I⟩:
  E[⟨I⟩] = E[ (1/N) Σ_{i=1}^{N} f(x_i)/p(x_i) ]
         = (1/N) Σ_{i=1}^{N} E[ f(x_i)/p(x_i) ]
         = (1/N) · N · ∫ (f(x)/p(x)) p(x) dx
         = ∫ f(x) dx = I


Example
• I = ∫_0^1 5x⁴ dx; we know it should be 1.0.
• Here with uniform sampling (p(x) = 1):
  I ≈ ⟨I⟩ = (1/N) Σ_{i=1}^{N} 5x_i⁴
• Variance of the estimator:
  σ² = (1/N) ∫ (f(x)/p(x) − I)² p(x) dx
  σ²_est = (1/N) ∫_0^1 (5x⁴ − 1)² dx = 16/(9N)

[Figure: error vs. number of samples, shrinking within the ±σ bounds]
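The example run as code (a sketch, not from the slides; the 16/(9N) value follows from ∫₀¹ (5x⁴ − 1)² dx = 25/9 − 2 + 1 = 16/9):

```python
import random

def estimate(N, rng):
    # <I> = (1/N) sum of 5 * x_i^4, with uniform x_i (p(x) = 1)
    return sum(5.0 * rng.random() ** 4 for _ in range(N)) / N

N = 100_000
est = estimate(N, random.Random(4))
print(est)           # close to the exact value 1.0
print(16 / (9 * N))  # predicted estimator variance
```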


Bias
• When the expected value of the estimator E[⟨I⟩] is exactly the value of the integral I → unbiased
• The bias is:
  B[⟨I⟩] = E[⟨I⟩] − I
• When N goes to infinity:
  lim_{N→∞} B[⟨I⟩] = 0


Accuracy
• Chebyshev's Inequality
  – The probability that a sample deviates from the solution by a value greater than √(σ²/δ) is smaller than δ, where δ is an arbitrary positive number:
    Pr[ |⟨I⟩ − E[⟨I⟩]| ≥ √(σ²/δ) ] ≤ δ
  – If δ = 1/10000, with the estimator variance σ² = σ²_primary/N:
    Pr[ |⟨I⟩ − E[⟨I⟩]| ≥ 100 σ_primary/√N ] ≤ 1/10000
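An empirical check of the inequality (the setup is mine): estimate the mean of U[0, 1] with N = 100 samples, so the estimator variance is σ²_primary/N = 1/(12N); with δ = 0.01 the bound says at most 1% of trials deviate by more than √(σ²/δ):

```python
import random

rng = random.Random(6)
N, trials, delta = 100, 20_000, 0.01
var_primary = 1.0 / 12.0                     # variance of U[0, 1]
threshold = (var_primary / N / delta) ** 0.5  # sqrt(sigma^2 / delta)
exceed = 0
for _ in range(trials):
    m = sum(rng.random() for _ in range(N)) / N
    if abs(m - 0.5) >= threshold:
        exceed += 1
print(exceed / trials, "<=", delta)  # the bound holds (very loosely here)
```

Chebyshev is a worst-case bound; for this well-behaved estimator the observed exceedance fraction is far below δ.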
