
Direct lighting via Monte Carlo integration

Academic year: 2022

(1)

Surface Integrators

Digital Image Synthesis, Yung-Yu Chuang

with slides by Peter Shirley, Pat Hanrahan, Henrik Jensen, Mario Costa Sousa and Torsten Moller

(2)

Direct lighting via Monte Carlo integration

diffuse

(3)

Direct lighting via Monte Carlo integration

parameterization over hemisphere

parameterization over surface

have to add visibility

(4)

Direct lighting via Monte Carlo integration

take one sample according to a density function

let’s take
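The one-sample estimate above can be sketched as a tiny helper (a hedged illustration with scalar stand-ins for the spectral quantities; `OneSampleEstimate` is a hypothetical name):

```cpp
#include <cassert>

// One-sample Monte Carlo estimate of the direct-lighting integral:
// draw a single direction w_j with density p(w_j) and return
// f * Ld * |cos| / p.  All inputs are scalars for illustration.
float OneSampleEstimate(float f, float Ld, float absCos, float pdf) {
    if (pdf <= 0.f) return 0.f;   // invalid sample contributes nothing
    return f * Ld * absCos / pdf;
}
```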

(5)

Direct lighting via Monte Carlo integration

1 sample/pixel

100 samples/pixel

Lights’ sizes matter more than shapes. Noisy because
• x′ could be on the back
• cos θ varies

(6)

Noise reduction

choose better density function:

p(ω) = 1 / (2π (1 − cos θ_max))

It is equivalent to uniformly sampling over the cone cap in the last lecture.
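The cone-cap density above can be paired with a sampling routine; a sketch in the spirit of pbrt's `UniformSampleCone`/`UniformConePdf` (the `Vec3` type here is a local stand-in):

```cpp
#include <cassert>
#include <cmath>

// Uniformly sample a direction inside the cone around +z with half-angle
// theta_max, from uniform (u1, u2) in [0,1).  The pdf is constant over
// the cone cap: 1 / (2*pi*(1 - cos(theta_max))).
struct Vec3 { float x, y, z; };

Vec3 UniformSampleCone(float u1, float u2, float cosThetaMax) {
    float cosTheta = 1.f - u1 * (1.f - cosThetaMax); // interpolate in cos
    float sinTheta = std::sqrt(std::fmax(0.f, 1.f - cosTheta * cosTheta));
    float phi = 2.f * 3.14159265358979f * u2;
    return { sinTheta * std::cos(phi), sinTheta * std::sin(phi), cosTheta };
}

float UniformConePdf(float cosThetaMax) {
    return 1.f / (2.f * 3.14159265358979f * (1.f - cosThetaMax));
}
```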

(7)

Direct lighting from many luminaries

• Given a random-number pair, use it to select a light and generate a new pair for sampling that light.

• α could be constant or proportional to power
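Remapping a single random number both to choose a light and to yield a fresh number for sampling it can be sketched as (hypothetical helper, assuming uniform selection, i.e. constant α):

```cpp
#include <cassert>

// Select one of nLights lights from a single uniform u in [0,1), and
// rescale u so the remainder can be reused as a fresh uniform number
// for sampling the chosen light.  Power-proportional selection would
// instead walk a CDF over the lights' powers.
int SelectLight(float u, int nLights, float *uRemapped) {
    int i = int(u * nLights);
    if (i > nLights - 1) i = nLights - 1;    // guard against u == 1
    *uRemapped = u * nLights - float(i);     // back to [0,1)
    return i;
}
```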

(8)

Main rendering loop

void Scene::Render() {
  Sample *sample = new Sample(surfaceIntegrator,
                              volumeIntegrator, this);
  ...
  while (sampler->GetNextSample(sample)) {
    RayDifferential ray;
    float rW = camera->GenerateRay(*sample, &ray);
    <Generate ray differentials for camera ray>
    float alpha;
    Spectrum Ls = 0.f;
    if (rW > 0.f)
      Ls = rW * Li(ray, sample, &alpha);
    ...
    camera->film->AddSample(*sample, ray, Ls, alpha);
    ...
  }
  ...
  camera->film->WriteImage();
}

(9)

Scene::Li

Spectrum Scene::Li(RayDifferential &ray,
                   Sample *sample, float *alpha) {
  Spectrum Lo = surfaceIntegrator->Li(…);
  Spectrum T  = volumeIntegrator->Transmittance(…);
  Spectrum Lv = volumeIntegrator->Li(…);
  return T * Lo + Lv;
}

(Lo: surface radiance, Lv: in-scattered volume radiance, T: transmittance)

(10)

Surface integrators

• Responsible for evaluating the integral equation

• core/transport.* integrator/*

Whitted, directlighting, path, bidirectional, irradiancecache, photonmap, igi, exphotonmap

class COREDLL Integrator {
  Spectrum Li(Scene *scene, RayDifferential
    &ray, Sample *sample, float *alpha);
  void Preprocess(Scene *scene);
  void RequestSamples(Sample*, Scene*);
};
class SurfaceIntegrator : public Integrator

(11)

Surface integrators

• void Preprocess(const Scene *scene)
Called after the scene has been initialized; do scene-dependent computation such as photon shooting for photon mapping.

• void RequestSamples(Sample *sample, const Scene *scene)
Sample is allocated once in Render(). There, sample's constructor will call the integrator's RequestSamples to allocate appropriate space.

Sample::Sample(SurfaceIntegrator *surf,
               VolumeIntegrator *vol, const Scene *scene) {
  // calculate required number of samples
  // according to integration strategy
  surf->RequestSamples(this, scene);
  ...

(12)

Direct lighting

Rendering equation:

L_o(p, ω_o) = L_e(p, ω_o) + ∫ f(p, ω_o, ω_i) L_i(p, ω_i) |cos θ_i| dω_i

If we only consider direct lighting, we can replace L_i by L_d:

L_o(p, ω_o) = L_e(p, ω_o) + ∫ f(p, ω_o, ω_i) L_d(p, ω_i) |cos θ_i| dω_i

• simplest form of equation

• somewhat easy to solve (but a gross approximation)

• kind of what we do in Whitted ray tracing

• Not too bad since most energy comes from direct lights

(13)

Direct lighting

• Monte Carlo sampling to solve ∫ f(p, ω_o, ω_i) L_d(p, ω_i) |cos θ_i| dω_i

• Sampling strategy A: sample only one light
– pick up one light as the representative for all lights
– distribute N samples over that light
– use multiple importance sampling for f and L_d

(1/N) Σ_{j=1}^{N} f(p, ω_o, ω_j) L_d(p, ω_j) |cos θ_j| / p(ω_j)

– scale the result by the number of lights

This is like randomly picking f or g, sampling it, and multiplying the result by 2 to estimate E[f+g].
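Strategy A as a sketch: average N samples from the chosen light and scale by the light count so the estimator stays unbiased for the sum over all lights (the per-sample values `Ld[j]` stand in for precomputed f·L·|cos θ|/p terms):

```cpp
#include <cassert>

// Sample-one-light estimator: average N contribution samples from a
// single uniformly chosen light, then scale by the number of lights.
float SampleOneLightEstimate(const float *Ld, int N, int nLights) {
    float sum = 0.f;
    for (int j = 0; j < N; ++j) sum += Ld[j];
    return float(nLights) * sum / float(N);
}
```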

(14)

Direct lighting

• Sampling strategy B: sample all lights
– do A for each light
– sum the results
– a smarter way would be to sample lights according to their power

Σ_{j=1}^{N_L} ∫ f(p, ω_o, ω_i) L_{d(j)}(p, ω_i) |cos θ_i| dω_i

This is like sampling f and g separately and then summing them to estimate E[f+g].

(15)

DirectLighting

enum LightStrategy {
  SAMPLE_ALL_UNIFORM, SAMPLE_ONE_UNIFORM,
  SAMPLE_ONE_WEIGHTED
};

Possible strategies: if there are many image samples for a pixel (e.g. due to depth of field), we prefer only sampling one light at a time. On the other hand, if there are few image samples, we prefer sampling all lights at once.

class DirectLighting : public SurfaceIntegrator {
public:
  DirectLighting(LightStrategy ls, int md);  // md: maximal depth
  ...
}

(16)

RequestSamples

• Different types of lights require different numbers of samples, usually 2D samples.
• Sampling a BRDF requires 2D samples.
• Selection of BRDF components requires 1D samples.

[Figure: layout of a Sample — the n1D and n2D count arrays are filled in by integrators; the oneD and twoD sample arrays (bsdfComponent, lightSample, bsdfSample) are allocated together in one memory block to avoid cache misses.]

(17)

DirectLighting::RequestSamples

void RequestSamples(Sample *sample, const Scene *scene) {
  if (strategy == SAMPLE_ALL_UNIFORM) {
    u_int nLights = scene->lights.size();
    lightSampleOffset = new int[nLights];
    bsdfSampleOffset = new int[nLights];
    bsdfComponentOffset = new int[nLights];
    for (u_int i = 0; i < nLights; ++i) {
      const Light *light = scene->lights[i];
      int lightSamples
        = scene->sampler->RoundSize(light->nSamples);
      // gives the sampler a chance to adjust to an appropriate value
      lightSampleOffset[i] = sample->Add2D(lightSamples);
      bsdfSampleOffset[i] = sample->Add2D(lightSamples);
      bsdfComponentOffset[i] = sample->Add1D(lightSamples);
    }
    lightNumOffset = -1;
  }

(18)

DirectLighting::RequestSamples

  else {
    lightSampleOffset = new int[1];
    bsdfSampleOffset = new int[1];
    bsdfComponentOffset = new int[1];
    lightSampleOffset[0] = sample->Add2D(1);
    bsdfSampleOffset[0] = sample->Add2D(1);
    bsdfComponentOffset[0] = sample->Add1D(1);
    lightNumOffset = sample->Add1D(1);  // which light to sample
  }
}

(19)

DirectLighting::Li

Spectrum DirectLighting::Li(Scene *scene,
    RayDifferential &ray, Sample *sample, float *alpha) {
  Intersection isect;
  Spectrum L(0.);
  if (scene->Intersect(ray, &isect)) {
    // Evaluate BSDF at hit point
    BSDF *bsdf = isect.GetBSDF(ray);
    Vector wo = -ray.d;
    const Point &p = bsdf->dgShading.p;
    const Normal &n = bsdf->dgShading.nn;
    <Compute emitted light; see next slide>
  }
  else {
    // handle ray with no intersection
  }
  return L;
}

(20)

DirectLighting::Li

L_o(p, ω_o) = L_e(p, ω_o) + ∫ f(p, ω_o, ω_i) L_d(p, ω_i) |cos θ_i| dω_i

L += isect.Le(wo);
if (scene->lights.size() > 0) {
  switch (strategy) {
  case SAMPLE_ALL_UNIFORM:
    L += UniformSampleAllLights(scene, p, n, wo, bsdf,
      sample, lightSampleOffset, bsdfSampleOffset,
      bsdfComponentOffset);
    break;
  case SAMPLE_ONE_UNIFORM:
    L += UniformSampleOneLight(scene, p, n, wo, bsdf,
      sample, lightSampleOffset[0], lightNumOffset,
      bsdfSampleOffset[0], bsdfComponentOffset[0]);
    break;

(21)

DirectLighting::Li

  case SAMPLE_ONE_WEIGHTED:
    // sample according to power
    L += WeightedSampleOneLight(scene, p, n, wo, bsdf,
      sample, lightSampleOffset[0], lightNumOffset,
      bsdfSampleOffset[0], bsdfComponentOffset[0],
      avgY, avgYsample, cdf, overallAvgY);
    break;
  }
}
if (rayDepth++ < maxDepth) {
  // add specular reflected and transmitted contributions
}

This part is essentially the same as the Whitted integrator. The main difference between Whitted and DirectLighting is the way they sample lights. Whitted uses Sample_L to take one sample for each light. DirectLighting uses multiple importance sampling to sample both lights and BRDFs.

(22)

Whitted::Li

...
// Add contribution of each light source
Vector wi;
for (i = 0; i < scene->lights.size(); ++i) {
  VisibilityTester visibility;
  Spectrum Li = scene->lights[i]->
    Sample_L(p, &wi, &visibility);
  if (Li.Black()) continue;
  Spectrum f = bsdf->f(wo, wi);
  if (!f.Black() &&
      visibility.Unoccluded(scene))
    L += f * Li * AbsDot(wi, n) *
         visibility.Transmittance(scene);
}
...

(23)

UniformSampleAllLights

Spectrum UniformSampleAllLights(...) {
  Spectrum L(0.);
  for (u_int i = 0; i < scene->lights.size(); ++i) {
    Light *light = scene->lights[i];
    int nSamples =
      (sample && lightSampleOffset) ?
      sample->n2D[lightSampleOffset[i]] : 1;
    Spectrum Ld(0.);
    for (int j = 0; j < nSamples; ++j)
      // compute contribution for one sample for one light
      Ld += EstimateDirect(...);
    L += Ld / nSamples;
  }
  return L;
}

f(p, ω_o, ω_j) L_d(p, ω_j) |cos θ_j| / p(ω_j)

(24)

UniformSampleOneLight

Spectrum UniformSampleOneLight(...) {
  int nLights = int(scene->lights.size());
  int lightNum;
  if (lightNumOffset != -1)
    lightNum =
      Floor2Int(sample->oneD[lightNumOffset][0]*nLights);
  else
    lightNum = Floor2Int(RandomFloat() * nLights);
  lightNum = min(lightNum, nLights-1);
  Light *light = scene->lights[lightNum];
  return (float)nLights * EstimateDirect(...);
}

(25)

Multiple importance sampling

Combine samples from two densities p_f and p_g (with n_f and n_g samples) using weighting functions w_f and w_g:

(1/n_f) Σ_{i=1}^{n_f} f(X_i) g(X_i) w_f(X_i) / p_f(X_i)  +  (1/n_g) Σ_{j=1}^{n_g} f(Y_j) g(Y_j) w_g(Y_j) / p_g(Y_j)

balance heuristic:

w_s(x) = n_s p_s(x) / Σ_i n_i p_i(x)
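The two heuristics can be written directly from the formulas above; the power heuristic with β = 2 is the variant the `PowerHeuristic` calls on the following slides compute:

```cpp
#include <cassert>
#include <cmath>

// Balance and power (beta = 2) heuristics for two sampling strategies
// taking nf and ng samples with densities fPdf and gPdf.
float BalanceHeuristic(int nf, float fPdf, int ng, float gPdf) {
    return (nf * fPdf) / (nf * fPdf + ng * gPdf);
}
float PowerHeuristic(int nf, float fPdf, int ng, float gPdf) {
    float f = nf * fPdf, g = ng * gPdf;
    return (f * f) / (f * f + g * g);
}
```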

(26)

EstimateDirect

It estimates f(p, ω_o, ω_j) L_d(p, ω_j) |cos θ_j| / p(ω_j).

Spectrum EstimateDirect(Scene *scene, Light *light,
    Point &p, Normal &n, Vector &wo, BSDF *bsdf,
    Sample *sample, int lightSamp, int bsdfSamp,
    int bsdfComponent, u_int sampleNum)
{
  Spectrum Ld(0.);
  float ls1, ls2, bs1, bs2, bcs;
  if (lightSamp != -1 && bsdfSamp != -1 &&
      sampleNum < sample->n2D[lightSamp] &&
      sampleNum < sample->n2D[bsdfSamp]) {
    ls1 = sample->twoD[lightSamp][2*sampleNum];
    ls2 = sample->twoD[lightSamp][2*sampleNum+1];
    bs1 = sample->twoD[bsdfSamp][2*sampleNum];
    bs2 = sample->twoD[bsdfSamp][2*sampleNum+1];
    bcs = sample->oneD[bsdfComponent][sampleNum];
  } else {
    ls1 = RandomFloat();
    ls2 = RandomFloat();
    ...
  }

(27)

Sample light with MIS

Spectrum Li = light->Sample_L(p, n, ls1, ls2, &wi,
                              &lightPdf, &visibility);
if (lightPdf > 0. && !Li.Black()) {
  Spectrum f = bsdf->f(wo, wi);
  if (!f.Black() && visibility.Unoccluded(scene)) {
    Li *= visibility.Transmittance(scene);
    if (light->IsDeltaLight())
      Ld += f * Li * AbsDot(wi, n) / lightPdf;
    else {
      bsdfPdf = bsdf->Pdf(wo, wi);
      float weight = PowerHeuristic(1,lightPdf,1,bsdfPdf);
      Ld += f * Li * AbsDot(wi, n) * weight / lightPdf;
    }
  }
}

f(p, ω_o, ω_j) L_d(p, ω_j) |cos θ_j| w_L(ω_j) / p(ω_j)

(28)

Sample BRDF with MIS

if (!light->IsDeltaLight()) {  // only for non-delta lights and BSDFs
  BxDFType flags = BxDFType(BSDF_ALL & ~BSDF_SPECULAR);
  Spectrum f = bsdf->Sample_f(wo, &wi, bs1, bs2, bcs,
                              &bsdfPdf, flags);
  if (!f.Black() && bsdfPdf > 0.) {
    lightPdf = light->Pdf(p, n, wi);
    if (lightPdf > 0.) {
      // Add light contribution from BSDF sampling
      float weight = PowerHeuristic(1,bsdfPdf,1,lightPdf);
      Spectrum Li(0.f);
      RayDifferential ray(p, wi);
      if (scene->Intersect(ray, &lightIsect)) {
        if (lightIsect.primitive->GetAreaLight() == light)
          Li = lightIsect.Le(-wi);
      } else
        Li = light->Le(ray);  // for infinite area light
      if (!Li.Black()) {
        Li *= scene->Transmittance(ray);
        Ld += f * Li * AbsDot(wi, n) * weight / bsdfPdf;
      }
    }
  }
}

(29)

Direct lighting

(30)

The light transport equation

• The goal of an integrator is to numerically solve the light transport equation, governing the equilibrium distribution of radiance in a scene.

(31)

The light transport equation

(32)

Analytic solution to the LTE

• In general, it is impossible to find an analytic solution to the LTE because of complex BRDFs, arbitrary scene geometry, and intricate visibility.

• For an extremely simple scene, e.g. inside a uniformly emitting Lambertian sphere, it is however possible. This is useful for debugging.

• Radiance should be the same for all points:

L = L_e + ρ_hh L

(33)

Analytic solution to the LTE

L = L_e + ρ_hh L
L = L_e + ρ_hh (L_e + ρ_hh L)
  = L_e + ρ_hh L_e + ρ_hh² (L_e + ρ_hh L)
  = …
L = Σ_{i=0}^{∞} ρ_hh^i L_e = L_e / (1 − ρ_hh)

(34)

Surface form of the LTE

(35)

Surface form of the LTE

These two forms are equivalent, but they represent two

different ways of approaching light transport.

(36)

Surface form of the LTE

(37)

Surface form of the LTE

(38)

Surface form of the LTE

(39)

Delta distribution

(40)

Partition the integrand

(41)

Partition the integrand

(42)

Partition the integrand

(43)

Rendering operators

(44)

Solving the rendering equation

(45)

Successive approximation

(46)

Successive approximation

(47)

Light Transport Notation (Heckbert 1990)

• Regular expression denoting a sequence of events along a light path; alphabet: {L, E, S, D, G}
– L a light source (emitter)
– E the eye
– S specular reflection/transmission
– D diffuse reflection/transmission
– G glossy reflection/transmission
• operators:
– (k)+ one or more of k
– (k)* zero or more of k (iteration)
– (k|k′) a k or a k′ event
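Since these grammars are regular expressions, a path history can be classified with `std::regex` (an illustrative sketch; paths are written light-to-eye here, so Whitted's E[S*](D|G)L reads L(D|G)S*E):

```cpp
#include <cassert>
#include <regex>
#include <string>

// Match a light path against the Whitted grammar E[S*](D|G)L,
// written from the light end: L(D|G)S*E.
bool MatchesWhitted(const std::string &path) {
    static const std::regex re("L(D|G)S*E");
    return std::regex_match(path, re);
}
```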

(48)

Light Transport Notation: Examples

• LSD
– a path starting at a light, having one specular reflection and ending at a diffuse reflection

(49)

Light Transport Notation: Examples

• L(S|D)+DE
– a path starting at a light, having one or more diffuse or specular reflections, then a final diffuse reflection toward the eye

(50)

Light Transport Notation: Examples

• L(S|D)+DE
– a path starting at a light, having one or more diffuse or specular reflections, then a final diffuse reflection toward the eye

(51)
(52)
(53)

Rendering algorithms

• Ray casting: E(D|G)L
• Whitted: E[S*](D|G)L
• Kajiya: E[(D|G|S)+(D|G)]L
• Goral: ED*L

(54)

The rendering equation

(55)

The rendering equation

(56)

The radiosity equation

(57)

Radiosity

• formulate the basic radiosity equation:

B_m = E_m + ρ_m Σ_{n=1}^{N} B_n F_mn

• B_m = radiosity = total energy leaving surface m (energy/unit area/unit time)
• E_m = energy emitted from surface m (energy/unit area/unit time)
• ρ_m = reflectivity, fraction of incident light reflected back into environment
• F_mn = form factor, fraction of energy leaving surface n that lands on surface m
• (A_m = area of surface m)

(58)

Radiosity

• Bring all the B's to one side of the equation:

E_m = B_m − ρ_m Σ_n B_n F_mn

• this leads to the linear system

S B = E,  with S_mn = δ_mn − ρ_m F_mn,  B = (B_1, …, B_N)ᵀ,  E = (E_1, …, E_N)ᵀ
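For small N the gathering equation B_m = E_m + ρ_m Σ_n F_mn B_n can be solved by fixed-point (Jacobi-style) iteration; a sketch assuming F, ρ, and E are given:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Jacobi-style iteration for the radiosity system
// B_m = E_m + rho_m * sum_n F[m][n] * B_n   (gathering form).
// Converges when rho_m * sum_n F[m][n] < 1 for every m.
std::vector<float> SolveRadiosity(const std::vector<std::vector<float>> &F,
                                  const std::vector<float> &rho,
                                  const std::vector<float> &E,
                                  int iters) {
    size_t N = E.size();
    std::vector<float> B = E, Bnext(N);
    for (int it = 0; it < iters; ++it) {
        for (size_t m = 0; m < N; ++m) {
            float gathered = 0.f;
            for (size_t n = 0; n < N; ++n) gathered += F[m][n] * B[n];
            Bnext[m] = E[m] + rho[m] * gathered;
        }
        B = Bnext;
    }
    return B;
}
```

For two fully mutually visible patches with ρ = 0.5 and only the first emitting, the fixed point is B = (4/3, 2/3).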

(59)

Path tracing

• Proposed by Kajiya in his classic SIGGRAPH 1986 paper, "The Rendering Equation", as a solution method for the rendering equation
• Incrementally generates a path of scattering events starting from the camera and ending at light sources in the scene
• Two questions to answer
– How to do it in finite time?
– How to generate one or more paths to compute the estimate?

(60)

Infinite sum

• In general, the longer the path, the less the impact.
• Use Russian roulette after a finite number of bounces
– Always compute the first few terms
– Stop after that with probability q

(61)

Infinite sum

• Take this idea further and instead randomly consider terminating evaluation of the sum at each term with probability q_i
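Russian roulette as a one-line estimator: terminate with probability q, otherwise reweight by 1/(1−q) so the expectation is unchanged (a sketch with a scalar stand-in for the term's value):

```cpp
#include <cassert>

// Russian roulette: with probability q return 0; otherwise evaluate
// and reweight by 1/(1-q).  In expectation this equals the original:
// (1-q) * value/(1-q) + q * 0 = value.
float RouletteEstimate(float value, float q, float xi /* uniform [0,1) */) {
    if (xi < q) return 0.f;       // terminated
    return value / (1.f - q);     // survives, reweighted
}
```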

(62)

Path generation (first trial)

• First, pick up surface i in the scene randomly and uniformly:

p_i = A_i / Σ_j A_j

• Then, pick up a point on this surface randomly and uniformly with probability 1/A_i

• Overall probability of picking a random surface point in the scene:

p_A(p) = (A_i / Σ_j A_j) · (1/A_i) = 1 / Σ_j A_j

(63)

Path generation (first trial)

• This is repeated for each point on the path.
• The last point should be sampled on light sources only.
• If we know characteristics about the scene (such as which objects are contributing most indirect lighting to the scene), we can sample more smartly.
• Problems:
– High variance: only few points are mutually visible, i.e. many of the paths yield zero.
– Incorrect integral: for delta distributions, we rarely find the right path direction.

(64)

Incremental path generation

• For path p̄ = p₀ p₁ … p_{i−1} p_i
– At each p_j, find p_{j+1} according to the BSDF
– At p_{i−1}, find p_i by multiple importance sampling of the BSDF and L

• This algorithm distributes samples according to solid angle instead of area, so the distribution p_A needs to be adjusted:

p_A(p_i) = p_ω |cos θ_i| / ‖p_{i−1} − p_i‖²
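The solid-angle-to-area density conversion above is a one-liner (hypothetical helper name):

```cpp
#include <cassert>

// Convert a solid-angle density at p_{i-1} into an area density at
// p_i:  p_A = p_w * |cos(theta_i)| / ||p_{i-1} - p_i||^2 ,
// where theta_i is measured at the receiving point p_i.
float SolidAngleToAreaPdf(float pW, float absCosTheta, float dist) {
    return pW * absCosTheta / (dist * dist);
}
```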

(65)

Incremental path generation

• Monte Carlo estimator
• The implementation re-uses the path p₀ … p_{i−1} for the new path p₀ … p_i. This introduces correlation, but speed makes up for it.

(66)

Path tracing

(67)

Direct lighting

(68)

Path tracing

8 samples per pixel

(69)

Path tracing

1024 samples per pixel

(70)

Bidirectional path tracing

• Compose one path p̄ from two paths
– p₁ p₂ … p_i started at the camera p₀, and
– q_j q_{j−1} … q₁ started at the light source q₀

p̄ = p₁ … p_i q_j q_{j−1} … q₁

• Modification for efficiency:
– use all paths whose lengths range from 2 to i+j

Helpful for situations in which lights are difficult to reach, and for caustics.

(71)

Bidirectional path tracing

(72)

Noise reduction/removal

• More samples (slow convergence)
• Better sampling (stratified, importance, etc.)
• Filtering
• Caching and interpolation

(73)

Biased approaches

• By introducing bias (making smoothness assumptions), biased methods produce images without high-frequency noise
• Unlike unbiased methods, errors may not be reduced by adding samples in biased methods
• In contrast, when there is little error in the result of an unbiased method, we are confident that it is close to the right answer
• Three biased approaches
– Filtering
– Irradiance caching
– Photon mapping

(74)

The world is more diffuse!

(75)

Filtering

• Noise is high frequency
• Methods:
– Simple filters
– Anisotropic filters
– Energy-preserving filters
• Problems with filtering: everything is filtered (blurred)

(76)

Path tracing (10 paths/pixel)

(77)

3x3 lowpass filter

(78)

3x3 median filter

(79)

Caching techniques

• Irradiance caching: compute irradiance at selected points and interpolate
• Photon mapping: trace photons from the lights and store them in a photon map that can be used during rendering

(80)

Direct illumination

(81)

Global illumination

(82)

Indirect irradiance

Indirect illumination tends to be low frequency

(83)

Irradiance caching

• Introduced by Greg Ward in 1988
• Implemented in the Radiance renderer
• Contributions from indirect lighting often vary smoothly → cache and interpolate results

(84)

Irradiance caching

• Compute indirect lighting at a sparse set of samples
• Interpolate neighboring values from this set of samples
• Issues
– How is the indirect lighting represented?
– How to come up with such a sparse set of samples?
– How to store these samples?
– When and how to interpolate?

(85)

Set of samples

• Indirect lighting is computed on demand; store irradiance in a spatial data structure. If there is no good nearby sample, then compute a new irradiance sample.

• Irradiance (radiance is direction dependent, expensive to store):

E(p) = ∫_{H²} L_i(p, ω_i) |cos θ_i| dω_i

• If the surface is Lambertian,

L_o(p, ω_o) = ∫_{H²} f(p, ω_o, ω_i) L_i(p, ω_i) |cos θ_i| dω_i
            = (ρ/π) ∫_{H²} L_i(p, ω_i) |cos θ_i| dω_i
            = (ρ/π) E(p)

(86)

Set of samples

• For diffuse scenes, irradiance alone is enough information for accurate computation

• For nearly diffuse surfaces (such as Oren-Nayar or a glossy surface with a very wide specular lobe), we can view irradiance caching as making the following approximation:

L_o(p, ω_o) = ∫_{H²} f(p, ω_o, ω_i) L_i(p, ω_i) |cos θ_i| dω_i
            ≈ ρ_hd(p, ω_o) (1/π) E(p)

(ρ_hd: directional reflectance)

(87)

Set of samples

• Not a good approximation for specular surfaces
• specular → Whitted integrator
• Diffuse → irradiance caching
– Interpolate from known points
– Cosine-weighted
– Path tracing sample points

E(p) = ∫_{H²} L_i(p, ω_i) |cos θ_i| dω_i

E(p) ≈ (1/N) Σ_{j=1}^{N} L_i(p, ω_j) |cos θ_j| / p(ω_j)

With cosine-weighted sampling, p(ω_j) = cos θ_j / π, so

E(p) ≈ (π/N) Σ_{j=1}^{N} L_i(p, ω_j)
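With cosine-weighted sampling the cosines cancel, which is exactly the E *= M_PI / nSamples step in IndirectLo later; as a standalone sketch with scalar radiance samples:

```cpp
#include <cassert>
#include <cmath>

// Cosine-weighted irradiance estimate: with p(w) = cos(theta)/pi the
// cosine factors cancel and  E(p) ~= (pi/N) * sum_j L_i(p, w_j).
float EstimateIrradiance(const float *Li, int N) {
    float sum = 0.f;
    for (int j = 0; j < N; ++j) sum += Li[j];
    return 3.14159265f * sum / float(N);
}
```

A uniform unit-radiance hemisphere has irradiance exactly π, which the estimator reproduces for constant samples.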

(88)

Storing samples

{E, p, n, d}

(89)

Interpolating from neighbors

• Skip samples when
– normals are too different
– too far away
– in front

• Weight (ad hoc):

w_i = ( d/d_max + √(1 − N·N_i) )⁻¹

• Final irradiance estimate is simply the weighted sum:

E = Σ_i w_i E_i / Σ_i w_i
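The ad-hoc weight and the weighted average can be sketched directly (hypothetical helpers; pbrt's IrradProcess below uses a squared-error variant of this weight):

```cpp
#include <cassert>
#include <cmath>

// Ad-hoc interpolation weight for a cached irradiance sample.
// d: distance to the cached sample, dMax: its validity radius,
// nDot: dot product of the two surface normals (assumed <= 1).
float CacheWeight(float d, float dMax, float nDot) {
    return 1.f / (d / dMax + std::sqrt(1.f - nDot));
}

// Weighted average of n cached irradiance values E with weights w.
float WeightedIrradiance(const float *E, const float *w, int n) {
    float num = 0.f, den = 0.f;
    for (int i = 0; i < n; ++i) { num += w[i] * E[i]; den += w[i]; }
    return num / den;
}
```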

(90)

IrradianceCache

class IrradianceCache : public SurfaceIntegrator {
  ...
  float maxError;    // how frequently irradiance samples are
                     // computed or interpolated
  int nSamples;      // how many rays for irradiance samples
  int maxSpecularDepth, maxIndirectDepth;
  mutable int specularDepth;  // current depth for specular
}

(91)

IrradianceCache::Li

L += isect.Le(wo);
L += UniformSampleAllLights(scene, p, n, wo, ...);
if (specularDepth++ < maxSpecularDepth) {
  <Trace rays for specular reflection and refraction>
}
--specularDepth;
// Estimate indirect lighting with irradiance cache
...
BxDFType flags = BxDFType(BSDF_REFLECTION |
                          BSDF_DIFFUSE | BSDF_GLOSSY);
L += IndirectLo(p, ng, wo, bsdf, flags, sample, scene);
flags = BxDFType(BSDF_TRANSMISSION |
                 BSDF_DIFFUSE | BSDF_GLOSSY);
L += IndirectLo(p, -ng, wo, bsdf, flags, sample, scene);

(92)

IrradianceCache::IndirectLo

if (!InterpolateIrradiance(scene, p, n, &E)) {
  ... // Compute irradiance at current point
  for (int i = 0; i < nSamples; ++i) {
    <Path tracing to compute radiances along ray
     for irradiance sample>
    E += L;
    float dist = r.maxt * r.d.Length(); // max distance
    sumInvDists += 1.f / dist;
  }
  E *= M_PI / float(nSamples);
  ... // Add computed irradiance value to cache
  octree->Add(IrradianceSample(E,p,n,nSamples/sumInvDists),
              sampleExtent);
}
return .5f * bsdf->rho(wo, flags) * E;

(93)

Octree

• Constructed at Preprocess()

void IrradianceCache::Preprocess(const Scene *scene)
{
  BBox wb = scene->WorldBound();
  Vector delta = .01f * (wb.pMax - wb.pMin);
  wb.pMin -= delta;
  wb.pMax += delta;
  octree = new Octree<IrradianceSample,IrradProcess>(wb);
}

struct IrradianceSample {
  Spectrum E;
  Normal n;
  Point p;
  float maxDist;
};

(94)

InterpolateIrradiance

bool InterpolateIrradiance(const Scene *scene,
    const Point &p, const Normal &n, Spectrum *E) {
  if (!octree) return false;
  IrradProcess proc(n, maxError);
  // Traverse the octree; for each node where the query point
  // is inside, call a method of proc to process each
  // irradiance sample.
  octree->Lookup(p, proc);
  if (!proc.Successful()) return false;
  *E = proc.GetIrradiance();
  return true;
}

(95)

IrradProcess

void IrradProcess::operator()(const Point &p,
    const IrradianceSample &sample)
{
  // Skip if surface normals are too different
  if (Dot(n, sample.n) < 0.01f) return;
  // Skip if it's too far from the sample point
  float d2 = DistanceSquared(p, sample.p);
  if (d2 > sample.maxDist * sample.maxDist) return;
  // Skip if it's in front of point being shaded
  Normal navg = sample.n + n;
  if (Dot(p - sample.p, navg) < -.01f) return;
  // Compute estimated error and possibly use sample
  float err = sqrtf(d2)/(sample.maxDist*Dot(n,sample.n));
  if (err < 1.) {
    float wt = (1.f - err) * (1.f - err);
    E += wt * sample.E; sumWt += wt;
  }
}

w_i = ( d/d_max + √(1 − N·N′) )⁻¹

(96)

Comparison with same limited time

Irradiance caching: blotch artifacts. Path tracing: high-frequency noise.

(97)

Irradiance caching

Irradiance caching; irradiance sample positions

(98)

Photon mapping

• It can handle both diffuse and glossy reflection; specular reflection is handled by recursive ray tracing
• Two-step particle tracing algorithm
• Photon tracing
– Simulate the transport of individual photons
– Photons emitted from sources
– Photons deposited on surfaces
– Photons reflected from surfaces to surfaces
• Rendering
– Collect photons for rendering

(99)

Photon tracing

• Preprocess: cast rays from light sources

(100)

Photon tracing

• Preprocess: cast rays from light sources
• Store photons (position + light power + incoming direction)

(101)

Photon map

• Efficiently store photons for fast access
• Use hierarchical spatial structure (kd-tree)

(102)

Rendering (final gathering)

• Cast primary rays; for the secondary rays, reconstruct irradiance using the k closest stored photons

(103)

Rendering (without final gather)

L_o(p, ω_o) = L_e(p, ω_o) + ∫ f(p, ω_o, ω_i) L_i(p, ω_i) |cos θ_i| dω_i

(104)

Rendering (with final gather)

(105)

Photon mapping results

photon map rendering

(106)

Photon mapping - caustics

• Special photon map for specular reflection and refraction

(107)

Caustics

Path tracing (1,000 paths/pixel) vs. photon mapping

(108)

PhotonIntegrator

class PhotonIntegrator : public SurfaceIntegrator {
  int nCausticPhotons, nIndirectPhotons, nDirectPhotons;
  int nLookup;            // number of photons for interpolation (50~100)
  int specularDepth, maxSpecularDepth;
  float maxDistSquared;   // search distance; too large wastes time,
                          // too small gives not enough samples
  bool directWithPhotons, finalGather;
  int gatherSamples;
}

Left: 100K photons, 50 photons in the radiance estimate.
Right: 500K photons, 500 photons in the radiance estimate.

(109)

Photon map

A kd-tree is used to store photons, decoupled from the scene geometry.

(110)

Photon shooting

• Implemented in the Preprocess method
• Three types of photons (caustic, direct, indirect)

struct Photon {
  Point p;
  Spectrum alpha;  // carried power α
  Vector wi;
};

(111)

Photon shooting

• Use a Halton sequence since the number of samples is unknown beforehand, starting from a sample light with energy L_e(p₀, ω₀)/p(p₀, ω₀). Store photons on non-specular surfaces.

(112)

Photon shooting

void PhotonIntegrator::Preprocess(const Scene *scene)
{
  vector<Photon> causticPhotons;
  vector<Photon> directPhotons;
  vector<Photon> indirectPhotons;
  while (!causticDone || !directDone || !indirectDone) {
    ++nshot;
    <trace a photon path and store contribution>
  }
}

(113)

Photon shooting

Spectrum alpha = light->Sample_L(scene, u[0], u[1],
    u[2], u[3], &photonRay, &pdf);
alpha /= pdf * lightPdf;
while (scene->Intersect(photonRay, &photonIsect)) {
  alpha *= scene->Transmittance(photonRay);
  <record photon depending on type>
  <sample next direction>
  Spectrum fr = photonBSDF->Sample_f(wo, &wi, u1, u2, u3,
      &pdf, BSDF_ALL, &flags);
  alpha *= fr * AbsDot(wi, photonBSDF->dgShading.nn) / pdf;
  photonRay = RayDifferential(photonIsect.dg.p, wi);
  if (nIntersections > 3) {
    // Russian roulette
    if (RandomFloat() > .5f) break;
    alpha /= .5f;
  }
}
