Computational Photography

(1)

Computational Photography

Digital Visual Effects, Spring 2007 Yung-Yu Chuang

2007/5/22

with slides by Fredo Durand, Ramesh Raskar, Sylvain Paris, Soonmin Bae

(2)

Computational photography

wikipedia:

Computational photography refers broadly to computational imaging techniques that enhance or extend the capabilities of digital photography.

The output of these techniques is an ordinary photograph, but one that could not have been taken by a traditional camera.

(3)

What is computational photography?

• Convergence of image processing, computer vision, computer graphics and photography

• Digital photography:

Simply mimics traditional sensors and recording by digital technology

Involves only simple image processing

• Computational photography

More elaborate image manipulation, more computation

New types of media (panorama, 3D, etc.)

Camera designs that take computation into account

(4)

Computational photography

• One of the most exciting fields.

• Symposium on Computational Photography and Video, 2005

• Full-semester courses at MIT, CMU, Stanford, GaTech, and the University of Delaware

• A new book by Raskar and Tumblin is coming out at SIGGRAPH 2007.

(5)

Siggraph 2006 Papers (16/86=18.6%)

Hybrid Images

Drag-and-Drop Pasting

Two-scale Tone Management for Photographic Look

Interactive Local Adjustment of Tonal Values

Image-Based Material Editing

Flash Matting

Natural Video Matting using Camera Arrays

Removing Camera Shake From a Single Photograph

Coded Exposure Photography: Motion Deblurring

Photo Tourism: Exploring Photo Collections in 3D

AutoCollage

Photographing Long Scenes With Multi-Viewpoint Panoramas

Projection Defocus Analysis for Scene Capture and Image Display

Multiview Radial Catadioptric Imaging for Scene Capture

Light Field Microscopy

Fast Separation of Direct and Global Components of a Scene Using High Frequency Illumination

(6)

Siggraph 2007 Papers (23/108=21.3%)

Image Deblurring with Blurred/Noisy Image Pairs

Photo Clip Art

Scene Completion Using Millions of Photographs

Soft Scissors: An Interactive Tool for Realtime High Quality Matting

Seam Carving for Content-Aware Image Resizing

Detail-Preserving Shape Deformation in Image Editing

Veiling Glare in High Dynamic Range Imaging

Do HDR Displays Support LDR Content? A Psychophysical Evaluation

Ldr2Hdr: On-the-fly Reverse Tone Mapping of Legacy Video and Photographs

Rendering for an Interactive 360-Degree Light Field Display

Multiscale Shape and Detail Enhancement from Multi-light Image Collections

Post-Production Facial Performance Relighting Using Reflectance Transfer

Active Refocusing of Images and Videos

Multi-aperture Photography

Dappled Photography: Mask-Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing

Image and Depth from a Conventional Camera with a Coded Aperture

Capturing and Viewing Gigapixel Images

Efficient Gradient-Domain Compositing Using Quadtrees

Image Upsampling via Imposed Edge Statistics

Joint Bilateral Upsampling

Factored Time-Lapse Video

Computational Time-Lapse Video

Real-Time Edge-Aware Image Processing With the Bilateral Grid

(7)

Scope

• We can’t yet give it a precise definition. The following are the areas researchers are exploring in this field.

– Record a richer visual experience

– Overcome long-standing limitations of conventional cameras

– Enable new classes of visual signal

– Enable synthesis of impossible photos

(8)

Scope

Image formation

Color and color perception

Demosaicing

(9)

Scope

Panoramic imaging

Image and video registration

Spatial warping operations

(10)

Scope

High Dynamic Range Imaging

Bilateral filtering and HDR display

Matting

(11)

Scope

Active flash methods

Lens technology

Depth and defocus

No-flash

Flash

our result

(12)

Removing Photography Artifacts using Gradient Projection and Flash-Exposure Sampling

(13)

Continuous flash

Flash = 0.0, 0.3, 0.7, 1.0, 1.4

(14)

Flash matting

(15)

Depth Edge Detection and Stylized

Rendering Using a Multi-Flash Camera

(16)

Motion-Based Motion Deblurring

(17)

Removing Camera Shake from a

Single Photograph

(18)

Motion Deblurring using Fluttered Shutter

(19)

Scope

Future cameras

Plenoptic function and light fields

(20)

Scope

Gradient image manipulation

(21)

Scope

Taking great pictures

Art Wolfe Ansel Adams

(22)

Scope

• Non-parametric image synthesis, inpainting, analogies

(23)

Scope

Motion analysis

(24)

Image Inpainting

(25)

Object Removal by

Exemplar-Based Inpainting

(26)

Image Completion with

Structure Propagation

(27)

Lazy snapping

(28)

Lazy snapping

• Pre-segmentation

• Boundary Editing

(29)

Grab Cut - Interactive Foreground

Extraction using Iterated Graph Cuts

(30)

Image Tools

• Gradient domain operations,

Tone mapping, fusion and matting

• Graph cuts,

Segmentation and mosaicing

• Bilateral and Trilateral filters,

Denoising, image enhancement

(31)

Gradient domain operators

(32)

Intensity Gradient in 1D

[Plots: intensity I(x) and gradient G(x)]

Gradient at x (forward difference): G(x) = I(x+1) − I(x)

(33)

Reconstruction from Gradients

[Plots: intensity I(x) and gradient G(x)]

For n intensity values, there are about n gradients.

(34)

Reconstruction from Gradients

[Plots: intensity I(x) and gradient G(x)]

1D integration: I(x) = I(x−1) + G(x) — a cumulative sum.
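The cumulative-sum reconstruction above can be sketched in a few lines (a toy signal; NumPy assumed):

```python
import numpy as np

# 1D reconstruction from the slides: given gradients G(x) = I(x+1) - I(x)
# and the first intensity I(0), recover the signal by a cumulative sum,
# I(x) = I(x-1) + G(x).
I = np.array([3.0, 4.0, 6.0, 5.0, 2.0])
G = I[1:] - I[:-1]                       # forward differences

recovered = np.concatenate(([I[0]], I[0] + np.cumsum(G)))
print(np.allclose(recovered, I))  # True
```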

(35)

1D case with constraints

Seamlessly paste onto

Just add a linear function so that the boundary condition is respected

(36)

Discrete 1D example: minimization

• Copy to

• Min ((f2 − f1) − 1)²

• Min ((f3 − f2) − (−1))²

• Min ((f4 − f3) − 2)²

• Min ((f5 − f4) − (−1))²

• Min ((f6 − f5) − (−1))²

[Figure: 1D paste — source gradients +1, −1, +2, −1, −1; unknowns f2…f5]

with f1 = 6, f6 = 1

(37)

1D example: minimization

• Copy to

• Min ((f2 − 6) − 1)² ⇒ f2² − 14f2 + 49

• Min ((f3 − f2) − (−1))² ⇒ f3² + f2² − 2f2f3 + 2f3 − 2f2 + 1

• Min ((f4 − f3) − 2)² ⇒ f4² + f3² − 2f3f4 − 4f4 + 4f3 + 4

• Min ((f5 − f4) − (−1))² ⇒ f5² + f4² − 2f4f5 + 2f5 − 2f4 + 1

• Min ((1 − f5) − (−1))² ⇒ f5² − 4f5 + 4


(38)

1D example: big quadratic

Copy to

Min Q, where

Q = (f2² − 14f2 + 49)
+ (f3² + f2² − 2f2f3 + 2f3 − 2f2 + 1)
+ (f4² + f3² − 2f3f4 − 4f4 + 4f3 + 4)
+ (f5² + f4² − 2f4f5 + 2f5 − 2f4 + 1)
+ (f5² − 4f5 + 4)


(39)

1D example: derivatives

Copy to


Min Q, where

Q = (f2² − 14f2 + 49)
+ (f3² + f2² − 2f2f3 + 2f3 − 2f2 + 1)
+ (f4² + f3² − 2f3f4 − 4f4 + 4f3 + 4)
+ (f5² + f4² − 2f4f5 + 2f5 − 2f4 + 1)
+ (f5² − 4f5 + 4)

(40)

1D example: set derivatives to zero

Copy to


∂Q/∂f2 = 4f2 − 2f3 − 16 = 0
∂Q/∂f3 = 4f3 − 2f2 − 2f4 + 6 = 0
∂Q/∂f4 = 4f4 − 2f3 − 2f5 − 6 = 0
∂Q/∂f5 = 4f5 − 2f4 − 2 = 0

⇒ f2 = 6, f3 = 4, f4 = 5, f5 = 3
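Setting the derivatives of Q to zero yields a small linear system; a sketch that solves it numerically (NumPy assumed):

```python
import numpy as np

# Quadratic Q from the slides: match pasted gradients [1, -1, 2, -1, -1]
# with boundary constraints f1 = 6, f6 = 1. Setting dQ/dfi = 0 gives a
# linear system A f = b for the unknowns f2..f5.
A = np.array([[ 4, -2,  0,  0],
              [-2,  4, -2,  0],
              [ 0, -2,  4, -2],
              [ 0,  0, -2,  4]], dtype=float)
b = np.array([16.0, -6.0, 6.0, 2.0])

f = np.linalg.solve(A, b)
print(f)  # [6. 4. 5. 3.]
```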

(41)

1D example

• Copy to


(42)

1D example: remarks

Copy to

Matrix is sparse

Matrix is symmetric

Everything is a multiple of 2

because square and derivative of square

Matrix is a convolution (kernel -2 4 -2)

Matrix is independent of the gradient field; only the RHS depends on it

Matrix is a second derivative


(43)

Grad X

Grad Y

Intensity Gradient in 2D

Gradient at (x, y) as forward differences:

Gx(x,y) = I(x+1, y) − I(x,y)
Gy(x,y) = I(x, y+1) − I(x,y)
G(x,y) = (Gx, Gy)
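The 2D forward differences can be sketched as (NumPy assumed; `forward_gradients` is an illustrative helper):

```python
import numpy as np

def forward_gradients(I):
    """Forward-difference gradients Gx, Gy of a 2D intensity image."""
    Gx = np.zeros_like(I, dtype=float)
    Gy = np.zeros_like(I, dtype=float)
    Gx[:, :-1] = I[:, 1:] - I[:, :-1]   # Gx(x,y) = I(x+1,y) - I(x,y)
    Gy[:-1, :] = I[1:, :] - I[:-1, :]   # Gy(x,y) = I(x,y+1) - I(x,y)
    return Gx, Gy

I = np.outer(np.ones(3), np.arange(4.0))  # horizontal ramp image
Gx, Gy = forward_gradients(I)
```

On a horizontal ramp, Gx is 1 everywhere (except the last column, left at 0) and Gy is 0.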

(44)

Grad X

Grad Y

2D integration of image intensity gradients

Sanity check: recovering the original image.

Solve the Poisson equation — a 2D linear system.
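The sanity check can be sketched on a toy image: build one linear equation per forward-difference gradient, pin one pixel to fix the integration constant, and solve in the least-squares sense (NumPy assumed; a dense stand-in for the sparse Poisson solve):

```python
import numpy as np

# Recover a tiny 4x4 image from its forward-difference gradients by
# solving the resulting linear system. One extra equation pins I[0,0],
# since gradients determine the image only up to a constant.
rng = np.random.default_rng(0)
I = rng.random((4, 4))
h, w = I.shape
n = h * w
idx = lambda y, x: y * w + x

rows, rhs = [], []
for y in range(h):
    for x in range(w - 1):        # Gx(x,y) = I(x+1,y) - I(x,y)
        r = np.zeros(n); r[idx(y, x + 1)] = 1; r[idx(y, x)] = -1
        rows.append(r); rhs.append(I[y, x + 1] - I[y, x])
for y in range(h - 1):
    for x in range(w):            # Gy(x,y) = I(x,y+1) - I(x,y)
        r = np.zeros(n); r[idx(y + 1, x)] = 1; r[idx(y, x)] = -1
        rows.append(r); rhs.append(I[y + 1, x] - I[y, x])
r = np.zeros(n); r[0] = 1         # pin the integration constant
rows.append(r); rhs.append(I[0, 0])

sol, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
recovered = sol.reshape(h, w)
print(np.allclose(recovered, I))  # True
```

Real images use a sparse solver on the Poisson equation instead of a dense least-squares system.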

(45)

Intensity Gradient Manipulation — a common pipeline:

Grad X, Grad Y → modify gradients → New Grad X, New Grad Y → 2D integration

(46)

2D case with constraints

Given a vector field v (the pasted gradient), find the values of f in the unknown region Ω that minimize ∫∫_Ω |∇f − v|², with f equal to the known background f* on the boundary of Ω.

(47)

Poisson image editing

(48)

Problems with direct cloning

From Perez et al. 2003

(49)

Solution: clone gradient

(50)

Result

(51)
(52)
(53)
(54)
(55)

Reduce big gradients

• Dynamic range compression

• Fattal et al. 2002

(56)

Seamless Image Stitching in the Gradient Domain

Anat Levin, Assaf Zomet, Shmuel Peleg, and Yair Weiss

http://www.cs.huji.ac.il/~alevin/papers/eccv04-blending.pdf
http://eprints.pascal-network.org/archive/00001062/01/tips05-blending.pdf

• Various strategies (optimal cut, feathering)

(57)

Gradient tone mapping

• Fattal et al. Siggraph 2002

Slide from Siggraph 2005 by Raskar (Graphs by Fattal et al.)

(58)

Gradient attenuation

From Fattal et al.

(59)

Fattal et al. Gradient tone mapping

(60)

Poisson Matting

Sun et al. Siggraph 2004

Assume gradient of F & B is negligible

Plus various image-editing tools to refine matte

(61)

Interactive Local Adjustment of Tonal Values

Dani Lischinski, Zeev Farbman The Hebrew University

Matt Uyttendaele, Richard Szeliski Microsoft Research

(62)

Background (1)

Dodging and burning brushes in the darkroom were the only tools between the camera shutter and the final photograph.

But it is tedious, time-consuming and painstaking!

(63)

Background (2)

• A large arsenal of adjustment tools

• Hard to master these tools
– takes time to learn and use

• Tedious and time-consuming
– requires professional, experienced skill
– too many layer masks

• Cannot meet some requirements [Adobe Photoshop CS2, 2005]

(64)

Background (2)

[Adobe Photoshop CS2, 2005]

Result Layer mask

Original image

(65)

Related Work: Tone Mapping Operators

• Global operators

[Ward Larson et al. 1997; Reinhard et al. 2002; Drago et al. 2003]

– Usually fast

• Local operators

[Fattal et al. 2002; Reinhard et al. 2002; Li et al. 2005] …

– Better at preserving local contrast
– Sometimes introduce visual artifacts

(66)

Limitations of Tone Mapping Operators

• Lack of direct local control

– Can’t directly manipulate a particular region

• Not guaranteed to converge to a subjectively satisfactory result

– Involves several trial-and-error iterations
– Changes the entire image at each iteration

(67)
(68)
(69)
(70)

Algorithm Overview

1. Load a digital negative, a camera RAW file, an HDR radiance map, or an ordinary image

2. Indicate regions in the image that require adjusting

3. Experiment with the available adjustment parameters until a satisfactory result is obtained in the desired regions

4. Iterate steps 2 and 3 until the image is satisfactory

(71)
(72)
(73)
(74)
(75)
(76)
(77)
(78)
(79)
(80)

An Example

(81)

Region Selection: Strokes and Brushes

• Basic brush

• Luminance brush

weight = 1 for the selected pixels under the brush; weight = 0 elsewhere

(82)

Region Selection: Luminance Brush

(83)

Region Selection: Strokes and Brushes

• Basic brush

• Luminance brush

• Lumachrome brush (chromaticity)

• Over-exposure brush

• Under-exposure brush

(84)

Constraint Propagation

User strokes Adjusted exposure

(85)

Image-guided Energy Minimization

Data term + smoothing term

(86)

Image-guided Energy Minimization

data term + smoothing term

L: the log-luminance channel
α: sensitivity factor
ε: a small constant preventing division by zero
λ: a balance factor

(87)

Standard Finite Differences

(88)

Fast Approximate Solution

Solved iteratively by preconditioned conjugate gradients (PCG) [Saad 2003]

(89)

Interactive Local Adjustment of Tonal Value

f = argmin_f Σ_x w(x) (f(x) − g(x))² + λ Σ_x h(∇f(x), ∇L(x))

(90)

Results

(91)

Graph cut

(92)

Graph cut

• Interactive image segmentation using graph cut

• Binary label: foreground vs. background

• User labels some pixels

similar to trimap, usually sparser

• Exploit

Statistics of known Fg & Bg Smoothness of label

• Turn into discrete graph optimization

Graph cut (min cut / max flow)


(93)

Energy function

Labeling: one value per pixel, F or B

Energy(labeling) = data + smoothness

Very general situation Will be minimized

Data: for each pixel

Probability that this color belongs to F (resp. B)
Similar in spirit to Bayesian matting

Smoothness (aka regularization):

per neighboring pixel pair

Penalty for having different labels; the penalty is downweighted if the two pixel colors are very different
Similar in spirit to the bilateral filter

One labeling (ok, not best)

Data

Smoothness

(94)

Data term

• A.k.a regional term

(because integrated over full region)

• D(L) = Σ_i −log h[L_i](C_i)

where i is a pixel, L_i is the label at i (F or B), C_i is the pixel value, and h[L_i] is the histogram of the observed foreground (resp. background)

• Note the minus sign
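A toy sketch of this data term, with illustrative 8-bin histograms built from hypothetical scribbles (the sample values and bin count are assumptions, not from the slides):

```python
import numpy as np

# Sketch of the regional data term D(L) = sum_i -log h[L_i](C_i):
# per-pixel cost of assigning label F or B, from normalized intensity
# histograms of user-scribbled foreground and background pixels.
bins = 8
fg_samples = np.array([200, 210, 220, 230])   # bright foreground scribbles
bg_samples = np.array([10, 20, 30, 40])       # dark background scribbles

def histogram(samples):
    h, _ = np.histogram(samples, bins=bins, range=(0, 256))
    return (h + 1e-6) / (h.sum() + 1e-6 * bins)  # avoid log(0)

h_fg, h_bg = histogram(fg_samples), histogram(bg_samples)

def data_cost(pixel_value, label):
    h = h_fg if label == "F" else h_bg
    return -np.log(h[min(pixel_value * bins // 256, bins - 1)])

# A bright pixel is much cheaper to label F than B.
print(data_cost(215, "F") < data_cost(215, "B"))  # True
```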

(95)

Hard constraints

• The user has provided some labels

• The quick and dirty way to include

constraints into optimization is to replace the data term by a huge penalty if not respected.

• D(L_i) = 0 if the user label is respected

• D(L_i) = K if not respected, where K is a huge constant (e.g. on the order of the number of pixels)

(96)

Smoothness term

• a.k.a boundary term, a.k.a. regularization

• S(L) = Σ_{{i,j} ∈ N} B(C_i, C_j) δ(L_i − L_j)

• Where i,j are neighbors

e.g. 8-neighborhood

(but I show 4 for simplicity)

• δ(Li-Lj) is 0 if Li=Lj, 1 otherwise

• B(Ci,Cj) is high when Ci and Cj are similar, low if there is a discontinuity between those two pixels

e.g. B(C_i, C_j) = exp(−||C_i − C_j||² / 2σ²), where σ can be a constant or the local variance

• Note positive sign

(97)

Optimization

• E(L)=D(L)+λ S(L)

• λ is a black-magic constant

• Find the labeling that minimizes E

• In this case, how many possibilities?

2⁹ = 512

We can try them all!

What about megapixel images?
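For a tiny 3×3 image the 2⁹ labelings can indeed all be tried; a brute-force sketch of E(L) = D(L) + λS(L) with an illustrative data term (the image, λ, and σ are assumptions for the toy example; NumPy assumed):

```python
import itertools
import numpy as np

# Brute force over all 2^9 labelings of a 3x3 image, minimizing
# E(L) = D(L) + lambda * S(L). Real images need graph cuts instead.
img = np.array([[0.9, 0.8, 0.1],
                [0.9, 0.7, 0.1],
                [0.8, 0.2, 0.1]])
lam = 0.5

def data(labeling):
    # toy data term: F (label 1) likes bright pixels, B likes dark ones
    return sum((1 - v) if l == 1 else v
               for v, l in zip(img.ravel(), labeling))

def smooth(labeling):
    L = np.array(labeling).reshape(3, 3)
    cost = 0.0
    for y in range(3):
        for x in range(3):
            for dy, dx in ((0, 1), (1, 0)):     # 4-neighborhood
                if y + dy < 3 and x + dx < 3 and L[y, x] != L[y + dy, x + dx]:
                    # penalty downweighted across strong intensity edges
                    cost += np.exp(-(img[y, x] - img[y + dy, x + dx]) ** 2
                                   / (2 * 0.1 ** 2))
    return cost

best = min(itertools.product([0, 1], repeat=9),
           key=lambda L: data(L) + lam * smooth(L))
print(np.array(best).reshape(3, 3))  # bright region labeled 1, dark labeled 0
```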

(98)

Labeling as a graph problem

• Each pixel = node

• Add two nodes F & B

• Labeling: link each pixel to either F or B

F

B

Desired result

(99)

Data term

• Put one edge between each pixel and each of F & B

• Weight of edge = minus data term

Don’t forget huge weight for hard constraints Careful with sign

B

F

(100)

Smoothness term

• Add an edge between each neighbor pair

• Weight = smoothness term

B

F

(101)

Min cut

• Energy optimization equivalent to min cut

• Cut: remove edges to disconnect F from B

• Minimum: minimize sum of cut edge weight

B

F

cut

(102)

Min cut <=> labeling

• In order to be a cut:

For each pixel, either the F or the B edge has to be cut

• In order to be minimal

Only one edge label per pixel can be cut (otherwise could

be added)

B

F

cut

(103)

Computing a multiway cut

• With 2 labels: classical min-cut problem

Solvable by standard flow algorithms

polynomial time in theory, nearly linear in practice

More than 2 terminals: NP-hard

[Dahlhaus et al., STOC ‘92]

• Efficient approximation algorithms exist

Within a factor of 2 of optimal

Computes local minimum in a strong sense

even very large moves will not improve the energy

Yuri Boykov, Olga Veksler and Ramin Zabih, Fast Approximate Energy Minimization via Graph Cuts, International Conference on Computer Vision, September 1999.

(104)

Move examples

Starting point

Red-blue swap move

Green expansion move

(105)

GrabCut: Interactive Foreground Extraction using Iterated Graph Cuts

Carsten Rother, Vladimir Kolmogorov, Andrew Blake

Microsoft Research Cambridge, UK

(106)

Agrawala et al, Digital Photomontage, Siggraph 2004

(107)
(108)

Source images Brush strokes Computed labeling

Composite

(109)

Brush strokes Computed labeling

Graph Cuts for Segmentation and Mosaicing

(110)

Interactive Digital Photomontage

• Extended depth of field

(111)

Interactive Digital Photomontage

• Relighting

(112)

Interactive Digital Photomontage

(113)

Bilateral filtering

[Ben Weiss, Siggraph 2006]

Input · Log(Intensity) · Bilateral Smoothing · Gaussian Smoothing

(114)

Image Denoising

noisy image → naïve denoising (Gaussian blur) → better denoising (edge-preserving filter)

Smoothing an image without blurring its edges.

(115)

A Wide Range of Options

• Diffusion, Bayesian, Wavelets…

– All have their pros and cons.

• Bilateral filter

– not always the best result [Buades 05], but often good
– easy to understand, adapt and set up

(116)

Start with Gaussian filtering

• Here, input is a step function + noise

J = f ⊗ I   (output J = input I convolved with filter f)

(117)

Start with Gaussian filtering

• Spatial Gaussian f


(118)

Start with Gaussian filtering

• Output is blurred


(119)

Gaussian filter as weighted average

• Weight of ξ depends on its distance to x

J(x) = Σ_ξ f(x, ξ) I(ξ)

(120)

The problem of edges

• Here, I(ξ) “pollutes” our estimate J(x)
• It is too different from I(x)

J(x) = Σ_ξ f(x, ξ) I(ξ)

(121)

Principle of Bilateral filtering

[Tomasi and Manduchi 1998]

• Penalty g on the intensity difference

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

(122)

Bilateral filtering

[Tomasi and Manduchi 1998]

• Spatial Gaussian f

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

(123)

Bilateral filtering

[Tomasi and Manduchi 1998]

• Spatial Gaussian f
• Gaussian g on the intensity difference

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

(124)

Normalization factor

[Tomasi and Manduchi 1998]

• k(x) = Σ_ξ f(x, ξ) g(I(ξ) − I(x))

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

(125)

Bilateral filtering is non-linear

[Tomasi and Manduchi 1998]

• The weights are different for each output pixel

J(x) = (1/k(x)) Σ_ξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
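A direct (slow) implementation of this definition on a 1D signal, with Gaussian f and g (NumPy assumed; parameters are illustrative):

```python
import numpy as np

# Direct 1D bilateral filter, following the slides' definition:
# J(x) = (1/k(x)) * sum_xi f(x,xi) * g(I(xi) - I(x)) * I(xi)
def bilateral_1d(I, sigma_s=2.0, sigma_r=0.2, radius=6):
    I = np.asarray(I, dtype=float)
    J = np.empty_like(I)
    for x in range(len(I)):
        lo, hi = max(0, x - radius), min(len(I), x + radius + 1)
        xi = np.arange(lo, hi)
        f = np.exp(-(xi - x) ** 2 / (2 * sigma_s ** 2))        # space
        g = np.exp(-(I[xi] - I[x]) ** 2 / (2 * sigma_r ** 2))  # range
        w = f * g
        J[x] = np.sum(w * I[xi]) / np.sum(w)                   # k(x) = sum w
    return J

# A noisy step edge: noise is smoothed on each side, but the edge
# stays sharp because g kills weights across the jump.
rng = np.random.default_rng(1)
I = np.concatenate([np.zeros(20), np.ones(20)]) + 0.05 * rng.standard_normal(40)
J = bilateral_1d(I)
```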

(126)

Many Applications based on Bilateral Filter

Tone Mapping [Durand 02]

Virtual Video Exposure [Bennett 05]

And many others…

Flash / No-Flash [Eisemann 04, Petschnigg 04]

[Petschnigg 04]

Tone Management [Bae 06]

(127)

Advantages of Bilateral Filter

• Easy to understand

– Weighted mean of nearby pixels

• Easy to adapt

– Distance between pixel values

• Easy to set up

– Non-iterative

(128)

But Bilateral Filter is Nonlinear

• Slow but some accelerations exist:

– [Elad 02]: Gauss-Seidel iterations

• Only for many iterations

– [Durand 02, Weiss 06]: fast approximation

• No formal understanding of accuracy versus speed

• [Weiss 06]: Only box function as spatial kernel

(129)

A Fast Approximation of the Bilateral Filter

using a Signal Processing Approach

Sylvain Paris and Frédo Durand

Computer Science and Artificial Intelligence Laboratory Massachusetts Institute of Technology

(130)

Definition of Bilateral Filter

• [Smith 97, Tomasi 98]

• Smoothes an image and preserves edges

• Weighted average of neighbors

• Weights

– Gaussian on space distance
– Gaussian on range distance
– weights sum to 1

space range

Input Result

(131)

Contributions

• Link with linear filtering
• Fast and accurate approximation

(132)

Intuition on 1D Signal

BF

(133)


Intuition on 1D Signal

Weighted Average of Neighbors

• Near and similar pixels have influence.

• Far pixels have no influence.

• Pixels with different value have no influence.

weights applied to pixels

(134)


Link with Linear Filtering

1. Handling the Division

sum of weights — the division is handled with a projective space.

(135)

Formalization: Handling the Division

• Normalizing factor as homogeneous coordinate
• Multiply both sides by the normalizing factor k(x)

(136)

Formalization: Handling the Division

• Similar to homogeneous coordinates in projective space

• Division delayed until the end

• Next step: Adding a dimension to make a convolution appear

with Wq=1

(137)

Link with Linear Filtering
2. Introducing a Convolution

space: 1D Gaussian × range: 1D Gaussian → combination: 2D Gaussian

(138)

Link with Linear Filtering
2. Introducing a Convolution

space: 1D Gaussian × range: 1D Gaussian → combination: 2D Gaussian

space × range: corresponds to a 3D Gaussian for a 2D image.

(139)

Link with Linear Filtering

2. Introducing a Convolution

space-range Gaussian (black = zero)

sum all values multiplied by the kernel → convolution

(140)

Link with Linear Filtering
2. Introducing a Convolution

space-range Gaussian → result of the convolution

(141)

Link with Linear Filtering

2. Introducing a Convolution

space-range Gaussian

result of the convolution

(142)

higher-dimensional functions → Gaussian convolution → division → slicing

(143)

Reformulation: Summary

1. Convolution in higher dimension

• expensive but well understood (linear, FFT, etc)

2. Division and slicing

• nonlinear but simple and pixel-wise

Exact reformulation
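A minimal 1D sketch of this reformulation — splat (value, weight) into a coarse space-range grid, run an ordinary Gaussian convolution there, then slice and divide. The sampling steps and kernel size are illustrative, not the paper's exact choices:

```python
import numpy as np

def grid_bilateral_1d(I, sigma_s=4.0, sigma_r=0.1):
    I = np.asarray(I, float)
    # splat into a coarse grid: one cell per sigma in space and range
    xs = (np.arange(len(I)) / sigma_s).round().astype(int)
    rs = ((I - I.min()) / sigma_r).round().astype(int)
    W, H = xs.max() + 1, rs.max() + 1
    wi = np.zeros((W, H)); w = np.zeros((W, H))
    np.add.at(wi, (xs, rs), I)      # homogeneous pair (w * I, w)
    np.add.at(w, (xs, rs), 1.0)

    k = np.exp(-np.arange(-2, 3) ** 2 / 2.0)   # Gaussian, sigma = 1 cell
    k /= k.sum()
    def blur(G):                    # separable convolution on the grid
        G = np.apply_along_axis(lambda v: np.convolve(v, k, "same"), 0, G)
        return np.apply_along_axis(lambda v: np.convolve(v, k, "same"), 1, G)
    wi, w = blur(wi), blur(w)

    # slice at each pixel's grid position, then divide (delayed division)
    return wi[xs, rs] / np.maximum(w[xs, rs], 1e-12)

I = np.concatenate([np.zeros(20), np.ones(20)])  # clean step edge
J = grid_bilateral_1d(I)                         # edge is preserved
```

Nearest-cell splatting and slicing stand in for the trilinear interpolation a real implementation would use.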

(144)

higher-dimensional functions → Gaussian convolution (a low-pass filter) → division → slicing

After the low-pass filter, the result contains almost only low frequencies; the high frequencies are negligible.

(145)

higher-dimensional functions → DOWNSAMPLE → Gaussian convolution → UPSAMPLE → division → slicing

Almost no information loss.

(146)

Fast Convolution by Downsampling

• Downsampling cuts frequencies above Nyquist limit

– Less data to process
– But induces error

• Evaluation of the approximation

– Precision versus running time
– Visual accuracy

(147)

Accuracy versus Running Time

• Finer sampling increases accuracy.

• More precise than previous work.

finer sampling

PSNR as function of Running Time

Digital photograph, 1200 × 1600. The straightforward implementation takes over 10 minutes.

(148)

Visual Results

• Comparison with previous work [Durand 02] — running time = 1 s for both techniques

[Images: input, exact BF, our result, prev. work (1200 × 1600); difference with exact computation shown on a 0–0.1 scale (intensities in [0:1])]

(149)


(150)


(151)


(152)


(153)

Discussion

• Higher dimension ⇒ advantageous formulation
– akin to level sets with topology
– our approach: isolate the nonlinearities
– dimension increase largely offset by downsampling

• Space-range domain has already appeared
– [Sochen 98, Barash 02]: image as an embedded manifold
– new in our approach: image as a dense function

(154)

Conclusions

Practical gain

• Interactive running time

• Visually similar results

• Simple to code (100 lines)

Theoretical gain

• Link with linear filters

• Separation linear/nonlinear

• Signal processing framework

higher dimension ⇒ “better” computation

(155)

Two-scale Tone Management for Photographic Look

Soonmin Bae, Sylvain Paris, and Frédo Durand MIT CSAIL

(156)

Ansel Adams

Ansel Adams, Clearing Winter Storm

(157)

An Amateur Photographer

(158)

A Variety of Looks

(159)

Goals

• Control over photographic look

• Transfer “look” from a model photo

For example, we want

with the look of

(160)

Aspects of Photographic Look

• Subject choice

• Framing and composition ⇒ specified by the input photo

• Tone distribution and contrast ⇒ modified based on the model photo

Input

Model

(161)

Tonal Aspects of Look

Ansel Adams Kenro Izu

(162)

Tonal aspects of Look - Global Contrast

Ansel Adams Kenro Izu

High Global Contrast Low Global Contrast

(163)

Tonal aspects of Look - Local Contrast

Variable amount of texture Texture everywhere

Ansel Adams Kenro Izu

(164)

Overview

Input Image Result

Model

• Transfer look between photographs

– Tonal aspects

(165)

Overview

Local contrast Global contrast

Result

• Separate global and local contrast

Input Image

(166)

Split

Local contrast Global contrast

Input Image

Result

Careful combination

Post-process

Overview

(167)

Split

Global contrast

Input Image

Result

Careful combination

Post-process

Overview

Local contrast

(168)

Split Global vs. Local Contrast

• Naïve decomposition: low vs. high frequency

– Problem: introduces blur & halos

Low frequency High frequency

Halo Blur

Global contrast Local contrast

(169)

Bilateral Filter

• Edge-preserving smoothing [Tomasi 98]

• We build upon tone mapping [Durand 02]

After bilateral filtering Residual after filtering

Global contrast Local contrast

(170)

Bilateral Filter

• Edge-preserving smoothing [Tomasi 98]

• We build upon tone mapping [Durand 02]

After bilateral filtering Residual after filtering

BASE layer DETAIL layer

Global contrast Local contrast

(171)

Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

Local contrast

(172)

Local contrast

Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

(173)

Global Contrast

• Intensity remapping of base layer

Input base Input intensity After remapping

Remapped intensity

(174)

Global Contrast (Model Transfer)

• Histogram matching

– Remapping function given the input and model histograms

[Images: model base, input base, output base]
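Histogram matching can be sketched as quantile mapping (NumPy assumed; a simplified stand-in for the paper's exact procedure, and `match_histogram` is an illustrative helper):

```python
import numpy as np

# Map each input value to the model value at the same quantile, so the
# output's histogram (and thus global contrast) matches the model's.
def match_histogram(input_vals, model_vals):
    src = np.sort(input_vals.ravel())
    ref = np.sort(model_vals.ravel())
    quantiles = np.linspace(0, 1, len(src))
    ranks = np.searchsorted(src, input_vals.ravel()) / (len(src) - 1)
    out = np.interp(ranks, quantiles, ref)
    return out.reshape(input_vals.shape)

inp = np.array([0.0, 0.5, 1.0, 0.25, 0.75])
model = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
matched = match_histogram(inp, model)
print(matched)  # [10. 14. 18. 12. 16.]
```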

(175)

Local contrast Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

Intensity matching

(176)

Local contrast

Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

Intensity matching

(177)

Local Contrast: Detail Layer

• Uniform control:

– Multiply all values in the detail layer

Input Base + 3 × Detail
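The base/detail pipeline can be sketched on a 1D toy signal — base = bilateral filter of log intensity, detail = residual, then recombine as Base + 3 × Detail (a minimal sketch assuming a [Durand 02]-style decomposition; the signal and parameters are illustrative):

```python
import numpy as np

# BASE = bilateral filter of the log intensity, DETAIL = residual.
# Uniform local-contrast control multiplies the detail layer.
def bilateral_1d(I, sigma_s=3.0, sigma_r=0.2):
    J = np.empty_like(I)
    xi = np.arange(len(I))
    for x in range(len(I)):
        w = np.exp(-(xi - x) ** 2 / (2 * sigma_s ** 2)) \
          * np.exp(-(I - I[x]) ** 2 / (2 * sigma_r ** 2))
        J[x] = np.sum(w * I) / np.sum(w)
    return J

intensity = np.concatenate([np.full(20, 0.1), np.full(20, 1.0)])
log_i = np.log10(intensity)
base = bilateral_1d(log_i)     # large-scale tonal variation
detail = log_i - base          # local contrast (texture)

# boost local contrast 3x while keeping global contrast: Base + 3 x Detail
result = 10 ** (base + 3 * detail)
```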

(178)

The amount of local contrast is not uniform

Smooth region

Textured region

(179)

Local Contrast Variation

• We define “textureness”: amount of local contrast

– at each pixel based on surrounding region

Smooth region ⇒ low textureness

Textured region ⇒ high textureness

(180)

“Textureness”: 1D Example

Input signal: smooth region ⇒ low textureness; textured region ⇒ high textureness.

High frequency H → amplitude |H| → edge-preserving filter.

Smooth region ⇒ small high frequency; textured region ⇒ large high frequency.

Previous work: low pass of |H| [Li 05, Su 05].

(181)

Textureness

Input Textureness

(182)

Textureness Transfer

Step 1: histogram transfer — input textureness → desired textureness, given the model textureness

Step 2: scaling the detail layer per pixel (e.g. ×0.5, ×2.7, ×4.3) to match the desired textureness

(183)

Local contrast Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

Intensity matching

Textureness matching

(184)

Local contrast Global contrast

Input Image

Result

Careful combination

Post-process

Bilateral Filter

Intensity matching

Textureness matching

(185)

An Imperfect Result

• Decoupled and large modifications (up to 6×)

⇒ Limited defects may appear

input (HDR)

result after global and local adjustments
