
Computational Photography

Digital Visual Effects, Spring 2007 Yung-Yu Chuang

2007/5/22

with slides by Fredo Durand, Ramesh Raskar, Sylvain Paris, Soonmin Bae

Computational photography

wikipedia:

Computational photography refers broadly to computational imaging techniques that enhance or extend the capabilities of digital photography.

The output of these techniques is an ordinary photograph, but one that could not have been taken by a traditional camera.

What is computational photography?

• Convergence of image processing, computer vision, computer graphics and photography

• Digital photography:

Simply mimics traditional sensors and recording by digital technology

Involves only simple image processing

• Computational photography

More elaborate image manipulation, more computation

New types of media (panorama, 3D, etc.)

Camera designs that take computation into account

Computational photography

• One of the most exciting fields.

• Symposium on Computational Photography and Video, 2005

• Full-semester courses at MIT, CMU, Stanford, GaTech, and the University of Delaware

• A new book by Raskar and Tumblin is coming out at SIGGRAPH 2007.

Siggraph 2006 Papers (16/86 = 18.6%)

Hybrid Images
Drag-and-Drop Pasting
Two-scale Tone Management for Photographic Look
Interactive Local Adjustment of Tonal Values
Image-Based Material Editing
Flash Matting
Natural Video Matting using Camera Arrays
Removing Camera Shake From a Single Photograph
Coded Exposure Photography: Motion Deblurring
Photo Tourism: Exploring Photo Collections in 3D
AutoCollage
Photographing Long Scenes With Multi-Viewpoint Panoramas
Projection Defocus Analysis for Scene Capture and Image Display
Multiview Radial Catadioptric Imaging for Scene Capture
Light Field Microscopy
Fast Separation of Direct and Global Components of a Scene Using High Frequency Illumination

Siggraph 2007 Papers (23/108 = 21.3%)

Image Deblurring with Blurred/Noisy Image Pairs
Photo Clip Art
Scene Completion Using Millions of Photographs
Soft Scissors: An Interactive Tool for Realtime High Quality Matting
Seam Carving for Content-Aware Image Resizing
Detail-Preserving Shape Deformation in Image Editing
Veiling Glare in High Dynamic Range Imaging
Do HDR Displays Support LDR Content? A Psychophysical Evaluation
Ldr2Hdr: On-the-fly Reverse Tone Mapping of Legacy Video and Photographs
Rendering for an Interactive 360-Degree Light Field Display
Multiscale Shape and Detail Enhancement from Multi-light Image Collections
Post-Production Facial Performance Relighting Using Reflectance Transfer
Active Refocusing of Images and Videos
Multi-aperture Photography
Dappled Photography: Mask-Enhanced Cameras for Heterodyned Light Fields and Coded Aperture Refocusing
Image and Depth from a Conventional Camera with a Coded Aperture
Capturing and Viewing Gigapixel Images
Efficient Gradient-Domain Compositing Using Quadtrees
Image Upsampling via Imposed Edge Statistics
Joint Bilateral Upsampling
Factored Time-Lapse Video
Computational Time-Lapse Video
Real-Time Edge-Aware Image Processing With the Bilateral Grid

Scope

• There is no precise definition yet; the following are the areas researchers are exploring in this field.

– Record a richer visual experience

– Overcome long-standing limitations of conventional cameras

– Enable new classes of visual signals

– Enable synthesis of impossible photos

Scope

Image formation

Color and color perception

Demosaicing


Scope

Panoramic imaging

Image and video registration

Spatial warping operations

Scope

High Dynamic Range Imaging

Bilateral filtering and HDR display

Matting

Scope

Active flash methods

Lens technology

Depth and defocus

Removing Photography Artifacts using Gradient Projection and Flash-Exposure Sampling (no-flash input, flash input, result)

Continuous flash: results for Flash = 0.0, 0.3, 0.7, 1.0, 1.4

Flash matting

Depth Edge Detection and Stylized Rendering Using a Multi-Flash Camera

Motion-Based Motion Deblurring

Removing Camera Shake from a Single Photograph

Motion Deblurring using Fluttered Shutter

Scope

Future cameras

Plenoptic function and light fields

Scope

Gradient image manipulation


Scope

Taking great pictures

Art Wolfe Ansel Adams

Scope

• Non-parametric image synthesis, inpainting, analogies

Scope

Motion analysis

Image Inpainting

Object Removal by Exemplar-Based Inpainting

Image Completion with Structure Propagation

Lazy snapping

• Pre-segmentation

• Boundary Editing

GrabCut: Interactive Foreground Extraction using Iterated Graph Cuts

Image Tools

• Gradient domain operations: tone mapping, fusion and matting

• Graph cuts: segmentation and mosaicing

• Bilateral and trilateral filters: denoising, image enhancement

Gradient domain operators

Intensity Gradient in 1D

Plot: intensity I(x) and its gradient G(x) over pixels 1..105.

Gradient at x (forward difference):

G(x) = I(x+1) − I(x)

Reconstruction from Gradients

Plot: intensity I(x) and gradient G(x) over pixels 1..105; can we recover the intensities?

For n intensity values, there are about n gradients.

Reconstruction from Gradients

1D integration:

I(x) = I(x−1) + G(x)   (cumulative sum)
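To make the 1D reconstruction concrete, here is a minimal NumPy sketch (toy values chosen only for illustration) that takes forward differences and recovers the signal by cumulative sum, given the first intensity as the integration constant.

import numpy as np

I = np.array([3.0, 5.0, 4.0, 4.0, 7.0, 6.0])      # toy 1D intensities

# forward difference: G(x) = I(x+1) - I(x)
G = I[1:] - I[:-1]

# 1D integration: cumulative sum, plus the first intensity as integration constant
I_rec = np.concatenate(([I[0]], I[0] + np.cumsum(G)))

assert np.allclose(I, I_rec)                      # exact recovery in 1D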

1D case with constraints

Seamlessly paste a source signal onto a target signal.

Just add a linear function so that the boundary conditions are respected.

Discrete 1D example: minimization

Copy the source gradients (+1, −1, +2, −1, −1) into the target interval:

• Min ((f2 − f1) − 1)²
• Min ((f3 − f2) − (−1))²
• Min ((f4 − f3) − 2)²
• Min ((f5 − f4) − (−1))²
• Min ((f6 − f5) − (−1))²

with the boundary constraints f1 = 6 and f6 = 1.

1D example: minimization

Substituting the constraints f1 = 6 and f6 = 1:

• Min ((f2 − 6) − 1)²  ⇒  f2² − 14 f2 + 49
• Min ((f3 − f2) − (−1))²  ⇒  f3² + f2² − 2 f2 f3 + 2 f3 − 2 f2 + 1
• Min ((f4 − f3) − 2)²  ⇒  f4² + f3² − 2 f3 f4 − 4 f4 + 4 f3 + 4
• Min ((f5 − f4) − (−1))²  ⇒  f5² + f4² − 2 f4 f5 + 2 f5 − 2 f4 + 1
• Min ((1 − f5) − (−1))²  ⇒  f5² − 4 f5 + 4

1D example: big quadratic

Sum all the terms into a single quadratic in f2, f3, f4, f5:

Q = (f2² − 14 f2 + 49)
  + (f3² + f2² − 2 f2 f3 + 2 f3 − 2 f2 + 1)
  + (f4² + f3² − 2 f3 f4 − 4 f4 + 4 f3 + 4)
  + (f5² + f4² − 2 f4 f5 + 2 f5 − 2 f4 + 1)
  + (f5² − 4 f5 + 4)

Denote it Q.

1D example: derivatives

Take the partial derivatives of Q:

∂Q/∂f2 = 4 f2 − 2 f3 − 16
∂Q/∂f3 = 4 f3 − 2 f2 − 2 f4 + 6
∂Q/∂f4 = 4 f4 − 2 f3 − 2 f5 − 6
∂Q/∂f5 = 4 f5 − 2 f4 − 2

1D example: set derivatives to zero

Setting each partial derivative to zero gives a linear system:

4 f2 − 2 f3 = 16
−2 f2 + 4 f3 − 2 f4 = −6
−2 f3 + 4 f4 − 2 f5 = 6
−2 f4 + 4 f5 = 2

1D example: result

Solving the system gives f2 = 6, f3 = 4, f4 = 5, f5 = 3: the pasted values follow the source gradients as closely as possible while meeting the boundary values f1 = 6 and f6 = 1.

1D example: remarks

• The matrix is sparse
• The matrix is symmetric
• Everything is a multiple of 2 (because we square differences and differentiate squares)
• The matrix is a convolution (kernel −2 4 −2)
• The matrix is independent of the gradient field; only the right-hand side depends on it
• The matrix is a second derivative
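As a quick check, a small NumPy sketch (values taken from the example above) that builds the −2 4 −2 system for the four unknowns and solves it:

import numpy as np

# 1D pasting example from the slides: boundary values f1 = 6, f6 = 1,
# and the source gradients (+1, -1, +2, -1, -1) to reproduce.
f1, f6 = 6.0, 1.0
g = np.array([1.0, -1.0, 2.0, -1.0, -1.0])

# System matrix for the unknowns (f2, f3, f4, f5): the -2 4 -2 kernel.
A = np.array([[ 4, -2,  0,  0],
              [-2,  4, -2,  0],
              [ 0, -2,  4, -2],
              [ 0,  0, -2,  4]], dtype=float)

# Right-hand side depends only on the gradient field and the boundary values.
b = 2.0 * (g[:-1] - g[1:])
b[0]  += 2.0 * f1
b[-1] += 2.0 * f6

f = np.linalg.solve(A, b)
print(np.concatenate(([f1], f, [f6])))   # [6. 6. 4. 5. 3. 1.]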

Intensity Gradient in 2D

Gradients at (x, y) as forward differences:

Gx(x, y) = I(x+1, y) − I(x, y)
Gy(x, y) = I(x, y+1) − I(x, y)
G(x, y) = (Gx, Gy)

Sanity check: recover the original image from its gradients by 2D integration — solve the Poisson equation, a 2D linear system.

Intensity Gradient Manipulation: a common pipeline

Compute gradients (Grad X, Grad Y) → modify the gradients (New Grad X, New Grad Y) → 2D integration (Poisson solve) → output image.

2D case with constraints

Given a vector field v (the pasted gradient), find the values of f over the unknown region Ω that minimize

∫∫_Ω |∇f − v|²,   with f fixed to the background values on the boundary of Ω.

Poisson image editing

Problems with direct cloning (from Pérez et al. 2003)

Solution: clone the gradient instead of the pixels.
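To illustrate gradient cloning, here is a small SciPy sketch of Poisson (seamless) cloning for grayscale images. It is a simplified illustration of the idea rather than Pérez et al.'s implementation, and it assumes the mask does not touch the image border.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def poisson_clone(target, source, mask):
    # Seamless cloning for grayscale float images of equal size.  Inside the
    # mask the result's Laplacian must match the source's Laplacian (gradient
    # cloning); outside the mask the target is kept and acts as the boundary.
    h, w = target.shape
    ys, xs = np.nonzero(mask)
    idx = -np.ones((h, w), dtype=int)
    idx[ys, xs] = np.arange(len(ys))

    A = sp.lil_matrix((len(ys), len(ys)))
    b = np.zeros(len(ys))
    for k, (y, x) in enumerate(zip(ys, xs)):
        A[k, k] = 4.0
        b[k] = 4.0 * source[y, x]                 # divergence of the pasted gradient
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            b[k] -= source[ny, nx]
            if mask[ny, nx]:
                A[k, idx[ny, nx]] = -1.0
            else:
                b[k] += target[ny, nx]            # known boundary value

    out = target.copy()
    out[ys, xs] = spsolve(A.tocsr(), b)
    return out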


Reduce big gradients

• Dynamic range compression

• Fattal et al. 2002

Seamless Image Stitching in the Gradient Domain

Anat Levin, Assaf Zomet, Shmuel Peleg, and Yair Weiss

http://www.cs.huji.ac.il/~alevin/papers/eccv04-blending.pdf http://eprints.pascal-network.org/archive/00001062/01/tips05- blending.pdf

• Various strategies (optimal cut, feathering)


Gradient tone mapping

• Fattal et al. Siggraph 2002

Slide from Siggraph 2005 by Raskar (Graphs by Fattal et al.)

Gradient attenuation

From Fattal et al.

Fattal et al. gradient tone mapping

Poisson Matting (Sun et al., SIGGRAPH 2004)

Assume the gradients of F and B are negligible, so the matte gradient follows the image gradient: ∇α ≈ ∇I / (F − B).

Plus various image-editing tools to refine the matte.


Interactive Local Adjustment of Tonal Values

Dani Lischinski, Zeev Farbman The Hebrew University

Matt Uyttendaele, Richard Szeliski Microsoft Research

Background (1)

In the darkroom, dodging and burning brushes, plus the camera shutter, were the only tools between the scene and the final photograph. But this workflow is tedious, time-consuming and painstaking!

Background (2)

• Digital tools offer a large arsenal of adjustment tools [Adobe Photoshop CS2, 2005]
• Hard to master these tools (to learn and to use)
• Tedious and time-consuming (requires professional ability and experienced skill; too many layer masks)
• Still incapable of meeting some requirements

Example in [Adobe Photoshop CS2, 2005]: original image, layer mask, and result.

Related Work: Tone Mapping Operators

• Global operators [Ward Larson et al. 1997; Reinhard et al. 2002; Drago et al. 2003]
– Usually fast

• Local operators [Fattal et al. 2002; Reinhard et al. 2002; Li et al. 2005] …
– Better at preserving local contrast
– Sometimes introduce visual artifacts

Limitations of Tone Mapping Operators

• Lack of direct local control
– Can't directly manipulate a particular region

• Not guaranteed to converge to a subjectively satisfactory result
– Involves several trial-and-error iterations
– Changes the entire image at each iteration

Algorithm Overview

1. Load a digital negative, a camera RAW file, an HDR radiance map, or an ordinary image
2. Indicate regions in the image that require adjustment
3. Experiment with the available adjustment parameters until a satisfactory result is obtained in the desired regions
4. Iterate steps 2 and 3 until the image is satisfactory

An Example

Region Selection: Strokes and Brushes

• Basic brush
– weight = 1 for the selected pixels under the brush; weight = 0 elsewhere
• Luminance brush
• Lumachrome brush (chromaticity)
• Over-exposure brush
• Under-exposure brush

Constraint Propagation

User strokes Adjusted exposure

Image-guided Energy Minimization

Minimize a data term plus a smoothing term:

f = argmin_f  Σx w(x) (f(x) − g(x))²  +  λ Σx h(∇f, ∇L)(x)

where L is the log-luminance channel, α is a sensitivity factor in the smoothing term, ε is a small constant preventing division by zero, and λ balances the two terms (defaults as in the paper).

Discretize with standard finite differences; the resulting sparse system is solved iteratively with preconditioned conjugate gradients (PCG) [Saad 2003], or with a fast approximate solution.

Interactive Local Adjustment of Tonal Values

f = argmin_f  Σx w(x) (f(x) − g(x))²  +  λ Σx h(∇f, ∇L)(x)
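For intuition, a rough SciPy sketch of this kind of image-guided propagation. The discretization and the default values of λ, α and ε below are my own illustrative choices, not the authors' code.

import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import spsolve

def propagate_adjustment(L, g, w, lam=0.2, alpha=1.0, eps=1e-4):
    # L: log-luminance (H x W); g: target adjustment values; w: constraint
    # weights (1 under the user's strokes, 0 elsewhere).  Returns the map f
    # minimizing  sum w (f - g)^2  +  lam * sum |grad f|^2 / (|grad L|^alpha + eps).
    H, W = L.shape
    n = H * W
    ind = np.arange(n).reshape(H, W)

    # horizontal and vertical neighbor pairs with edge-aware smoothness weights
    i = np.concatenate([ind[:, :-1].ravel(), ind[:-1, :].ravel()])
    j = np.concatenate([ind[:, 1:].ravel(),  ind[1:, :].ravel()])
    dL = np.concatenate([(L[:, 1:] - L[:, :-1]).ravel(), (L[1:, :] - L[:-1, :]).ravel()])
    s = lam / (np.abs(dL) ** alpha + eps)

    diag = w.ravel().astype(float).copy()    # data-term weights on the diagonal
    np.add.at(diag, i, s)
    np.add.at(diag, j, s)

    A = sp.coo_matrix(
        (np.concatenate([diag, -s, -s]),
         (np.concatenate([ind.ravel(), i, j]), np.concatenate([ind.ravel(), j, i]))),
        shape=(n, n)).tocsr()
    return spsolve(A, (w * g).ravel()).reshape(H, W)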

Results

Graph cut

• Interactive image segmentation using graph cut

• Binary label: foreground vs. background

• User labels some pixels

similar to trimap, usually sparser

• Exploit
– Statistics of the known Fg & Bg
– Smoothness of the labeling

• Turn into discrete graph optimization

Graph cut (min cut / max flow)

Energy function

Labeling: one value per pixel, F or B

Energy(labeling) = data + smoothness
(a very general formulation; this is what will be minimized)

Data term: for each pixel, the probability that its color belongs to F (resp. B); similar in spirit to Bayesian matting.

Smoothness term (aka regularization): for each neighboring pixel pair, a penalty for having different labels; the penalty is downweighted if the two pixel colors are very different; similar in spirit to the bilateral filter.

Data term

• A.k.a. regional term (because it is summed over the full region)

• D(L) = Σi −log h[Li](Ci)

where i is a pixel, Li is the label at i (F or B), Ci is the pixel value, and h[Li] is the histogram of the observed Fg (resp. Bg)

• Note the minus sign

Hard constraints

• The user has provided some labels
• The quick and dirty way to include constraints in the optimization is to replace the data term by a huge penalty when a user label is not respected:
– D(Li) = 0 if respected
– D(Li) = K if not respected, where K is huge (e.g. on the order of the number of pixels)

Smoothness term

• A.k.a. boundary term, a.k.a. regularization

• S(L) = Σ_{i,j} ∈ N  B(Ci, Cj) δ(Li − Lj)

• where i, j are neighbors (e.g. 8-neighborhood; 4-neighborhood shown for simplicity)
• δ(Li − Lj) is 0 if Li = Lj, 1 otherwise
• B(Ci, Cj) is high when Ci and Cj are similar, low if there is a discontinuity between those two pixels
e.g. exp(−||Ci − Cj||² / 2σ²) where σ can be a constant or the local variance
• Note the positive sign

Optimization

• E(L) = D(L) + λ S(L)
• λ is a black-magic constant
• Find the labeling that minimizes E
• In this case (a toy 3×3 image), how many possibilities? 2⁹ = 512 — we can try them all!
• What about megapixel images?
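For a toy 3×3 image, the exhaustive search mentioned above is easy to write down. In this sketch the data cost is a made-up stand-in for −log h[L](C), and λ, σ and the intensities are arbitrary illustrative values.

import itertools
import numpy as np

# Brute-force the energy E(L) = sum_i D(Li) + lam * sum_{i~j} B(Ci, Cj) [Li != Lj]
# on a toy 3x3 "image".
C = np.array([[0.9, 0.8, 0.2],
              [0.9, 0.3, 0.1],
              [0.8, 0.2, 0.1]])
lam, sigma = 0.5, 0.2

def D(c, label):                 # bright pixels are cheap to label F, dark ones B
    return (1.0 - c) if label == 'F' else c

def B(ci, cj):                   # discontinuity => low penalty for a label change
    return np.exp(-(ci - cj) ** 2 / (2 * sigma ** 2))

pixels = [(y, x) for y in range(3) for x in range(3)]
edges = [(a, b) for a in range(9) for b in range(9)
         if (pixels[b][0] - pixels[a][0], pixels[b][1] - pixels[a][1]) in ((0, 1), (1, 0))]

def energy(L):
    data = sum(D(C[p], L[k]) for k, p in enumerate(pixels))
    smooth = sum(B(C[pixels[a]], C[pixels[b]]) for a, b in edges if L[a] != L[b])
    return data + lam * smooth

best = min(itertools.product('FB', repeat=9), key=energy)   # all 2^9 = 512 labelings
print(np.array(best).reshape(3, 3))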

Labeling as a graph problem

• Each pixel = node

• Add two nodes F & B

• Labeling: link each pixel to either F or B

Data term

• Put one edge between each pixel and each of F & B
• Weight of edge = minus data term
– Don't forget the huge weight for hard constraints
– Careful with the sign

Smoothness term

• Add an edge between each neighbor pair

• Weight = smoothness term



Min cut

• Energy optimization equivalent to min cut

• Cut: remove edges to disconnect F from B

• Minimum: minimize the sum of the cut edge weights

Min cut ⇔ labeling

• In order to be a cut:
– For each pixel, either the F or the B edge has to be cut
• In order to be minimal:
– Only one terminal edge per pixel is cut (otherwise one of them could be put back, giving a cheaper cut)

Computing a multiway cut

• With 2 labels: classical min-cut problem

Solvable by standard flow algorithms

polynomial time in theory, nearly linear in practice

More than 2 terminals: NP-hard

[Dahlhaus et al., STOC ‘92]

• Efficient approximation algorithms exist

Within a factor of 2 of optimal

Computes local minimum in a strong sense

even very large moves will not improve the energy

Yuri Boykov, Olga Veksler and Ramin Zabih, Fast Approximate Energy Minimization via Graph Cuts, International Conference on Computer Vision, September 1999.
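In practice the min cut is computed with a max-flow solver. Below is a sketch using the third-party PyMaxflow package (assumed installed via pip install PyMaxflow); the constant smoothness weight and the likelihood inputs are illustrative simplifications.

import numpy as np
import maxflow   # third-party PyMaxflow package: pip install PyMaxflow

def segment(img, fg_prob, bg_prob, lam=1.0, eps=1e-6):
    # Binary F/B segmentation of a grayscale image by s-t min cut.
    # fg_prob, bg_prob: per-pixel likelihoods (same shape as img), e.g. from
    # histograms of user-scribbled pixels.
    g = maxflow.Graph[float]()
    nodes = g.add_grid_nodes(img.shape)

    # Smoothness term: constant-weight edges between 4-neighbors.  (A contrast-
    # sensitive B(Ci, Cj) would need per-edge capacities instead.)
    g.add_grid_edges(nodes, lam)

    # Data term: terminal edge capacities play the role of -log h[L](C).
    g.add_grid_tedges(nodes, -np.log(fg_prob + eps), -np.log(bg_prob + eps))

    g.maxflow()
    seg = g.get_grid_segments(nodes)   # boolean partition of the pixel grid
    return seg                         # one side is F, the other B; flip if needed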

Move examples

Starting point

Red-blue swap move

Green expansion move

GrabCut: Interactive Foreground Extraction using Iterated Graph Cuts

Carsten Rother, Vladimir Kolmogorov, Andrew Blake
Microsoft Research Cambridge, UK

Agrawala et al., Interactive Digital Photomontage, SIGGRAPH 2004

Graph cuts for segmentation and mosaicing: from a set of source images and the user's brush strokes, compute a labeling and the composite.

Applications: extended depth of field, relighting, …

Bilateral filtering

[Ben Weiss, Siggraph 2006]

Input (log intensity), Gaussian smoothing, and bilateral smoothing compared.

Image Denoising

Noisy image; naïve denoising with Gaussian blur; better denoising with an edge-preserving filter.

Goal: smoothing an image without blurring its edges.

A Wide Range of Options

• Diffusion, Bayesian, Wavelets…

– All have their pros and cons.

• Bilateral filter

– Not always the best result [Buades 05], but often good
– Easy to understand, adapt and set up

Start with Gaussian filtering

• Here, the input is a step function plus noise

J = f ⊗ I   (output = spatial kernel f convolved with input I)

Start with Gaussian filtering

• Spatial Gaussian f:  J = f ⊗ I
• The output is blurred

Gaussian filter as weighted average

• The weight of ξ depends on its distance to x:

J(x) = Σξ f(x, ξ) I(ξ)

The problem of edges

• Here, I(ξ) "pollutes" our estimate J(x): it is too different from I(x)

J(x) = Σξ f(x, ξ) I(ξ)

Principle of Bilateral filtering

[Tomasi and Manduchi 1998]

• Penalty g on the intensity difference:

J(x) = (1 / k(x)) Σξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

Bilateral filtering

[Tomasi and Manduchi 1998]

• Spatial Gaussian f
• Gaussian g on the intensity difference

J(x) = (1 / k(x)) Σξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

Normalization factor

[Tomasi and Manduchi 1998]

• k(x) = Σξ f(x, ξ) g(I(ξ) − I(x))

J(x) = (1 / k(x)) Σξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)

Bilateral filtering is non-linear

[Tomasi and Manduchi 1998]

• The weights are different for each output pixel:

J(x) = (1 / k(x)) Σξ f(x, ξ) g(I(ξ) − I(x)) I(ξ)
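A direct, unoptimized NumPy transcription of the formula above, for intuition rather than speed (the 3σ window radius is an arbitrary truncation choice):

import numpy as np

def bilateral_filter(I, sigma_s=3.0, sigma_r=0.1):
    # Brute-force bilateral filter of a grayscale float image (slow, for clarity):
    # J(x) = (1/k(x)) sum_xi f(x, xi) g(I(xi) - I(x)) I(xi)
    radius = int(3 * sigma_s)
    dy, dx = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    f = np.exp(-(dx ** 2 + dy ** 2) / (2 * sigma_s ** 2))   # spatial Gaussian

    P = np.pad(I, radius, mode='edge')
    J = np.empty_like(I)
    for y in range(I.shape[0]):
        for x in range(I.shape[1]):
            window = P[y:y + 2 * radius + 1, x:x + 2 * radius + 1]
            g = np.exp(-(window - I[y, x]) ** 2 / (2 * sigma_r ** 2))  # range Gaussian
            weights = f * g
            J[y, x] = np.sum(weights * window) / np.sum(weights)      # divide by k(x)
    return J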

Many Applications based on Bilateral Filter

Tone Mapping [Durand 02]

Virtual Video Exposure [Bennett 05]

And many others…

Flash / No-Flash [Eisemann 04, Petschnigg 04]

[Petschnigg 04]

Tone Management [Bae 06]

Advantages of Bilateral Filter

• Easy to understand

– Weighted mean of nearby pixels

• Easy to adapt

– Distance between pixel values

• Easy to set up

– Non-iterative

But Bilateral Filter is Nonlinear

• Slow but some accelerations exist:

– [Elad 02]: Gauss-Seidel iterations

• Only for many iterations

– [Durand 02, Weiss 06]: fast approximation

• No formal understanding of accuracy versus speed

• [Weiss 06]: Only box function as spatial kernel


A Fast Approximation of the Bilateral Filter using a Signal Processing

Approach

Sylvain Paris and Frédo Durand

Computer Science and Artificial Intelligence Laboratory Massachusetts Institute of Technology

Definition of Bilateral Filter

• [Smith 97, Tomasi 98]

• Smoothes an image and preserves edges

• Weighted average of neighbors

• Weights
– Gaussian on space distance
– Gaussian on range distance
– sum to 1

Contributions

• Link with linear filtering
• Fast and accurate approximation

Intuition on 1D Signal: Weighted Average of Neighbors

• Near and similar pixels have influence.
• Far pixels have no influence.
• Pixels with a different value have no influence.

Link with Linear Filtering
1. Handling the Division

Handle the division by the sum of weights using a projective space.

Formalization: Handling the Division

• Normalizing factor as homogeneous coordinate
• Multiply both sides by the normalizing factor
• Similar to homogeneous coordinates in projective space
• Division delayed until the end
• Next step: add a dimension to make a convolution appear (with Wq = 1)

Link with Linear Filtering
2. Introducing a Convolution

space: 1D Gaussian × range: 1D Gaussian → combination: 2D Gaussian

Corresponds to a 3D Gaussian on a 2D image.

Link with Linear Filtering
2. Introducing a Convolution

In the space-range domain (black = zero), summing all values multiplied by the space-range Gaussian kernel ⇒ a convolution; its result is the filtered higher-dimensional function.

Reformulation: Summary

Pipeline on higher-dimensional functions (the homogeneous channels wi and w): Gaussian convolution, then division, then slicing.

1. Convolution in higher dimension
• expensive but well understood (linear, FFT, etc.)
2. Division and slicing
• nonlinear but simple and pixel-wise

Exact reformulation.

The Gaussian convolution is a low-pass filter: its result contains almost only low frequencies; the high frequencies are negligible.

Fast Convolution by Downsampling

Downsample the higher-dimensional functions before the Gaussian convolution and upsample afterwards: almost no information loss.

• Downsampling cuts frequencies above the Nyquist limit
– Less data to process
– But induces error
• Evaluation of the approximation
– Precision versus running time
– Visual accuracy
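A compact NumPy/SciPy sketch of this downsampled space-range idea: splat the image into a coarse 3D grid holding the homogeneous channels (w·i, w), blur, then slice and divide. This is a simplified illustration of the approach, not the authors' code; sampling rates equal to the sigmas and nearest-neighbor splatting are simplifying assumptions.

import numpy as np
from scipy.ndimage import gaussian_filter, map_coordinates

def fast_bilateral(I, sigma_s=16.0, sigma_r=0.1):
    # Approximate bilateral filter of a grayscale float image via a downsampled
    # space-range grid: splat -> Gaussian blur -> slice -> divide.
    ss, sr = sigma_s, sigma_r            # also used as the sampling rates
    h, w = I.shape
    Imin = I.min()

    # grid coordinates of every pixel (with one cell of padding on each side)
    gy = np.arange(h)[:, None] / ss + 1.0
    gx = np.arange(w)[None, :] / ss + 1.0
    gz = (I - Imin) / sr + 1.0
    gy, gx = np.broadcast_arrays(gy, gx)

    shape = (int(gy.max()) + 2, int(gx.max()) + 2, int(gz.max()) + 2)
    wi = np.zeros(shape)                 # homogeneous channel: sum of w * I
    ww = np.zeros(shape)                 # homogeneous channel: sum of w

    iy, ix, iz = np.round(gy).astype(int), np.round(gx).astype(int), np.round(gz).astype(int)
    np.add.at(wi, (iy, ix, iz), I)       # nearest-neighbor "splatting"
    np.add.at(ww, (iy, ix, iz), 1.0)

    # Gaussian convolution in the downsampled space-range domain (sigma = 1 cell)
    wi = gaussian_filter(wi, 1.0)
    ww = gaussian_filter(ww, 1.0)

    # slicing: trilinear interpolation at each pixel's (y, x, intensity), then divide
    num = map_coordinates(wi, [gy, gx, gz], order=1)
    den = map_coordinates(ww, [gy, gx, gz], order=1)
    return num / np.maximum(den, 1e-10)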

Accuracy versus Running Time

• Finer sampling increases accuracy.
• More precise than previous work.

PSNR as a function of running time on a 1200 × 1600 digital photograph; a straightforward implementation takes over 10 minutes.

Visual Results

• Comparison with previous work [Durand 02] on a 1200 × 1600 input; running time = 1 s for both techniques.
• Shown: input, exact BF, our result, previous work, and the difference with the exact computation (intensities in [0:1], difference scale 0 to 0.1).

Discussion

• Higher dimension ⇒ advantageous formulation
– akin to Level Sets with topology
– our approach: isolate nonlinearities
– dimension increase largely offset by downsampling
• Space-range domain already appeared
– [Sochen 98, Barash 02]: image as an embedded manifold
– new in our approach: image as a dense function

Conclusions

Practical gain
• Interactive running time
• Visually similar results
• Simple to code (100 lines)

Theoretical gain
• Link with linear filters
• Separation linear/nonlinear
• Signal processing framework

Higher dimension ⇒ "better" computation

Two-scale Tone Management for Photographic Look

Soonmin Bae, Sylvain Paris, and Frédo Durand MIT CSAIL

Ansel Adams

Ansel Adams, Clearing Winter Storm

An Amateur Photographer

A Variety of Looks

Goals

• Control over photographic look

• Transfer “look” from a model photo

For example, we want this input photo with the look of that model photo.

Aspects of Photographic Look

• Subject choice
• Framing and composition ⇒ specified by the input photo
• Tone distribution and contrast ⇒ modified based on the model photo

Tonal Aspects of Look

• Global contrast: Ansel Adams (high global contrast) vs. Kenro Izu (low global contrast)
• Local contrast: Ansel Adams (variable amount of texture) vs. Kenro Izu (texture everywhere)

Overview

• Transfer the look between photographs (input image + model → result)
– Tonal aspects

Overview

• Separate global and local contrast:

Input Image → Split → global contrast + local contrast → careful combination → post-process → Result

Global vs. Local Contrast

• Naïve decomposition: low vs. high frequency (low frequency as global contrast, high frequency as local contrast)
– Problem: introduces blur & halos

Bilateral Filter

• Edge-preserving smoothing [Tomasi 98]
• We build upon tone mapping [Durand 02]

After bilateral filtering: the BASE layer (global contrast). Residual after filtering: the DETAIL layer (local contrast).

Pipeline so far: Input Image → Bilateral Filter → global contrast (base layer) + local contrast (detail layer) → careful combination → post-process → Result.

Global Contrast

• Intensity remapping of the base layer (input intensity → remapped intensity)

Global Contrast (Model Transfer)

• Histogram matching
– The remapping function is given by the input and model histograms (input base → output base, guided by the model base)
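Histogram matching itself is standard; here is a small NumPy sketch of a CDF-based remapping of the kind used for the global-contrast transfer (a generic implementation, not the authors' exact one):

import numpy as np

def histogram_match(source, model):
    # Monotonic remapping that gives `source` the intensity distribution of `model`
    # (e.g. applied to the base layers): compose the source CDF with the inverse
    # CDF of the model.
    s_vals, s_counts = np.unique(source.ravel(), return_counts=True)
    m_vals, m_counts = np.unique(model.ravel(), return_counts=True)
    s_cdf = np.cumsum(s_counts) / source.size
    m_cdf = np.cumsum(m_counts) / model.size
    matched_vals = np.interp(s_cdf, m_cdf, m_vals)   # same quantile in the model
    return np.interp(source, s_vals, matched_vals)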

Pipeline update: the global contrast (base layer) is now handled by intensity matching against the model.

Local Contrast: Detail Layer

• Uniform control: multiply all values in the detail layer (e.g. Base + 3 × Detail)
• But the amount of local contrast is not uniform (smooth regions vs. textured regions)

Local Contrast Variation

• We define "textureness": the amount of local contrast at each pixel, based on the surrounding region
– Smooth region ⇒ low textureness
– Textured region ⇒ high textureness

"Textureness": 1D Example

• Take the high frequency H of the input signal and its amplitude |H|
– Smooth region ⇒ small high frequency; textured region ⇒ large high frequency
• Previous work: low pass of |H| [Li 05, Su 05]
• Here: an edge-preserving filter of |H|

Textureness

Textureness Transfer

Step 1: histogram transfer from the model textureness to the input textureness, giving the desired textureness.

Step 2: scale the detail layer per pixel (e.g. ×0.5, ×2.7, ×4.3) to match the desired textureness.

Pipeline update: the local contrast (detail layer) is now handled by textureness matching against the model.

A Non-Perfect Result

• Decoupled and large modifications (up to 6×) ⇒ limited defects may appear (shown on an HDR input after global and local adjustments)

Intensity Remapping

• Some intensities may fall outside the displayable range.
⇒ Compress the histogram to fit the visible range.

Preserving Details

1. In the gradient domain:
– Compare the gradient amplitudes of the input and the current result
– Prevent extreme reduction & extreme increase
2. Solve the Poisson equation.

Effect of Detail Preservation

Uncorrected result vs. corrected result.

Pipeline update: the combination step is a constrained Poisson reconstruction, followed by post-processing.

Additional Effects

• Soft focus (high-frequency manipulation)
• Film grain (texture synthesis [Heeger 95])
• Color toning (chrominance = f(luminance))

Recap

Full pipeline: Input Image → Bilateral Filter → global contrast (intensity matching) + local contrast (textureness matching) → constrained Poisson combination → post-process (soft focus, toning, grain) → Result.

Results

The user provides input and model photographs; the system then automatically produces the result.

Running times:
– 6 seconds for 1 MPixel or less
– 23 seconds for 4 MPixels
(multi-grid Poisson solver and fast bilateral filter [Paris 06])

Comparison with Naïve Histogram Matching

• Model: Snapshot, Alfred Stieglitz — with naïve histogram matching, local contrast and sharpness are unfaithful; our result preserves them.
• Model: Clearing Winter Storm, Ansel Adams — with naïve histogram matching, the local contrast is too low.

Color Images

• Lab color space: modify only the luminance channel
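A tiny sketch of luminance-only editing, assuming scikit-image is available for the Lab conversion; the example tone curve is arbitrary.

import numpy as np
from skimage import color   # assumes scikit-image is available

def adjust_luminance_only(rgb, tone_map):
    # Work in Lab: apply the tonal adjustment to L only, keep chromaticity (a, b).
    lab = color.rgb2lab(rgb)
    lab[..., 0] = np.clip(tone_map(lab[..., 0]), 0.0, 100.0)   # L lives in [0, 100]
    return np.clip(color.lab2rgb(lab), 0.0, 1.0)

# Example usage (img is an RGB float image in [0, 1]):
# out = adjust_luminance_only(img, lambda L: (L - 50.0) * 1.2 + 50.0)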


Limitations

• Noise and JPEG artifacts: defects get amplified
• Can lead to unexpected results if the image content is too different from the model
– Portraits, in particular, can suffer

Conclusions

• Transfer "look" from a model photo
• Two-scale tone management
– Global and local contrast
– New edge-preserving textureness
– Constrained Poisson reconstruction
– Additional effects

References

• Patrick Pérez, Michel Gangnet, Andrew Blake. Poisson Image Editing. SIGGRAPH 2003.
• Dani Lischinski, Zeev Farbman, Matt Uyttendaele, Richard Szeliski. Interactive Local Adjustment of Tonal Values. SIGGRAPH 2006.
• Carsten Rother, Vladimir Kolmogorov, Andrew Blake. GrabCut: Interactive Foreground Extraction Using Iterated Graph Cuts. SIGGRAPH 2004.
• Aseem Agarwala, Mira Dontcheva, Maneesh Agrawala, Steven Drucker, Alex Colburn, Brian Curless, David H. Salesin, Michael F. Cohen. Interactive Digital Photomontage. SIGGRAPH 2004.
• Sylvain Paris, Frédo Durand. A Fast Approximation of the Bilateral Filter using a Signal Processing Approach. ECCV 2006.
• Soonmin Bae, Sylvain Paris, Frédo Durand. Two-scale Tone Management for Photographic Look. SIGGRAPH 2006.
• Johannes Kopf, Michael Cohen, Dani Lischinski, Matt Uyttendaele. Joint Bilateral Upsampling. SIGGRAPH 2007.
• Jiawen Chen, Sylvain Paris, Frédo Durand. Real-Time Edge-Aware Image Processing with the Bilateral Grid. SIGGRAPH 2007.
