Cameras
Digital Visual Effects, Spring 2006 Yung-Yu Chuang
2006/3/1
with slides by Frédo Durand, Brian Curless, Steve Seitz and Alexei Efros
Outline
• Pinhole camera
• Film camera
• Digital camera
• Video camera
• High dynamic range imaging
Camera trial #1
scene film
Put a piece of film in front of an object.
Pinhole camera
scene film
Add a barrier to block off most of the rays.
• It reduces blurring
• The pinhole is known as the aperture
• The image is inverted
barrier
pinhole camera
Shrinking the aperture
Why not make the aperture as small as possible?
• Less light gets through
• Diffraction effect
Shrinking the aperture
High-end commercial pinhole cameras
$200–$700
Adding a lens
scene lens film
“circle of confusion”
A lens focuses light onto the film
• There is a specific distance at which objects are “in focus”
• Other points project to a “circle of confusion” in the image
Lenses
Thin lens equation: 1/d_o + 1/d_i = 1/f, where d_o is the object distance, d_i the image distance, and f the focal length
• Any object point satisfying this equation is in focus
• Thin lens applet:
http://www.phy.ntnu.edu.tw/java/Lens/lens_e.html
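As a quick sketch (Python; the function name is illustrative), the thin lens equation gives the image distance at which an object at a given distance comes into focus:

```python
# Thin lens equation: 1/d_o + 1/d_i = 1/f.
# Given focal length f and object distance d_o (both in mm),
# solve for the image distance d_i at which the object is in focus.
def image_distance(f_mm, d_o_mm):
    if d_o_mm <= f_mm:
        raise ValueError("object inside focal length: no real image")
    return 1.0 / (1.0 / f_mm - 1.0 / d_o_mm)

# A 50mm lens focused on an object 2m away forms the image
# slightly more than 50mm behind the lens; at infinity, d_i -> f.
d_i = image_distance(50.0, 2000.0)
```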
Exposure = aperture + shutter speed
• Aperture of diameter D restricts the range of rays (aperture may be on either side of the lens)
• Shutter speed is the amount of time that light is allowed to pass through the aperture
Exposure
• Two main parameters:
– Aperture (in f stop)
– Shutter speed (in fraction of a second)
Effect of shutter speed
• Longer shutter speed => more light, but more motion blur
• Faster shutter speed freezes motion
Aperture
• Aperture is the diameter of the lens opening, usually specified by f-stop: f/N means a diameter equal to the focal length divided by N.
– f/2.0 on a 50mm lens means the aperture is 25mm
– f/2.0 on a 100mm lens means the aperture is 50mm
• Each full f-stop change doubles or halves the light admitted.
• Lower f-stop, more light (larger lens opening)
• Higher f-stop, less light (smaller lens opening)
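The f-stop arithmetic above can be sketched as (Python; names are illustrative):

```python
import math

# The f-stop N specifies the aperture diameter as D = f / N.
def aperture_diameter(focal_length_mm, f_stop):
    return focal_length_mm / f_stop

# f/2.0 on a 50mm lens -> 25mm; on a 100mm lens -> 50mm.
assert aperture_diameter(50, 2.0) == 25.0
assert aperture_diameter(100, 2.0) == 50.0

# One full stop changes the diameter by sqrt(2), which halves or
# doubles the aperture *area* and hence the light admitted.
area = lambda d: math.pi * (d / 2.0) ** 2
assert abs(area(25.0) / area(25.0 / math.sqrt(2)) - 2.0) < 1e-9
```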
Depth of field
Changing the aperture size affects depth of field.
A smaller aperture increases the range in which the object is approximately in focus
See http://www.photonhead.com/simcam/
Exposure & metering
• The camera metering system measures how bright the scene is
• In Aperture-priority mode, the photographer sets the aperture and the camera sets the shutter speed
• In Shutter-speed-priority mode, the photographer sets the shutter speed and the camera deduces the aperture
• In Program mode, the camera decides both aperture and shutter speed (middle values, more or less)
• In Manual mode, the user decides everything (but can get feedback)
Pros and cons of various modes
• Aperture priority
– Direct depth of field control
– Cons: can require impossible shutter speed (e.g. with f/1.4 for a bright scene)
• Shutter speed priority
– Direct motion blur control
– Cons: can require impossible aperture (e.g. when requesting a 1/1000 speed for a dark scene)
• Note that aperture is somewhat more restricted
• Program
– Almost no control, but no need for neurons
• Manual
– Full control, but takes more time and thinking
Distortion
• Radial distortion of the image
– Caused by imperfect lenses
– Deviations are most noticeable for rays that pass through the edge of the lens
No distortion Pin cushion Barrel
Correcting radial distortion
from Helmut Dersch
Film camera
scene lens & film
motor
aperture
& shutter
Digital camera
scene sensor
array lens &
motor
aperture
& shutter
• A digital camera replaces film with a sensor array
• Each cell in the array is a light-sensitive diode that converts photons to electrons
CCD vs. CMOS
• CCD is less susceptible to noise (special process, higher fill factor)
• CMOS is more flexible, less expensive (standard process), and consumes less power
CCD CMOS
Sensor noise
• Blooming
• Diffusion
• Dark current
• Photon shot noise
• Amplifier readout noise
SLR (Single-Lens Reflex)
• Reflex (R in SLR) means that we see through the same lens used to take the image.
• Not the case for compact cameras
SLR view finder
lens
Mirror
(when viewing) Mirror
(flipped for exposure)
Film/sensor Prism
Your eye
Light from scene
Color
So far, we’ve only talked about monochrome
sensors. Color imaging has been implemented in a number of ways:
• Field sequential
• Multi-chip
• Color filter array
• X3 sensor
Field sequential
Prokudin-Gorskii (early 1900’s)
Lantern projector
http://www.loc.gov/exhibits/empire/
Prokudin-Gorskii (early 1900’s)
Multi-chip
wavelength dependent
Embedded color filters
Color filters can be manufactured directly onto the photodetectors.
Color filter array
Color filter arrays (CFAs)/color filter mosaics Kodak DCS620x
Color filter array
Color filter arrays (CFAs)/color filter mosaics Bayer pattern
Bayer’s pattern
Demosaicking CFAs
bilinear interpolation
original input linear interpolation
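A minimal sketch of bilinear demosaicking for an RGGB Bayer mosaic (Python with NumPy/SciPy; assumes a float image, and the function name is illustrative):

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(raw):
    """Bilinear demosaicking of an RGGB Bayer mosaic (float image)."""
    h, w = raw.shape
    # Binary masks marking which sensor sites carry each color.
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1
    g_mask = 1 - r_mask - b_mask
    # Interpolation kernels: green sits on a checkerboard (average of
    # 4 cross neighbors); red/blue sit on a quincunx grid.
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0
    r = convolve(raw * r_mask, k_rb, mode='mirror')
    g = convolve(raw * g_mask, k_g,  mode='mirror')
    b = convolve(raw * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```

At measured sites the kernels reproduce the sample exactly; elsewhere they average the nearest samples of the missing color.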
Demosaicking CFAs
Constant hue-based interpolation (Cok)
Hue:
Interpolate G first
Demosaicking CFAs
Median-based interpolation (Freeman)
1. Linear interpolation 2. Median filter on color
differences
Demosaicking CFAs
Median-based interpolation (Freeman)
original input linear interpolation
color difference median filter reconstruction
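The median-based idea can be sketched as follows (a simplified Python version: the full method also restores the measured samples at each site; the function name is illustrative):

```python
import numpy as np
from scipy.ndimage import median_filter

def freeman_refine(rgb, size=5):
    """Median-based refinement (after Freeman): keep the linearly
    interpolated G channel, median-filter the R-G and B-G color
    differences, and rebuild R and B from G plus the smoothed
    differences.  This suppresses zipper/fringe artifacts."""
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    rg = median_filter(r - g, size=size)  # smoothed R-G difference
    bg = median_filter(b - g, size=size)  # smoothed B-G difference
    return np.dstack([g + rg, g, g + bg])
```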
Demosaicking CFAs
Gradient-based interpolation (LaRoche-Prescott)
1. Interpolation on G
Demosaicking CFAs
Gradient-based interpolation (LaRoche-Prescott)
2. Interpolation of color differences
Demosaicking CFAs
bilinear Cok Freeman LaRoche
Demosaicking CFAs
Generally, Freeman’s is the best, especially for natural images.
Foveon X3 sensor
• light penetrates to different depths for different wavelengths
• multilayer CMOS sensor gets 3 different spectral sensitivities
Color filter array
red green blue output
X3 technology
red green blue output
Foveon X3 sensor
X3 sensor Bayer CFA
Cameras with X3
Sigma SD10, SD9 Polaroid X530
Sigma SD9 vs Canon D30
Color processing
• After color values are recorded, more color processing usually happens:
– White balance
– Non-linearity to approximate film response or match TV monitor gamma
White Balance
automatic white balance warmer +3
Manual white balance
white balance with the white book
white balance with the red book
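Automatic white balance can be approximated in a few lines; below is a gray-world sketch (a stand-in for the camera's actual algorithm, which is proprietary and metering-based — the assumption here is that the scene averages to gray):

```python
import numpy as np

def gray_world_white_balance(img):
    """Gray-world white balance: scale each channel so the channel
    means become equal, assuming the scene averages to gray.
    img: (H, W, 3) array; returns a float image clipped to [0, 255]."""
    img = img.astype(float)
    means = img.reshape(-1, 3).mean(axis=0)
    gains = means.mean() / means          # per-channel gain
    return np.clip(img * gains, 0.0, 255.0)
```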
Autofocus
• Active
– Sonar
– Infrared
• Passive
Digital camera review website
• http://www.dpreview.com/
• A cool video of digital camera illustration
Camcorder
Interlacing
with interlacing without interlacing
Deinterlacing
blend weave
Deinterlacing
Discard
(even field only or odd field only)
Progressive scan
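The three strategies can be sketched as (Python; the field arrays are illustrative, one row per scanline of a field):

```python
import numpy as np

def deinterlace_weave(even_field, odd_field):
    """Weave: interleave the two fields into one frame. Best for
    static scenes; moving edges show combing artifacts."""
    h, w = even_field.shape
    frame = np.empty((2 * h, w), dtype=even_field.dtype)
    frame[0::2] = even_field
    frame[1::2] = odd_field
    return frame

def deinterlace_blend(even_field, odd_field):
    """Blend: average the two fields, trading combing for blur."""
    return (even_field.astype(float) + odd_field) / 2.0

def deinterlace_discard(field):
    """Discard: keep one field (even only or odd only) and
    line-double it, halving vertical resolution."""
    return np.repeat(field, 2, axis=0)
```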
Hard cases
High dynamic range imaging
Camera pipeline
High dynamic range image
Short exposure
[Figure: real-world radiance spans roughly 10^-6 to 10^6; a short exposure maps only the bright portion of this dynamic range to picture intensity, pixel values 0 to 255]
Long exposure
[Figure: the same radiance axis, 10^-6 to 10^6; a long exposure maps only the dark portion of the dynamic range to picture intensity, pixel values 0 to 255]
Real-world response functions
Camera calibration
• Geometric
– How pixel coordinates relate to directions in the world
• Photometric
– How pixel values relate to radiance amounts in the world
Camera is not a photometer
• Limited dynamic range
⇒ Perhaps use multiple exposures?
• Unknown, nonlinear response
⇒ Not possible to convert pixel values to radiance
• Solution:
– Recover response curve from multiple exposures, then reconstruct the radiance map
Varying exposure
• Ways to change exposure
– Shutter speed
– Aperture
– Neutral density filters
Shutter speed
• Note: shutter times usually obey a power series – each “stop” is a factor of 2
• ¼, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000 sec
Usually really is:
¼, 1/8, 1/16, 1/32, 1/64, 1/128, 1/256, 1/512, 1/1024 sec
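The doubling series above is simple to check (the gap at 1/15 vs. 1/16 is the largest rounding in the nominal markings, about 7%):

```python
# Nominal shutter speeds vs. the true power-of-two series:
# each "stop" halves the exposure time.
nominal = [1/4, 1/8, 1/15, 1/30, 1/60, 1/125, 1/250, 1/500, 1/1000]
actual = [1/4 / 2**i for i in range(9)]  # 1/4, 1/8, 1/16, ..., 1/1024
# The marked values are rounded for convenience but stay within
# about 7% of the exact doubling series.
assert all(abs(n / a - 1) < 0.07 for n, a in zip(nominal, actual))
```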
Varying shutter speeds
Math for recovering response curve
Idea behind the math
Recovering response curve
• The solution is determined only up to a scale, so add a constraint (e.g. fix g at the middle pixel value to 0)
• Add a hat weighting function to favor well-exposed pixels
Recovering response curve
• We want enough samples: N pixel locations over P exposures should satisfy N(P−1) > 256; if P = 11, N ≈ 50 suffices
• We want the selected pixels well distributed and sampled from constant regions; Debevec and Malik pick points by hand
• It is an overdetermined system of linear equations and can be solved using SVD
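The linear system can be sketched in Python (a hedged port of the gsolve construction from Debevec and Malik; `lam` is an assumed smoothness weight, and ordinary least squares stands in for SVD):

```python
import numpy as np

def gsolve(Z, log_dt, lam=10.0):
    """Recover the log response g (one value per pixel level) and the
    log radiances lnE, following the Debevec-Malik least-squares setup.
    Z: (N, P) int pixel values for N locations under P exposures.
    log_dt: (P,) log shutter times.  lam: smoothness weight (assumed)."""
    n = 256
    N, P = Z.shape
    w = np.minimum(np.arange(n), 255 - np.arange(n)).astype(float)  # hat
    A = np.zeros((N * P + n - 1, n + N))
    b = np.zeros(A.shape[0])
    k = 0
    for i in range(N):                    # data term: g(Z) - lnE = ln dt
        for j in range(P):
            wij = w[Z[i, j]]
            A[k, Z[i, j]] = wij
            A[k, n + i] = -wij
            b[k] = wij * log_dt[j]
            k += 1
    A[k, 128] = 1.0                       # fix the scale: g(128) = 0
    k += 1
    for z in range(1, n - 1):             # smoothness: small g''
        A[k, z - 1:z + 2] = lam * w[z] * np.array([1.0, -2.0, 1.0])
        k += 1
    x = np.linalg.lstsq(A, b, rcond=None)[0]
    return x[:n], x[n:]                   # g, lnE
```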
Matlab code
Recovered response function
Constructing HDR radiance map
combine pixels to reduce noise and obtain a more reliable estimation
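The weighted combination can be sketched as (a hedged Python version of the radiance-map formula; `g` is the recovered log response, `log_dt` the log shutter times):

```python
import numpy as np

def radiance_map(images, log_dt, g):
    """Combine P exposures into a log-radiance map: a hat-weighted
    average of g(Z) - ln(dt) over the exposures, so well-exposed
    pixels dominate the estimate."""
    w = np.minimum(np.arange(256), 255 - np.arange(256)).astype(float)
    num = np.zeros(images[0].shape, dtype=float)
    den = np.zeros(images[0].shape, dtype=float)
    for img, ldt in zip(images, log_dt):
        wi = w[img]                       # weight of each pixel value
        num += wi * (g[img] - ldt)        # per-exposure log radiance
        den += wi
    return num / np.maximum(den, 1e-8)    # lnE
```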
Reconstructed radiance map
What is this for?
• Human perception
• Vision/graphics applications
Easier HDR reconstruction
raw image =
12-bit CCD snapshot
Easier HDR reconstruction
exposure = radiance × Δt
Portable floatMap (.pfm)
• 12 bytes per pixel, 4 for each channel
sign | exponent | mantissa
• Text header similar to Jeff Poskanzer’s .ppm image format:
PF
768 512 1
<binary image data>
• Floating Point TIFF is similar
Radiance format (.pic, .hdr, .rad)
• 32 bits / pixel: Red | Green | Blue | Exponent (one shared exponent)
(145, 215, 87, 149) =
(145, 215, 87) * 2^(149-128) = (1190000, 1760000, 713000)
(145, 215, 87, 103) =
(145, 215, 87) * 2^(103-128) = (0.00000432, 0.00000641, 0.00000259)
Ward, Greg. “Real Pixels,” in Graphics Gems IV, edited by James Arvo, Academic Press, 1994.
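A hedged sketch of the shared-exponent encode/decode (using the mantissa/256 × 2^(E−128) convention from Ward's description; function names are illustrative):

```python
import math

def rgbe_encode(r, g, b):
    """Encode linear RGB into Radiance's shared-exponent RGBE bytes:
    each channel is stored as mantissa/256 * 2**(E - 128)."""
    m = max(r, g, b)
    if m < 1e-38:                 # too dim to represent: all zeros
        return (0, 0, 0, 0)
    frac, e = math.frexp(m)       # m = frac * 2**e with 0.5 <= frac < 1
    scale = 256.0 * 2.0 ** (-e)
    return (int(r * scale), int(g * scale), int(b * scale), e + 128)

def rgbe_decode(r, g, b, e):
    """Decode RGBE bytes back to floating-point RGB."""
    if e == 0:
        return (0.0, 0.0, 0.0)
    f = 2.0 ** (e - 128) / 256.0
    return (r * f, g * f, b * f)
```

A round trip loses under ~1% per channel, which is why 8-bit mantissas with a shared exponent suffice for radiance maps.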
ILM’s OpenEXR (.exr)
• 6 bytes per pixel, 2 for each channel, compressed
sign exponent mantissa
• Several lossless compression options, 2:1 typical
• Compatible with the “half” datatype in NVidia's Cg
• Supported natively on GeForce FX and Quadro FX
• Available at http://www.openexr.net/
Radiometric self calibration
• Assume that any response function can be modeled as a high-order polynomial
Space of response curves
Assorted pixel
Assignment #1: HDR image assembly
• If you have not subscribed to the mailing list, please do so.
• Details will be announced around Friday through the mailing list.
• You will use a tripod to take multiple photos with different shutter speeds, then write a program to recover the response curve and the radiance map. We will provide an image I/O library. Furthermore, apply some tone-mapping operation to your photograph.
References
• http://www.howstuffworks.com/digital-camera.htm
• http://electronics.howstuffworks.com/autofocus.htm
• Ramanath, Snyder, Bilbro, and Sander. “Demosaicking Methods for Bayer Color Arrays,” Journal of Electronic Imaging, 11(3), pp. 306-315, 2002.
• Paul E. Debevec, Jitendra Malik, Recovering High Dynamic Range Radiance Maps from Photographs, SIGGRAPH 1997.
• http://www.worldatwar.org/photos/whitebalance/index.mhtml
• http://www.100fps.com/