
In this study, using a hybrid image paradigm and face images constructed under different illumination conditions, we demonstrated the role of asymmetric low spatial frequency information in face discrimination. The asymmetric low spatial frequency information has a strong influence on face perception, as illustrated by its ability to change the perceived gender of the hybrid faces, and this effect increased as the lighting direction shifted sideward. In contrast, the symmetric low spatial frequency information had little, if any, effect on face discrimination. In addition, the asymmetric low spatial frequency information also affected the perceived depth of a face; again, this effect was more pronounced as the lighting direction shifted sideward. Hence, these results are consistent with the notion that a sideward shift of the lighting direction provides more shading information, which increases the perceived depth obtained from shape-from-shading and in turn improves observers' ability to discriminate faces.


List of Figures


Figure 1. The symmetric algorithm helps to dissociate the contribution of surface albedo from the illumination component of an image.

A, LA, the asymmetric component of the illumination of a face image. Through the process of shape-from-shading, LA provides 3D information about the face in a 2D image.

B, CS, the surface albedo. A face containing only CS looks very flat, and it is next to impossible to identify the person from CS alone. C, The original image. D, The residual information. E, When LA and CS are put back together, it is easy to see that the hybrid image and the original are pictures of the same person, even though a considerable amount of information (D) was discarded.
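To make the decomposition concrete, the sketch below splits an image into left-right symmetric and asymmetric parts and into low and high spatial frequency bands, keeping the asymmetric low-frequency part as LA and the symmetric high-frequency part as CS (following the asymLSF/symHSF terminology used for the stimuli in Figure 2) and treating everything else as the discarded residual D. This is only a minimal sketch under assumptions that are not from the original study: the face is taken to be aligned so that its midline falls on the central pixel column, and a Gaussian blur stands in for the actual low-pass filter; the study's own image processing was carried out in MATLAB and may differ in detail.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def decompose_face(image, sigma=8.0):
    """Illustrative split of a face image into LA, CS, and residual D.

    Assumptions (not from the original study): the face is aligned so the
    midline is the central column, and a Gaussian blur with the given sigma
    serves as the low-pass filter.
    """
    image = np.asarray(image, dtype=float)
    mirrored = image[:, ::-1]                      # flip about the vertical midline
    symmetric = (image + mirrored) / 2.0           # left-right symmetric part
    asymmetric = (image - mirrored) / 2.0          # left-right asymmetric part

    low_sym = gaussian_filter(symmetric, sigma)    # low spatial frequencies
    low_asym = gaussian_filter(asymmetric, sigma)
    high_sym = symmetric - low_sym                 # high spatial frequencies
    high_asym = asymmetric - low_asym

    LA = low_asym             # asymmetric low spatial frequency (illumination-like)
    CS = high_sym             # symmetric high spatial frequency (albedo-like detail)
    D = low_sym + high_asym   # residual information that is discarded
    return LA, CS, D

# LA + CS + D reconstructs the original image exactly; the hybrid in panel E
# keeps only LA + CS and throws D away.
```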

29

Figure 2. Examples of stimuli.

A, We generated a series of intermediate facial images by morphing the symmetric components of high spatial frequency (symHSF) between the two faces. There were seven morphed images, with the proportion of femininity ranging from 0 (male) to 1 (female). B, Examples of female asymLSF and symLSF images. There were four lighting directions: 0°, 15°, 30°, and 60°. The original female face was low-pass filtered to preserve coarse-scale shading information. C, Each hybrid face stimulus was a combination of the female (or male, not shown) asymLSF at one lighting direction and one of the morphed faces. Shown here are examples built with the 0.5-femininity symHSF face. The symHSF face alone looks very flat, and it is hard to judge its gender. When combined with a female asymLSF face, for example the one with the 60° lighting direction, the resulting hybrid face was categorized as female by most observers.
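As a rough illustration of how such a hybrid stimulus could be assembled, the sketch below blends the male and female symHSF images to a given proportion of femininity and adds an asymLSF image. Everything here is hypothetical: a plain pixel-wise cross-fade stands in for the warp-based morphing the study performed in FantaMorph, and decompose_face from the previous sketch stands in for the actual filtering pipeline.

```python
import numpy as np

def morph_symHSF(symHSF_male, symHSF_female, femininity):
    """Crude pixel-wise cross-fade between male and female symHSF images.
    The study used FantaMorph warping; this linear blend is only a stand-in."""
    return (1.0 - femininity) * symHSF_male + femininity * symHSF_female

def make_hybrid(asymLSF, symHSF_morph):
    """Combine an asymmetric low-frequency image with a morphed symHSF image."""
    return asymLSF + symHSF_morph

# Hypothetical usage: seven morph levels from 0 (male) to 1 (female), each
# combined with a female asymLSF image rendered under one lighting direction.
levels = np.linspace(0.0, 1.0, 7)
# hybrids = [make_hybrid(asymLSF_female_60deg, morph_symHSF(male_symHSF, female_symHSF, p))
#            for p in levels]
```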

[Figure 3 panels: fraction of female responses for observers CIT, CAE, and CAI, plotted against the proportion of femininity of the symHSF face; legend: male asymLSF + symHSF, symHSF only; panel G ordinate: PSE shift from the symHSF condition (proportion of femininity).]

Figure 3. The effect of asymLSF on face discrimination.

A, B, C, D, E, and F, Psychometric functions from three naïve observers. The fraction of female responses is plotted as a function of the proportion of femininity of the symHSF face.

The black psychometric curve is from the symHSF-only condition, shown for comparison. The red, green, blue, and magenta psychometric curves are the results for the asymmetric low spatial frequency conditions at the four lighting directions.

Under the female asymLSF conditions, as the lighting direction shifted laterally, the observers saw female faces more frequently and the psychometric curves shifted gradually to the left. In contrast, under the male asymLSF conditions, the psychometric curves shifted dramatically to the right and saturated under the 30° and 60° illumination conditions, especially in B and F; these two observers tended to see male faces at every level of femininity. G, Averaged PSE shift from the symHSF-only condition produced by the asymmetric low spatial frequency facial information. The error bars represent one standard error. The p-value shown for each lighting direction was calculated from a two-tailed paired t test. The averaged PSE shifts in the male 30° and 60° illumination conditions were beyond the measurable range and are plotted as arrows.
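The PSE shifts plotted in panel G can be illustrated with a short fitting sketch. This is only a sketch under stated assumptions: the response data below are hypothetical, a two-parameter logistic fitted with SciPy's curve_fit stands in for whatever psychometric fitting procedure was actually used, the PSE is read off as the femininity level at which the fitted curve crosses 0.5, and the PSE shift is the difference from the symHSF-only baseline.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(x, pse, slope):
    """Two-parameter logistic psychometric function; the PSE is the 50% point."""
    return 1.0 / (1.0 + np.exp(-slope * (x - pse)))

def fit_pse(femininity_levels, fraction_female):
    """Fit the psychometric function and return the PSE (the proportion of
    femininity at which 'female' responses reach 0.5)."""
    params, _ = curve_fit(logistic, femininity_levels, fraction_female,
                          p0=[0.5, 10.0], maxfev=10000)
    return params[0]

# Hypothetical data for one observer: seven morph levels and the fraction of
# "female" responses in the symHSF-only condition and in one asymLSF condition.
levels = np.linspace(0.0, 1.0, 7)
baseline = np.array([0.05, 0.10, 0.25, 0.50, 0.75, 0.90, 0.95])     # symHSF only
female_asym = np.array([0.20, 0.40, 0.65, 0.85, 0.95, 1.00, 1.00])  # e.g. a female asymLSF condition

pse_shift = fit_pse(levels, female_asym) - fit_pse(levels, baseline)
print(f"PSE shift: {pse_shift:+.3f} (negative = curve shifted left, i.e. more 'female' responses)")
```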

[Figure 4 panels: fraction of female responses plotted against the proportion of femininity of the symHSF face (e.g., observer CIT); panel D ordinate: PSE shift from the symHSF condition (proportion of femininity), by type of LSF information.]

Figure 4. Comparisons of the effect of asymLSF and symLSF on face discrimination.

A, B, C, Psychometric functions from three naïve observers. The fraction of female responses is plotted as a function of the proportion of femininity. The black psychometric curve is from the symHSF-only condition for the corresponding observer. The cyan psychometric curves are the results for symLSF + symHSF in the female and male conditions. The magenta curves are the results for asymLSF + symHSF faces. The results show that the symLSF had very little influence on face discrimination: the psychometric curves were not significantly shifted. D, The average PSE shift relative to the symHSF-only condition. The p-values denote the results of the paired t tests. The averaged PSE shift in the male 60° illumination condition was beyond the measurable range and is plotted as an arrow.
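The paired t tests reported here (and in panel G of Figure 3) compare each observer's PSE under a given LSF condition with the same observer's PSE in the symHSF-only condition. A minimal sketch with hypothetical PSE values is shown below; scipy.stats.ttest_rel performs the two-tailed paired t test.

```python
import numpy as np
from scipy.stats import ttest_rel

# Hypothetical PSEs (proportion of femininity) for seven observers:
# the symHSF-only baseline vs. one asymLSF lighting condition.
pse_baseline = np.array([0.48, 0.52, 0.50, 0.47, 0.55, 0.51, 0.49])
pse_asymLSF = np.array([0.31, 0.38, 0.29, 0.35, 0.40, 0.33, 0.30])

t_stat, p_value = ttest_rel(pse_asymLSF, pse_baseline)   # two-tailed by default
print(f"mean PSE shift = {np.mean(pse_asymLSF - pse_baseline):+.3f}, "
      f"t({len(pse_baseline) - 1}) = {t_stat:.2f}, p = {p_value:.4f}")
```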


Figure 5. Results of depth judgment.

A, female condition. B, male condition. The depth rating values of the hybrid faces: asymLSF (under four lighting directions) combined with symHSF, symLSF combined with symHSF (under 60°), and the female and male symHSF alone. The depth value of the reference sphere was set to 3. Using least-squares analysis, the linear regression lines for the asymLSF conditions of both genders show a positive increase with lighting direction. The slope parameters were significantly larger than zero (t(2) = 10.67, p = .0086 for the


Figure 6. Face discrimination performance is highly correlated with perceived depth.

A, female condition; B, male condition. The Pearson correlation coefficient between the depth rating value and the averaged PSE shift is shown for each condition. The averaged PSE shifts in the male 30° and 60° illumination conditions were beyond the measurable range and are plotted as arrows.

[Figure 6 axes: depth rating (abscissa) vs. PSE shift from the symHSF condition, in proportion of femininity (ordinate); regression line shown in the figure: Y = 0.37 − 0.167X; points beyond the measurable range are marked "PSE beyond measurable range".]
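The correlation in Figure 6 relates, for each lighting condition, the mean depth rating to the mean PSE shift. A minimal sketch with hypothetical values is shown below; scipy.stats.pearsonr returns the Pearson coefficient and its two-tailed p-value, and scipy.stats.linregress gives a least-squares regression line of the kind drawn in the figure.

```python
import numpy as np
from scipy.stats import pearsonr, linregress

# Hypothetical condition means: depth ratings (reference sphere = 3) and PSE
# shifts (proportion of femininity) for the 0°, 15°, 30°, and 60° conditions.
depth_rating = np.array([2.2, 2.8, 3.4, 4.1])
pse_shift = np.array([-0.05, -0.12, -0.20, -0.31])

r, p = pearsonr(depth_rating, pse_shift)
fit = linregress(depth_rating, pse_shift)   # least-squares line, Y = intercept + slope * X
print(f"r = {r:.2f}, p = {p:.3f}; regression: Y = {fit.intercept:.2f} + ({fit.slope:.3f})X")
```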


[Figure 7 panels: fraction of female responses plotted against the proportion of femininity of the symHSF face for observers CIT, CAE, CAI, CPY, CBN, and LNL; lighting directions 0°, 15°, 30°, and 60°.]

Figure 7. All seven observers’ asymLSF condition data.

The fraction of female responses is plotted as a function of the proportion of femininity.

[Figure 8 panels: fraction of female responses plotted against the proportion of femininity for observers including CIT, CPY, CBN, LNL, and RYT; conditions: F_asymLSF + Morphs, M_asymLSF + Morphs, F_symLSF + Morphs, M_symLSF + Morphs; panel G abscissa: LSF information.]

Figure 8. All seven observers’ symLSF condition data (compared with asymLSF data).

The fraction of female responses is plotted as a function of the proportion of femininity.
