
Experiments over Real Scenes

To verify the effectiveness of our dynamic calibration algorithm, we performed the following experiments over real scenes. In the first experiment, test images were captured by four cameras mounted on the ceiling. These four cameras kept panning and tilting while capturing images. In total, each camera captured 1000 test images at a resolution of 320 by 240 pixels. In addition, to evaluate the calibration results, we placed test landmarks in the scene at 100-frame intervals. That is, we captured 100 image frames; stopped and placed some landmarks in the scene; captured an image with the landmarks present; stopped and removed the landmarks; and then resumed image capturing for another 100 frames. This procedure was repeated until all 1000 images had been captured for every camera. Figure 4.11(a) shows an example of the images captured by these four cameras. For comparison, Fig. 4.11(b) shows the same views with the landmarks present.

At the beginning of the experiment, the static calibration introduced in Chapter 3 was applied to calibrate the initial setup of these four cameras. The static calibration results are listed in Table 4.3. The left part of Table 4.3 lists, for each camera, the estimated tilt angle and its altitude above the brown table in the scene. The right part lists, for each camera, the estimated position and orientation with respect to Camera-2. To evaluate the static calibration result, we used the landmarks in the image captured by Camera-2 to infer the corresponding points in the other three images. The results are shown in Fig. 4.12; it can be seen that the correspondences are reasonably accurate. In addition, we also calculated the 3-D coordinates of these landmarks and used them as ground truth for the evaluation of our dynamic calibration algorithm.

(a)

(b)

Fig. 4.11 (a) Test images captured by four cameras. (b) Test images with the presence of landmarks. The images captured by Camera-1, Camera-2, Camera-3, and Camera-4 are arranged in the left-to-right, top-to-bottom order.

Table 4.3 Results of the Static Calibration.

Fig. 4.12 Evaluation of initial calibration.

As the cameras began to pan and tilt, we extracted 50 prominent feature points from each of the four initial images and tracked these feature points with the KLT method.
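
The tracking step can be illustrated with a minimal, single-scale Lucas-Kanade update written in NumPy. This is only a sketch: the function name, window size, and conditioning threshold are our own choices, and a practical KLT tracker (such as the pyramidal implementation used here) adds multi-scale iteration and feature re-selection.

```python
import numpy as np

def lk_track(prev, curr, pts, win=7):
    """One single-scale Lucas-Kanade update: shift each feature point in
    `pts` ((row, col) coordinates) from frame `prev` toward frame `curr`."""
    Iy, Ix = np.gradient(prev)          # spatial gradients of the first frame
    It = curr - prev                    # temporal difference
    r = win // 2
    out = pts.astype(float).copy()
    for i, (y, x) in enumerate(pts.astype(int)):
        ys, xs = slice(y - r, y + r + 1), slice(x - r, x + r + 1)
        gx, gy, gt = Ix[ys, xs].ravel(), Iy[ys, xs].ravel(), It[ys, xs].ravel()
        A = np.array([[gx @ gx, gx @ gy],
                      [gx @ gy, gy @ gy]])   # structure tensor over the window
        b = -np.array([gx @ gt, gy @ gt])
        if np.linalg.cond(A) < 1e6:          # skip weak or ambiguous corners
            dx, dy = np.linalg.solve(A, b)
            out[i] += (dy, dx)
    return out
```

Features whose local structure tensor is near-singular (flat regions or pure edges) are left in place, which mirrors why KLT selects prominent corner-like points in the first place.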

Based on (4.9) and (4.10), we performed dynamic calibration for every image pair. In our experiment, we calibrated six camera pairs {Camera-k, Camera-k′}, with {k, k′} ∈ {{2, 1}, {2, 3}, {2, 4}, {4, 1}, {4, 3}, {1, 3}}, and averaged the calibration results for each camera.
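
The per-camera averaging over these six pairs can be sketched as follows. The data layout (a dictionary mapping each camera pair to that pair's per-camera angle estimates) is an assumption made for illustration, not the dissertation's actual data structure.

```python
import numpy as np
from collections import defaultdict

def average_per_camera(pair_estimates):
    """`pair_estimates` maps a camera pair (k, k') to that pair's estimated
    (pan change, tilt change) for each of its two cameras.  Each camera's
    final estimate is the mean over every pair it appears in."""
    acc = defaultdict(list)
    for pair_result in pair_estimates.values():
        for cam, angles in pair_result.items():
            acc[cam].append(angles)
    return {cam: tuple(np.mean(samples, axis=0))
            for cam, samples in acc.items()}
```

With the six pairs above, every camera appears in exactly three pairs, so each final estimate averages three independent pairwise results.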

To evaluate the results of dynamic calibration, we performed static calibration every 100 frames, based on the images captured with the landmarks present.

The result was verified by projecting the aforementioned 3-D landmarks onto the image plane of each camera. Figure 4.13 shows the differences in the estimated pan and tilt angles between the dynamic calibration results and the static calibration results; note that the static calibration is based on the 3-D landmarks that were accurately calibrated at the beginning of the experiment. The figure shows that the differences gradually increase with the frame number. However, the deviation at the 1000th frame is still acceptable, remaining within ±3 degrees. Moreover, based on the results of dynamic calibration, we may also directly select a few landmark points in the image captured by Camera-2 and project them onto the other three images, as shown in Fig. 4.14.
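
This landmark-based evaluation amounts to reprojecting the known 3-D landmarks with each set of estimated angles and comparing the resulting image points. The following pinhole sketch illustrates the idea; the rotation composition order and the intrinsic values (f, principal point) are illustrative assumptions, not the dissertation's exact camera model.

```python
import numpy as np

def rot_pan_tilt(pan, tilt):
    """Rotation of a camera panned about the vertical axis and then tilted
    about its horizontal axis (radians); composition order is assumed."""
    cp, sp = np.cos(pan), np.sin(pan)
    ct, st = np.cos(tilt), np.sin(tilt)
    R_pan = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    R_tilt = np.array([[1.0, 0.0, 0.0], [0.0, ct, -st], [0.0, st, ct]])
    return R_tilt @ R_pan

def project(X, pan, tilt, t, f=400.0, c=(160.0, 120.0)):
    """Pinhole projection of 3-D landmarks X (N x 3) for a camera at
    position t; f and the principal point c suit a 320 x 240 image."""
    Xc = (rot_pan_tilt(pan, tilt) @ (X - t).T).T
    return np.stack([f * Xc[:, 0] / Xc[:, 2] + c[0],
                     f * Xc[:, 1] / Xc[:, 2] + c[1]], axis=1)

def reprojection_gap(X, dyn_angles, sta_angles, t):
    """Per-landmark pixel discrepancy between two (pan, tilt) estimates."""
    return np.linalg.norm(project(X, *dyn_angles, t)
                          - project(X, *sta_angles, t), axis=1)
```

A small angular error translates into a pixel gap of roughly f·Δθ near the image centre, which is why a ±3 degree deviation remains visually tolerable at this resolution.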

Furthermore, if we fix one of the four cameras while letting the other three pan and tilt freely, the results of dynamic calibration become even more reliable, as shown in Fig. 4.15. In this case, the differences in pan and tilt angles stay within ±1.5 degrees, and they no longer grow gradually over time.

(a)

(b)

Fig. 4.13 (a) Differences of the pan angles between the dynamic calibration results and the static calibration results. (b) Differences of the tilt angles between the dynamic calibration results and the static calibration results.

(a)

(b)

(c)

Fig. 4.14 Evaluations of dynamic calibration at (a) the 300th frame, (b) the 600th frame, and (c) the 1000th frame.

(a)

(b)

Fig. 4.15 (a) Differences of the pan angles and (b) differences of the tilt angles between the dynamic calibration results and the static calibration results, with one of the cameras being fixed all of the time.

We also tested the situation in which a moving object is present during the dynamic calibration process. Limited by our camera control system, we could not control four cameras simultaneously in real time; hence, only two cameras were allowed to pan and tilt in this experiment. Again, we captured 1000 frames for each camera, and Fig. 4.16 shows a sample of the captured sequence. In Fig. 4.17, we show the corresponding relationship at the 1000th frame based on our dynamic calibration result. This reasonable correspondence demonstrates the effectiveness and feasibility of our dynamic calibration algorithm.

Fig. 4.16 One sample of the test sequence with the presence of a moving person.

Fig. 4.17 Estimated corresponding relationship at the 1000th frame of the test sequence with a moving person.

CHAPTER 5

Conclusions

______________________________________________

In this dissertation, we have presented two new and efficient pose calibration techniques: 1) static calibration of multiple cameras based on the back-projections of simple objects lying on the same plane, and 2) dynamic calibration of multiple cameras without complicated point-correspondence techniques.

In the problem of static calibration for multiple cameras, we infer the relative positions and orientations among the cameras. We first deduced the 3D-to-2D coordinate transformation in terms of the tilt angle of a camera. With this transformation established, the tilt angle and altitude of each camera are estimated from the observation of some simple objects lying on a horizontal plane.
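
As a rough illustration of how tilt and altitude enter the 3D-to-2D mapping, the following sketch projects a ground-plane point for a camera at altitude h tilted down by a given angle. The axis conventions and intrinsic values are assumptions made for illustration, not the exact derivation of Chapter 3.

```python
import numpy as np

def ground_to_image(Xw, Yw, h, tilt, f=400.0, cu=160.0, cv=120.0):
    """Project a ground-plane point (Xw, Yw, 0) into the image of a camera
    at altitude h above the plane, tilted down by `tilt` radians, zero pan."""
    # Camera-centred axes before tilting: x right, y down, z forward.
    x, y, z = Xw, h, Yw
    # Tilt the camera downward about its x-axis.
    yc = np.cos(tilt) * y - np.sin(tilt) * z
    zc = np.sin(tilt) * y + np.cos(tilt) * z
    return f * x / zc + cu, f * yc / zc + cv
```

Because both h and the tilt angle appear in this mapping, observing a few known ground-plane objects constrains the two parameters jointly, which is the essence of the static calibration step.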

With the estimated tilt angles and altitudes, the relative orientations among multiple cameras can easily be obtained by comparing the back-projected world coordinates of some common vectors in 3-D space. Compared with conventional calibration approaches that extract the homography matrix and the rotation matrix, our approach offers a clear geometric interpretation and simplifies the calibration process. No coordinated calibration pattern is needed, and the computational load is light. In this dissertation, the sensitivity of this method to parameter fluctuations and measurement errors is also discussed; both mathematical analysis and computer simulation results are presented to verify the analysis. Experimental results on real images have demonstrated the efficiency and feasibility of this approach.

In the problem of dynamic calibration for multiple cameras, we incorporated the pan angle into the mapping between a horizontal plane in 3-D space and the 2-D image plane of a panned and tilted camera. Based on this mapping, we utilize the displacements of feature points and the epipolar-plane constraint among multiple cameras to infer the changes in pan and tilt angles for each camera. This algorithm does not require complicated correspondence of feature points, and it tolerates the presence of moving objects in the captured scenes while dynamic calibration is performed. Such a dynamic calibration process can be very useful for applications in active video surveillance. The sensitivity of our dynamic calibration algorithm to measurement errors and to fluctuations in previous estimates is also analyzed mathematically. The simulation results show that the estimation errors of the pan and tilt angle changes are acceptable in realistic cases, and the efficiency and feasibility of this approach have been demonstrated in experiments over real scenes.
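
To first order, a purely rotating camera relates the image displacement of a feature near the principal point to its pan and tilt changes roughly as sketched below. The sign conventions and the focal length value are illustrative assumptions, and the actual algorithm combines such displacements across cameras with the epipolar-plane constraint rather than using a single feature.

```python
import numpy as np

def angle_change_from_displacement(du, dv, f=400.0):
    """First-order (pan, tilt) change in radians of a purely rotating
    camera, inferred from the image displacement (du, dv) of a feature
    near the principal point; signs follow the convention that a
    rightward pan moves image content leftward."""
    return -np.arctan(du / f), -np.arctan(dv / f)
```

This small-angle relation also explains why no exact point correspondence across cameras is needed for the rotational part: each camera's own feature displacements already carry most of the pan/tilt information.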

In this dissertation, we adopt a system model general enough to fit a large class of multi-camera surveillance systems. Neither our static nor our dynamic calibration method requires a particular system setup or specific calibration patterns. In some sense, our static calibration can be viewed as decomposing the computation of the homography matrix into two simple calibration processes, so that the computational load for calibrating multiple cameras becomes lighter. In addition, the major advantage of our dynamic calibration is that no complicated correspondence of feature points is needed. Hence, our calibration methods can be readily applied to wide-area surveillance systems with multiple cameras. However, in this dissertation we have not integrated our calibration algorithms into the related applications of multi-camera surveillance systems; the calibration results would offer useful three-dimensional information for surveillance applications such as object tracking or 3-D positioning. In addition, the zooming effect is not discussed in this dissertation. Both topics would be worthwhile to study in the future.

Curriculum Vitae

______________________________________________

Name: I-Hsien Chen (陳宜賢)

Born: April 27, 1978

Education:

Sep. 2001 ~ Jan. 2008  Ph.D. Student, Institute of Electronics, National Chiao-Tung University, Hsin-chu, Taiwan

Sep. 2000 ~ June 2001  M.S. Student, Institute of Electronics, National Chiao-Tung University, Hsin-chu, Taiwan

Sep. 1996 ~ June 2000  B.S. Student, Institute of Electronics, National Chiao-Tung University, Hsin-chu, Taiwan

Work Experience:

Aug. 2004 ~ Jan. 2005  Lecturer at Yuanpei University (元培科學技術大學)

Publications:

Dissertation:

Ph.D. Dissertation: Static and Dynamic Calibration of Multiple Cameras (多台攝影機之靜態與動態校正技術)

Journal Paper:

[1] I-H Chen and S-J Wang, “An efficient approach for the calibration of multiple PTZ cameras,” IEEE Transactions on Automation Science and Engineering, vol. 4, no. 2, pp. 286–293, April 2007.

[2] I-H Chen and S-J Wang, “An efficient approach for dynamic calibration of multiple cameras,” IEEE Transactions on Automation Science and Engineering, to be published.

Conference Papers:

[1] I-H Chen and S-J Wang, “Efficient vision-based calibration for visual surveillance systems with multiple PTZ cameras,” IEEE Conference on Computer Vision Systems, pp. 24–24, Jan. 2006.

[2] I-H Chen and S-J Wang, “A vision-based approach to extracting the tilt angle and altitude of a PTZ camera,” Proceedings of the SPIE, vol. 6066, pp. 606609-1–9, Jan. 2006.

Filed U.S. Patents:

[1] I-H Chen and S-J Wang, “Calibration System for Image Capture Apparatus and Method Thereof”, Filing No.: 11/483,542, Filing Date: July 11, 2006.

Filed Taiwan Patents:

[1] I-H Chen and S-J Wang, “Calibration System for Image Capture Apparatus and Method Thereof” (影像擷取裝置之校正系統及其方法), Application No.: 095105830, Filing Date: Feb. 21, 2006.
