
The image registration of multi-band images by geometrical optics

Yung-Jhe Yan*a, Hou-Chi Chianga, Yu-Hsiang Tsaic, Ting-Wei Huangb, Mang Ou-Yangb

aInst. of Electrical Control Engineering, National Chiao-Tung Univ., Hsinchu City, Taiwan;
bDept. of Electrical and Computer Engineering, National Chiao-Tung Univ., Hsinchu City, Taiwan;
cIndustrial Technology Research Inst., Taiwan

ABSTRACT

Image fusion is the combination of two or more images into one image. The fusion of multi-band spectral images has been used in many applications, such as thermal imaging systems, remote sensing, and medical treatment. The images are taken with different imaging sensors. If the sensors take images through different optical paths at the same time, they will be at different positions, which makes the task of image registration more difficult, because the images have different fields of view (F.O.V.), different resolutions, and different view angles. It is therefore important to build the relationship between the viewpoints in one image and the other image.

In this paper, we focus on the problem of image registration for two non-pinhole sensors. The affine transformation between the 2-D image and the 3-D real world can be derived from the geometrical optics of the sensors. In other words, the geometrical affine transformation between the two images is derived from the intrinsic and extrinsic parameters of the two sensors. According to this affine transformation, the overlap of the F.O.V. in the two images can be calculated and the two images can be resampled to the same resolution. Finally, we construct the image registration model from the resulting mapping function. It merges images from different imaging sensors that absorb different wavebands of the electromagnetic spectrum at different positions at the same time.

Keywords: multi-band image, image registration, image fusion

1. INTRODUCTION

Blending different images into a single image has been utilized in many areas1, such as thermal image systems2, 3, 4, remote sensing, satellite imagery, and medical treatment. Images that belong to different wavebands are combined into one image, called a multi-band image. Multi-band images provide more individual and relevant information; when this information is fused into a single image, it becomes more informative for the observer. As is well known, the visible image is often combined with other spectral images. For instance, a thermal imaging system can view the surface temperature distribution of an object; in order to let the observer clearly recognize the temperature at different positions on the object, it combines a thermal image and a visible image into a single image. For more information, the image fusion technology of the last few years is clearly reviewed in the study by Smith, M.I.1.

The multi-band spectral images are taken from different sensors. In particular, a real-time multi-sensor imaging system needs to take images from the different sensors at the same time and combine them into a single image. If the optical paths of these sensors are different, the sensors will be at different positions, which leads to some problems. Because the sensors have different intrinsic and extrinsic parameters, such as focal length, pixel size, and object distance, the F.O.V. of the sensors are different, as shown in Fig. 1(a). Also, the overlap of the views of the sensors varies with the object distance, as shown in Fig. 1(b).

The image registration methods in many studies5-12 construct a mapping function from one image into the other image. The mapping function is defined by parameters for rotation, shift, scale, etc. In order to acquire these parameters, at least two pairs of points in the two images must be found. A pair of points corresponds to the same point in the real world seen in both images. The pairs of points are then substituted into the function to derive the parameters. The most important issue is how to decide these pairs of points. Some studies use feature detection and Hausdorff distance theory9, 10, 11 to decide these points. In addition, other methods are based on area-based or feature-based matching for image registration and image fusion5-8.

Because image features vary with the waveband of the sensor, the above methods are unstable; it is hard to define the same features in both images. Furthermore, the features need to be recalculated repeatedly when the images are dynamic.


Therefore, image-processing-based methods are not stable for real-time multi-sensor imaging systems. This paper creates an image registration model from geometrical optics; it determines the mapping function from one image into the other image. The model considers only two imaging sensors, and the optical paths of the two sensors are assumed to be parallel to each other. Using the extrinsic and intrinsic parameters of the two sensors, this model finds the overlap of the F.O.V. of the two sensors and determines the mapping function. In the end, the task of image registration can be done quickly by entering only the object distance.


Figure 1. (a) The F.O.V. of the two sensors on the target plane; (b) the overlap of the F.O.V. of the two sensors at different object distances D.

2. METHOD

In this image registration model, the optical paths of the two sensors are defined to be parallel to each other. The optical systems of both sensors are treated as Gaussian optics with an ideal thin lens. Because the experiments are based on visible and infrared images, the discussion of multi-band spectral images will mainly focus on these kinds of images. This section is divided into four parts. In the first part, the coordinate systems and the important parameters are defined. The second part analyzes the overlap of the F.O.V. at different object distances for the visible and the infrared sensors. The third part illustrates the transformations and the relations between the different coordinates. The final part derives the mapping function from the coordinate transformations of the third part.

2.1 Coordinate systems and Parameters

First, the real-world coordinate system needs to be defined; it is defined as a 2-D target plane at distance D from the lens centers of the two sensors. Second, the image plane is defined to express the image projected from the target plane onto the imaging sensor. The sensor receives the data of this projected image and transforms it into a digital image; the frame plane is therefore defined to express the digital image produced by the sensor. In Fig. 2, the left side shows the coordinates (X, Y) of the target plane. Beside the lens optical centers are the coordinates (xim, yim) of the image planes. On the right side of Fig. 2, the coordinates (xfm, yfm) express the frame planes. The subscript "1" indicates that a plane or parameter belongs to the visible sensor; likewise, the subscript "2" indicates that it belongs to the infrared sensor.


Before deriving the transformations between the different coordinates, the intrinsic and extrinsic parameters need to be defined. In this image registration model, the distance from the target or object to both sensors is the same; it is expressed as the uppercase letter "D". The focal lengths of the visible and the infrared imaging sensors are f1 and f2. These two parameters are mainly used in the transformation from the target plane to the image plane. The transformation between the frame plane and the image plane is determined by the pitch size and the pixel numbers of the two sensors. The inverses of the pitch sizes of the two sensors are expressed as S1 and S2. The numbers of pixels along the x and y directions of the two sensors are expressed as Cx1, Cy1, Cx2 and Cy2. Also, the registration of the two images is based on the relative position of the two sensors, so the distance between the lens centers of the two sensors, expressed as dVI, needs to be measured. Section 2.2 uses the view angles to analyze the overlap situation at different D; the view angles in the different directions of the two sensors are expressed as θx1, θy1, θx2, θy2.
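As an illustration only, the parameters defined above could be gathered in a small data structure such as the following Python sketch; all names and numeric values are hypothetical placeholders, not values from this paper.

    from dataclasses import dataclass
    import math

    @dataclass
    class SensorParams:
        """Intrinsic parameters of one imaging sensor, as defined in section 2.1."""
        f: float        # focal length (mm)
        S: float        # inverse of the pitch size (pixels per mm)
        Cx: int         # number of pixels along x
        Cy: int         # number of pixels along y
        theta_x: float  # view angle along x (radians)
        theta_y: float  # view angle along y (radians)

    # Hypothetical example values; real values come from the sensor data sheets.
    visible = SensorParams(f=8.0, S=1 / 0.003, Cx=640, Cy=480,
                           theta_x=math.radians(50), theta_y=math.radians(40))
    infrared = SensorParams(f=10.0, S=1 / 0.017, Cx=160, Cy=120,
                            theta_x=math.radians(25), theta_y=math.radians(19))
    d_VI = 30.0  # distance between the two lens centers (mm), an extrinsic parameter
    D = 1000.0   # target distance (mm)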

2.2 Overlap of F.O.V. analysis

In Fig. 1(b), because the visible imaging sensor is aligned with the infrared imaging sensor horizontally, the overlap of the F.O.V. of the two sensors is only considered in the horizontal direction. When D increases beyond Zin, the F.O.V. of the two sensors start to overlap, because the two sensors are close to each other. If D is longer than Zex and the view angle of the visible imaging sensor is larger than that of the infrared imaging sensor, the F.O.V. of the visible imaging sensor covers the F.O.V. of the infrared imaging sensor completely. Accordingly, the horizontal overlap of the F.O.V. of the two sensors can be divided into non-overlap, partial overlap and complete overlap. Zin and Zex can be derived from θy1, θy2 and dVI. In Fig. 3(a), Zin, θy1, θy2 and dVI have the relationship in eq. (1).

\[ Z_{in}\tan\frac{\theta_{y1}}{2} + Z_{in}\tan\frac{\theta_{y2}}{2} = d_{VI} \qquad (1) \]

According to eq. (1), Zin can be derived, as shown in eq. (2).

\[ Z_{in} = \frac{d_{VI}}{\tan\dfrac{\theta_{y1}}{2} + \tan\dfrac{\theta_{y2}}{2}} \qquad (2) \]

In Fig. 3(b), Zex, θy1, θy2 and dVI have the relationship in eq. (3).

\[ Z_{ex}\tan\frac{\theta_{y1}}{2} - Z_{ex}\tan\frac{\theta_{y2}}{2} = d_{VI} \qquad (3) \]

Then, Zex can be derived from eq. (3), as shown in eq. (4).

\[ Z_{ex} = \frac{d_{VI}}{\tan\dfrac{\theta_{y1}}{2} - \tan\dfrac{\theta_{y2}}{2}} \qquad (4) \]


After Zin and Zex are found, the horizontal overlap situation of the F.O.V. of the two sensors at any target distance is clearly known.
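A minimal sketch of this overlap analysis, assuming eqs. (2) and (4) with half view angles; the function names and the example numbers are ours, for illustration only.

    import math

    def overlap_bounds(theta_y1, theta_y2, d_VI):
        """Return (Zin, Zex) from eqs. (2) and (4).

        theta_y1, theta_y2: view angles (radians) of the visible and infrared sensors.
        d_VI: distance between the two lens centers.
        Zex is finite only when theta_y1 > theta_y2, i.e. the wider F.O.V.
        can eventually enclose the narrower one.
        """
        t1 = math.tan(theta_y1 / 2.0)
        t2 = math.tan(theta_y2 / 2.0)
        z_in = d_VI / (t1 + t2)                           # eq. (2)
        z_ex = d_VI / (t1 - t2) if t1 > t2 else math.inf  # eq. (4)
        return z_in, z_ex

    def overlap_state(D, z_in, z_ex):
        """Classify the horizontal overlap of the two F.O.V. at target distance D."""
        if D <= z_in:
            return "non-overlap"
        if D < z_ex:
            return "partial overlap"
        return "complete overlap"

    # Hypothetical view angles and baseline, for illustration only.
    z_in, z_ex = overlap_bounds(math.radians(40), math.radians(19), d_VI=30.0)
    print(overlap_state(1000.0, z_in, z_ex))  # -> "complete overlap"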

2.3 Coordinate transformation

In this section, the transformation functions between the different coordinates are introduced. One is the transformation from the target plane to the image plane; the other is the transformation from the image plane to the frame plane. As mentioned before, the optical systems of the two sensors are treated as Gaussian optics with an ideal thin lens. Therefore, the target distance D, the distance d from the sensor to its lens center, shown in Fig. 1(a), and the focal length f satisfy the simple thin-lens equation.

\[ \frac{1}{D} + \frac{1}{d} = \frac{1}{f} \qquad (5) \]

In Fig. 4(a), the object has length X along the x-axis of the target plane. It projects onto the image plane with length x. X is related to x as D is related to d: they have the same ratio through the corresponding sides of similar triangles, as in eq. (6).

\[ \frac{-x}{d} = \frac{X}{D} \qquad (6) \]

According to eq. (5), the distance d in eq. (6) can be replaced by fD/(D - f); substituting it into eq. (6) gives x = -X/(D/f - 1). Finally, the transformation from the target plane to the image plane is given in eq. (7), where "D/f - 1" is written as K.

\[ \begin{bmatrix} x_{im} \\ y_{im} \end{bmatrix} = \begin{bmatrix} -\dfrac{1}{K} & 0 \\ 0 & -\dfrac{1}{K} \end{bmatrix} \begin{bmatrix} X \\ Y \end{bmatrix} \qquad (7) \]

The other important function is the transformation between the image plane and the frame plane. The sensor detects the image and constitutes the digital image. The length on the image plane that corresponds to one pixel on the frame plane is defined through the parameter S; since S is the inverse of the pitch size, its unit is pixels per millimeter. Besides this ratio, because the origins of the two coordinate systems are different, the pixel numbers of the frame plane have to be taken into account, as shown in Fig. 4(a). In the end, the transformation function is described in eq. (8).

\[ \begin{bmatrix} x_{fm} \\ y_{fm} \end{bmatrix} = \begin{bmatrix} S & 0 \\ 0 & S \end{bmatrix} \begin{bmatrix} x_{im} \\ y_{im} \end{bmatrix} + \begin{bmatrix} C_x/2 \\ C_y/2 \end{bmatrix} \qquad (8) \]

Figure 4. (a) The object projects from the target plane to the image plane; (b) The coordinate transformation between image plane and target plane.
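To make the two transformations concrete, the short sketch below maps a target-plane point through eq. (7) into the image plane and through eq. (8) into the frame plane. The function names and numbers are ours and only illustrate the formulas, assuming S in pixels per millimeter as defined in section 2.1.

    def target_to_image(X, Y, D, f):
        """Eq. (7): target plane (mm) -> image plane (mm), with K = D/f - 1."""
        K = D / f - 1.0
        return -X / K, -Y / K

    def image_to_frame(x_im, y_im, S, Cx, Cy):
        """Eq. (8): image plane (mm) -> frame plane (pixels).

        S is the inverse of the pitch size (pixels per mm); adding (Cx/2, Cy/2)
        moves the origin from the optical axis to the corner of the digital frame.
        """
        return S * x_im + Cx / 2.0, S * y_im + Cy / 2.0

    # Illustrative values: a point 100 mm off-axis on the target plane,
    # D = 1000 mm, f = 8 mm, 3 um pitch (S = 1/0.003 px/mm), 640 x 480 frame.
    x_im, y_im = target_to_image(100.0, 50.0, D=1000.0, f=8.0)
    x_fm, y_fm = image_to_frame(x_im, y_im, S=1 / 0.003, Cx=640, Cy=480)
    print(round(x_fm, 1), round(y_fm, 1))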

2.4 Mapping function

Once the transformation functions between the different coordinates have been derived, the mapping function from the infrared sensor to the visible sensor can also be derived from the above equations. In Fig. 5, the transformation from the infrared frame plane to the visible frame plane is divided into five steps. First, eq. (9) and eq. (10) are the results of step I to step II and step III to step IV, which can be derived from eq. (7) and eq. (8).


\[ \begin{bmatrix} X_2 \\ Y_2 \end{bmatrix} = \begin{bmatrix} -\dfrac{K_2}{S_2} & 0 \\ 0 & -\dfrac{K_2}{S_2} \end{bmatrix} \begin{bmatrix} x_{fm2} \\ y_{fm2} \end{bmatrix} + \begin{bmatrix} \dfrac{K_2 C_{x2}}{2 S_2} \\ \dfrac{K_2 C_{y2}}{2 S_2} \end{bmatrix} \qquad (9) \]

\[ \begin{bmatrix} x_{fm1} \\ y_{fm1} \end{bmatrix} = \begin{bmatrix} -\dfrac{S_1}{K_1} & 0 \\ 0 & -\dfrac{S_1}{K_1} \end{bmatrix} \begin{bmatrix} X_1 \\ Y_1 \end{bmatrix} + \begin{bmatrix} \dfrac{C_{x1}}{2} \\ \dfrac{C_{y1}}{2} \end{bmatrix} \qquad (10) \]

Step III is the transformation from the infrared target plane to the visible target plane. Because the two sensors are aligned along the y direction, this transformation only shifts by the distance dVI in the y direction, as shown in eq. (11).

\[ \begin{bmatrix} X_1 \\ Y_1 \end{bmatrix} = \begin{bmatrix} X_2 \\ Y_2 \end{bmatrix} + \begin{bmatrix} 0 \\ -d_{VI} \end{bmatrix} \qquad (11) \]

The mapping function is the transformation from the infrared frame plane to the visible frame plane. According to eqs. (9), (10) and (11), the final transformation function is eq. (12).

\[ \begin{bmatrix} x_{fm1} \\ y_{fm1} \end{bmatrix} = \begin{bmatrix} \dfrac{S_1 K_2}{K_1 S_2} & 0 \\ 0 & \dfrac{S_1 K_2}{K_1 S_2} \end{bmatrix} \begin{bmatrix} x_{fm2} \\ y_{fm2} \end{bmatrix} + \begin{bmatrix} \dfrac{C_{x1}}{2} - \dfrac{S_1 K_2 C_{x2}}{2 K_1 S_2} \\ \dfrac{C_{y1}}{2} - \dfrac{S_1 K_2 C_{y2}}{2 K_1 S_2} \end{bmatrix} + \begin{bmatrix} 0 \\ \dfrac{S_1}{K_1} d_{VI} \end{bmatrix} \qquad (12) \]

The mapping function can warp the infrared image onto the visible image at different distances D. The other problem is the different resolutions of the two images: if the resolutions differ, the mapping between the two images is not one-to-one. The resolution of the two images has to be adjusted so that the mapping between them becomes nearly one-to-one.

Figure 5. The order of mapping the frame plane from the infrared sensor to the visible sensor.
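The sketch below composes eqs. (9)-(11) in the same way as eq. (12): it maps an infrared frame-plane pixel to visible frame-plane coordinates once the target distance D is given. It is a sketch under the assumptions of this section (parallel optical axes, thin-lens model); the parameter values are hypothetical.

    def map_ir_to_visible(x_fm2, y_fm2, D, f1, f2, S1, S2,
                          Cx1, Cy1, Cx2, Cy2, d_VI):
        """Eq. (12): infrared frame-plane pixel -> visible frame-plane pixel."""
        K1 = D / f1 - 1.0
        K2 = D / f2 - 1.0
        # Eq. (9): back-project the infrared pixel onto the target plane.
        X2 = -(K2 / S2) * (x_fm2 - Cx2 / 2.0)
        Y2 = -(K2 / S2) * (y_fm2 - Cy2 / 2.0)
        # Eq. (11): shift the origin from the infrared axis to the visible axis.
        X1, Y1 = X2, Y2 - d_VI
        # Eq. (10): project the target-plane point into the visible frame.
        x_fm1 = -(S1 / K1) * X1 + Cx1 / 2.0
        y_fm1 = -(S1 / K1) * Y1 + Cy1 / 2.0
        return x_fm1, y_fm1

    # Hypothetical parameters, for illustration only.
    print(map_ir_to_visible(80, 60, D=1000.0, f1=8.0, f2=10.0,
                            S1=1 / 0.003, S2=1 / 0.017,
                            Cx1=640, Cy1=480, Cx2=160, Cy2=120, d_VI=30.0))

Looping this point mapping over the infrared pixels (or its inverse over the visible pixels), after the two images have been resampled to a comparable resolution, would give a warp of the kind shown in Fig. 10.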

3. EXPERIMENTAL RESULTS

In this experiment, we take the visible and the infrared images at three target distances, 500 mm, 1000 mm and 1500 mm, as shown in Fig. 6. Figs. 7, 8 and 9 show the visible and the infrared image data. The infrared image is drawn from the temperature data of each pixel. In particular, because the infrared images only present temperature information, the target is heated so that its temperature differs clearly from the background.


Figure 7. The visible image and the infrared image at 500 mm.

Figure 8. The visible image and the infrared image at 1000 mm.

Figure 9. The visible image and the infrared image at 1500 mm.

The visible and the infrared image registration results of the experiment at the three target distances are shown in Fig. 10.

Figure 10. Image registration results at different distances: (a) 500 mm, (b) 1000 mm, and (c) 1500 mm.


4. CONCLUSIONS

According to the geometrical optics of the sensor systems, the image registration model can be determined simply by the intrinsic and extrinsic parameters. Because the geometrical affine function between the two images is predefined, and it is a linear function under the condition of parallel optical axes of the two sensors, the relationship between the viewpoints in one image and the other image can be calculated immediately by entering the distance from the two camera sensors to the target. If this distance is available, the model can provide real-time image registration for multi-band imaging systems.

In our experiment, the image registration model is shown to merge the two images, but some errors remain in the calculated position of the infrared image within the visible image, as shown in Figs. 10(b) and (c). One important issue is how to design an exact experiment for this model, including the problem of ensuring that the optical paths of the two sensors are parallel and the problem of obtaining the exact intrinsic and extrinsic parameters used in the model. In the future, a more precise experiment will be designed to validate this model, and more factors could be considered, such as optical paths of the sensors that are not completely parallel.

ACKNOWLEDGMENTS

This paper was particularly supported by the Aim for the Top University Program of the National Chiao Tung University, Biomedical Electronics Translational Research Center of National Chiao-Tung University, Ministry of Education of Taiwan (Contract No. HCH102-39), China Medical University, the National Science Council of Taiwan (Contract No. MOST 103-2221-E-009 -190), Industrial Technology Research Institute of Taiwan, Radiant Innovation Inc. of Taiwan.

REFERENCES

[1] Smith, M. I. and Heather, J. P., "Review of image fusion technology in 2005," Defense and Security, International Society for Optics and Photonics, 29-45 (2005).

[2] Bulanon, D. M., Burks, T. F. and Alchanatis, V., "Image fusion of visible and thermal images for fruit detection," Biosystems Engineering 103(1), 12-22 (2009).

[3] Dwyer, D., Smith, M. I., Dale, J. L. and Heather, J. P., "Real time implementation of image alignment and fusion," Defense and Security, International Society for Optics and Photonics, 16-24 (2005).

[4] Bartys, M., Putz, B., Antoniewicz, A. and Zbrzezny, L., "Real-Time Single FPGA-Based Multimodal Image Fusion System," Imaging Systems and Techniques (IST), IEEE International Conference, 460-465 (2012).

[5] Mount, D. M., Netanyahu, N. S. and Moigne, J. L., "Efficient Algorithms for Robust Feature Matching," Pattern Recognition 32(1), 17-38 (1999).

[6] Di Stefano, L., Marchionni, M., Mattoccia, S. and Neri, G., "A fast area-based stereo matching algorithm," Image and Vision Computing 22(12), 983-1005 (2004).

[7] Salvi, J., Matabosch, C., Fofi, D. and Forest, J., "A review of recent range image registration methods with accuracy evaluation," Image and Vision Computing 25(5), 578-596 (2007).

[8] Zitová, B. and Flusser, J., "Image registration methods: a survey," Image and Vision Computing 21(11), 977-1000 (2003).

[9] Hrkac, T., Kalafatic, Z. and Krapac, J., "Infrared-visual image registration based on corners and Hausdorff distance," Image Analysis, Springer Berlin Heidelberg, 383-392 (2007).

[10] Niu, L.-P., Mao, S.-Y. and Chen, W., "Image Registration Based on Hausdorff Distance," Networking and Information Technology, International Conference, 35-38 (2007).

[11] Huttenlocher, D. P., Klanderman, G. A. and Rucklidge, W. J., "Comparing images using the Hausdorff distance," IEEE Transactions on Pattern Analysis and Machine Intelligence 15(9), 850-863 (1993).

[12] Fusiello, A., Trucco, E. and Verri, A., "A compact algorithm for rectification of stereo pairs," Machine Vision and Applications 12(1), 16-22 (2000).
