

Chapter 5 Vehicle Detection and Tracking for Traffic Parameter Estimation

5.4 Image Tracking and Traffic Parameter Estimation

5.4.1 Corner-Based Optical Flow Estimation

The vehicle motion field can be computed by identifying corresponding pairs of points in two successive image frames taken at time t and t+∆t. The points must be sufficiently distinct so that they can be identified and located in both images. Corner features are selected as the identified points. In this design, the Harris corner detector is adopted to extract the corner points from traffic imagery because of their superior repeatability, robustness to viewpoint changes, and resistance to illumination variation [78].
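As a sketch of how a Harris corner response could be computed, the following implements the standard response R = det(M) − k·trace(M)², where M is the structure tensor of the image gradients. The gradient scheme, window size, and k value here are illustrative choices, not the settings used in this design:

```python
import numpy as np

def harris_response(img, k=0.04, win=3):
    """Harris corner response R = det(M) - k*trace(M)^2 per pixel.

    M is the structure tensor of image gradients summed over a
    win x win window. k and win are illustrative values.
    """
    img = img.astype(float)
    # Central-difference gradients (borders left at zero).
    Ix = np.zeros_like(img)
    Iy = np.zeros_like(img)
    Ix[:, 1:-1] = (img[:, 2:] - img[:, :-2]) / 2.0
    Iy[1:-1, :] = (img[2:, :] - img[:-2, :]) / 2.0
    Ixx, Iyy, Ixy = Ix * Ix, Iy * Iy, Ix * Iy

    # Sum each tensor entry over the win x win window (box filter).
    pad = win // 2
    def box(a):
        ap = np.pad(a, pad)
        out = np.zeros_like(a)
        for dr in range(win):
            for dc in range(win):
                out += ap[dr:dr + a.shape[0], dc:dc + a.shape[1]]
        return out

    Sxx, Syy, Sxy = box(Ixx), box(Iyy), box(Ixy)
    det = Sxx * Syy - Sxy * Sxy
    trace = Sxx + Syy
    return det - k * trace * trace
```

On a synthetic bright square, the response is strongly positive at a corner, negative along an edge, and zero in flat regions, which is why thresholding R yields corner points.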

Once a set of interesting points {Pi} is identified in the image frame I1 taken at time t, the corresponding points must be identified in the successive image frame I2 taken at time t+∆t. As shown in Fig. 5-10, given a sample point P at (Tx, Ty) in image frame I1, we take a square block centered at P in I1 and find the best-correlated block in I2 under the assumption that the amount of movement is limited. If the best match is found at (Hx, Hy), then the point (Hx, Hy) is the destination of point P. The vector from (Tx, Ty) to (Hx, Hy) is the motion vector of point P.

In the searching process described above, it is assumed that all pixels in the block are displaced by the same amount. However, this assumption sometimes fails to hold in a traffic scene: the vehicle image gradually changes with the camera viewpoint and the vehicle's motion, so it is difficult to obtain identical blocks of a moving object in two successive frames. Still, a certain similarity exists between the two blocks: the more corresponding pixels that have similar intensities, the stronger the correspondence between them. We use the minimum absolute difference of individual pixels as an error criterion for examining the correspondence between two blocks [79]. For two N×N blocks A and B, the matching distance is defined by


Fig. 5-9. Detection windows.

D = \sum_{r=1}^{N} \sum_{c=1}^{N} d(r, c), (5.24)

where d(r, c) is 1 if |A(r, c) − B(r, c)| < T and 0 otherwise; T is a threshold value. The matching distance is used as the similarity index of the two blocks: the two blocks that produce the largest matching distance are the best match.
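Eq. (5.24) amounts to counting the pixel pairs whose absolute difference falls below the threshold. A minimal sketch, with T = 10 as an illustrative value only:

```python
import numpy as np

def matching_distance(A, B, T=10):
    """Matching distance of Eq. (5.24): the number of pixel pairs
    whose absolute intensity difference is below threshold T.

    T = 10 is an illustrative value; a larger D means a better match.
    """
    A = np.asarray(A, dtype=float)
    B = np.asarray(B, dtype=float)
    return int(np.sum(np.abs(A - B) < T))
```

For two identical 3×3 blocks the distance is 9 (every pixel matches); corrupting one pixel drops it to 8.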

For an N×N block with maximum absolute displacement represented by [dr_max, dc_max], a full search calls for evaluating the matching criterion at (2·dr_max + 1)×(2·dc_max + 1) blocks.
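The full-search cost can be checked with a one-line helper (the function name is ours, for illustration):

```python
def full_search_count(dr_max, dc_max):
    """Candidate blocks a full search must evaluate:
    (2*dr_max + 1) * (2*dc_max + 1) positions in the search window."""
    return (2 * dr_max + 1) * (2 * dc_max + 1)
```

With dr_max = dc_max = 5, this gives the 121 evaluations cited below.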

Searching for the best-match block might fail to fulfill the real-time requirement. To speed up the search process in this design, the search operation is executed only on blocks containing a detected corner. As shown in Fig. 5-10, there are two corner points in the search region of image I2, so the matching algorithm evaluates the matching criterion at only 2N² blocks.

Fig. 5-10. Motion vector detection.

In practical realization, the maximum absolute displacement was selected as dr_max = dc_max = 5 with N = 3, so the number of matching evaluations is reduced from 121 to 18.

The design procedure of the average direction measurement of moving vehicles using the proposed optical flow method is summarized as follows:

1) Use a Harris corner detector to extract the corner points from the current image frame. Each extracted corner point and its neighboring pixels are selected as check points.

2) Record the positions of all corner points as well as the check points for the correlation operation.

3) Choose a 3×3 region centered at one corner point of the preceding image frame.

4) Get an N×N region centered at the corresponding corner point of step 3 from the current image frame. The sub-region centered at each check point within the selected region is assigned as a check sub-region.

5) Measure the matching distance between the 3×3 region of step 3 and all check sub-regions of step 4.

6) If the maximum matching distance of step 5 is greater than a specific threshold, the center of the check sub-region with the maximum matching distance is set as the new position, in the current image frame, of the corner point chosen in step 3. Thus, the motion vector of that corner point in the preceding image frame is obtained.

7) Repeat steps 3-6 to estimate the motion vectors of all corner points in the preceding image frame.
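Steps 3-7 can be sketched as corner-guided block matching. The corner lists, the thresholds (min_D, T), and d_max below are illustrative inputs; steps 1-2 (Harris detection and bookkeeping) are assumed to be done, and for simplicity each current-frame corner's check points are taken as the corner itself and its 8 neighbors:

```python
import numpy as np

def corner_motion_vectors(prev, curr, prev_corners, curr_corners,
                          d_max=5, min_D=5, T=10):
    """Sketch of steps 3-7: corner-guided block matching.

    For each corner of the preceding frame, the 3x3 block around it
    is compared (Eq. 5.24) against check sub-regions centered at every
    current-frame corner within the d_max search range and at that
    corner's 8 neighbors. Thresholds and d_max are illustrative.
    """
    def matching_distance(A, B):
        return int(np.sum(np.abs(A - B) < T))  # Eq. (5.24)

    vectors = {}
    for (r, c) in prev_corners:
        block = prev[r - 1:r + 2, c - 1:c + 2]
        best, best_D = None, min_D
        for (r2, c2) in curr_corners:
            if abs(r2 - r) > d_max or abs(c2 - c) > d_max:
                continue  # corner lies outside the search region
            # Check sub-regions: the corner and its 8 neighbors.
            for dr in (-1, 0, 1):
                for dc in (-1, 0, 1):
                    rr, cc = r2 + dr, c2 + dc
                    cand = curr[rr - 1:rr + 2, cc - 1:cc + 2]
                    D = matching_distance(block, cand)
                    if D > best_D:
                        best, best_D = (rr, cc), D
        if best is not None:  # step 6: accept only above-threshold matches
            vectors[(r, c)] = (best[0] - r, best[1] - c)
    return vectors
```

On a synthetic pair of frames where a distinctive 3×3 patch moves by (2, 1), the function recovers exactly that motion vector for the corresponding corner.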

Figure 5-11 depicts a test result of the optical flow estimation. In the figure, small arrows indicate the detected motion directions of the interest points of vehicles. An average motion vector is obtained by taking into account the motion vectors of all interest points. Some errors can be observed in the figure; they are mainly caused by imperfect corner selection and match searching. However, such estimation errors can be averaged out in most cases, and the vehicle moving direction can still be obtained for motion-type discrimination with high accuracy. In Fig. 5-11, the direction of the big arrow indicates the average motion vector of vehicles in the specific region. It accurately reflects the motion type of vehicles in this region.

Fig. 5-11. Result of optical flow estimation.

According to the estimated direction, the system can identify where the vehicles come from and determine the turn ratio accordingly.
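The averaging step that cancels out individual matching errors is simple vector arithmetic; the (row, col) convention and the angle measured from the +column axis below are our illustrative choices:

```python
import math

def average_direction(vectors):
    """Average motion vector of a region and its direction in degrees.

    Averaging the per-corner vectors lets individual matching errors
    cancel out. Vectors are (row, col) displacements; the angle is
    measured from the +column axis (an illustrative convention).
    """
    vr = sum(v[0] for v in vectors) / len(vectors)
    vc = sum(v[1] for v in vectors) / len(vectors)
    return (vr, vc), math.degrees(math.atan2(vr, vc))
```

For example, two rightward vectors and two opposing vertical vectors average to a purely rightward motion, so the vertical errors cancel.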

5.5 Summary

An automatic contour initialization procedure has been developed for image tracking of multiple vehicles based on active contours and an image measurement approach. A method is proposed to detect moving vehicles of various sizes and to generate their initial contours for image tracking on a multi-lane road; the proposed method is not constrained by the lane boundaries. The automatic contour initialization and tracking scheme has been proposed for traffic monitoring. Moreover, combined with motion vector estimation, the specially designed window can be used to measure the vehicle turn ratio in real time. An algorithm for estimating motion vectors based on corner correlation has been designed to meet the real-time requirement.


Chapter 6

Experimental Results

6.1 Introduction

In this chapter, five experimental results of traffic parameter estimation are presented and evaluated. The first three experiments, conducted under a rare-shadow condition, evaluate the vehicle detection and tracking algorithms. First, cars and motorcycles are simultaneously detected and tracked by the proposed method. Second, vehicles are detected and tracked for traffic parameter estimation. Third, a stand-alone traffic surveillance system has been implemented to estimate the vehicle turn ratio at an intersection. The remaining experiments concern not only the performance of the tracking algorithm but also the capability of shadow suppression.

One test evaluates the shadow suppression and tracking performance of the proposed algorithm, and the other verifies the proposed method for turn ratio estimation. The test video images under different shadow conditions were recorded in advance at an expressway as well as at an intersection near National Chiao Tung University. In both cases, shadows attached to the moving vehicles would degrade the performance of traffic monitoring if the ITMS had no shadow suppression function. Nevertheless, the ITMS should be able to detect and track the moving vehicles even when shadows appear in the traffic scene. In the experiments, we not only validate the feasibility of the shadow suppression algorithm but also estimate traffic parameters such as traffic flow, traffic density, vehicle speeds, and vehicle turn ratios with functional accuracy.

The rest of this chapter is organized as follows. Three experiments under a rare-shadow condition are performed to examine the performance of the proposed method for traffic parameter estimation in Section 6.2. Practical experimental results of traffic parameter estimation under shadow conditions are presented in Section 6.3. Section 6.4 shows the experiment on turn ratio estimation under shadow conditions. Section 6.5 gives some concluding remarks.

6.2 Traffic Parameter Estimation