2010 International Conference on Pattern Recognition

Calibration Method for Line Structured Light Vision Sensor Based on Vanish Points and Lines

Zhenzhong Wei (e-mail: [email protected]), Meng Xie* (e-mail: [email protected]), Guangjun Zhang (e-mail: [email protected])
Beihang University, Key Laboratory of Precision Opto-mechatronics Technology, Ministry of Education, Beijing, China

Abstract—Line structured light vision sensor (LSLVS) calibration establishes the spatial relationship between the camera and the light plane projector. This paper proposes a geometrical calibration method for LSLVS based on properties of vanish points and lines, using a planar target moved at random. The method consists of two steps: (1) the vanish point of the light stripe projected by the light plane is found in each target image, and the obtained vanish points form the vanish line of the light plane, which determines the normal of the light plane; (2) one 3D feature point on the light plane is acquired (one is enough, though more may be used) to determine the parameter dc of the light plane. The equation of the light plane under the camera coordinate system can then be solved. Computer simulations and real experiments validate the method; the real calibration reaches an accuracy of 0.141 mm within a view field of about 300 mm×200 mm.

Keywords—calibration; line structured light; computer vision

I. INTRODUCTION

There are many LSLVS calibration methods, which can be classified into two categories according to how they obtain the 3D coordinates of the calibration feature points: one relies on auxiliary apparatus [1, 2], and the other does not [3, 4]. The latter can be further divided by target type into 3D-target methods [3] and 2D planar-target methods [4]. A 3D target is difficult to machine and suffers from mutual occlusion between its planes, so the 2D planar target is more practical for LSLVS calibration. Without restrictions on the target motion, all the structure parameters can readily be calibrated on site by computing feature points under at least two different local world coordinate frames. However, these methods always need to unify all the calibration feature points on the planar target at several different positions into the same world coordinate frame, which inevitably introduces errors that propagate to the last step. On the other hand, geometric features such as circles and vanish points have been widely used in camera calibration [5, 6], owing to their ease of computation and location-free property, but they have not received equal attention in LSLVS calibration. Xiao [7] and Yang [8] have proposed methods along this line. In order to obtain at least two parallel light stripes on the light plane, so that the vanish point of the stripes of the same direction can be computed, Xiao uses an additional facility to force the target to move in precise pure translation, and Yang uses a 3D channeled target on which two precisely parallel planes must be visible at the same time. In this sense, it seems difficult to avoid 3D targets, which have drawbacks in machining and use. Xiao determines the parameters β (the projector angle) and L (the baseline) of the light plane, and Yang determines the normal and L (the baseline) in sequence; our method is also based on this two-step calibration idea. By exploiting two derivative properties of vanish points and lines, we propose another way to calibrate LSLVS that solves the two problems mentioned above: (1) a planar target is used instead of a special 3D target, and it is moved at random; (2) no auxiliary facility is needed, nor are at least two unifications of the 3D coordinates of the calibration feature points. With the two proved properties, the vanish point of the light stripe can be found from just one light stripe in one direction on the planar target, which is simpler and more efficient.

II. MEASUREMENT MODEL OF LSLVS

Figure 1. Measurement model of LSLVS (camera coordinate frame ocxcyczc, light plane π, and a measured point P(xc, yc, zc) on the light plane).

LSLVS is composed of a camera and a light plane projector, and the relative pose between the two is fixed once the system is set up; throughout calibration and measurement the projected light plane is unchanged as well. The measurement model of LSLVS is illustrated in Fig. 1. ocxcyczc is the camera coordinate frame and OXY is the image coordinate frame. In this paper we set the world coordinate frame (WCF) of the sensor to coincide with the camera coordinate frame, i.e. ocxcyczc is the WCF, so the equation of the light plane under the WCF is:

ac xc + bc yc + cc zc + dc = 0    (1)

For a light plane point P(xc, yc, zc) under the WCF with corresponding image point p(X, Y) under OXY, the perspective projection gives equation (2), where f is the camera effective focal length.

X = f·xc/zc
Y = f·yc/zc    (2)

When considering the camera lens radial distortion coefficients k1 and k2, we have:

X = Xd(1 + k1·r^2 + k2·r^4)
Y = Yd(1 + k1·r^2 + k2·r^4)
r^2 = Xd^2 + Yd^2    (3)

where (Xd, Yd) is the distorted image point corresponding to P, and k1, k2 are the lens radial distortion coefficients. Combining (1)-(3) gives the complete measurement model of LSLVS. In this paper, the LSLVS calibration task is to obtain the normal vector n = [ac, bc, cc]^T and the parameter dc of the light plane, assuming the camera has already been calibrated.
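To make the measurement model concrete, the following is a minimal numerical sketch of Eqs. (1)-(3): a stripe image point is undistorted, back-projected to a viewing ray, and intersected with the calibrated light plane. The function names and sample parameter values are illustrative only, and the image coordinates are assumed to be in the metric image frame of Eq. (2), not in pixels.

import numpy as np

def undistort(Xd, Yd, k1, k2):
    """Radial distortion model of Eq. (3): scale the distorted coordinates
    by (1 + k1*r^2 + k2*r^4), with r computed from the distorted point."""
    r2 = Xd**2 + Yd**2
    s = 1.0 + k1 * r2 + k2 * r2**2
    return Xd * s, Yd * s

def measure_point(Xd, Yd, f, k1, k2, n, dc):
    """Intersect the camera ray of an image point with the calibrated light
    plane n^T P + dc = 0, following Eqs. (1)-(2)."""
    X, Y = undistort(Xd, Yd, k1, k2)
    ray = np.array([X / f, Y / f, 1.0])   # viewing ray with zc = 1 (Eq. 2)
    zc = -dc / float(n @ ray)             # substitute P = zc*ray into Eq. (1)
    return zc * ray                        # P = (xc, yc, zc) in the camera/world frame

# Hypothetical example values (not the paper's calibration result):
n = np.array([0.10, -0.80, 0.59])          # light-plane normal [ac, bc, cc]
dc = -250.0                                # plane offset (mm)
print(measure_point(1.2, -0.4, 12.0, -0.194, 0.114, n, dc))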

Figure 2. Perspective projection of the rectangle planar target

Derivation 1 For a planar target with a light stripe on it, the image of the light stripe line will intersect the vanish line of the planar target plane at a point v, which is exactly the vanish point of the light stripe. Proofs: Given the equation of the light plane under the camera coordinate frame:

CALIBRATION METHOD

Our calibration method based on planar target can be carried out by the following two steps: 1. Compute the normal vector n of the light plane, by the properties of vanish points and the vanish line of the light plane proved in this paper; 2. Obtain the parameter dc of the light plane.

(xl-x0)Tn1=0 And the equation of the planar target plane: (xt-x0)Tn2=0

A. Acquisition of the normal n of the light plane

(4)

nLs= n1×n2

where K is the 3×3 intrinsic parameter matrix of the camera, and v is the image coordinate of point v. Vanish Line: Vanish line l of a plane π in 3D space is determined by its normal vector n, and the following equation stands: -T

L=K n

(7)

where xl is a point on the light plane and xt is a point on the target plane, x0 is a point on the intersection line Ls (i.e. the light stripe) of the two planes, n1, n2 are the normal vector of the two planes respectively. The direction of Ls can be computed as follows:

2) Derivative properties of vanish points and lines used in LSLVS calibration
Parallelism is not preserved under perspective projection: as shown in Fig. 2, AB and CD, AD and BC are no longer parallel in the image but converge to the intersections E and F respectively. Line EF is exactly the vanish line of the rectangular planar target [9]. GH is the image of the light stripe projected onto the rectangular planar target by the light plane projector, and its extension intersects line EF at the point v. If the rectangular planar target is placed randomly at different positions in the camera view field, the situation shown in Fig. 3 arises. We then have the following derivations.

Figure 2. Perspective projection of the rectangle planar target

Derivation 1: For a planar target with a light stripe on it, the image of the light stripe line intersects the vanish line of the planar target plane at a point v, which is exactly the vanish point of the light stripe.
Proof: Let the equation of the light plane under the camera coordinate frame be

(xl - x0)^T n1 = 0    (6)

and the equation of the planar target plane be

(xt - x0)^T n2 = 0    (7)

where xl is a point on the light plane, xt is a point on the target plane, x0 is a point on the intersection line Ls (i.e. the light stripe) of the two planes, and n1, n2 are the normal vectors of the two planes respectively. The direction of Ls is

nLs = n1 × n2    (8)

Considering (4), the vanish point v of Ls is

v = K(n1 × n2)    (9)

where v is in image pixel coordinates. Combining (5), the vanish line of the target plane is

Lt = K^-T n2    (10)

From (9) and (10), the following stands:

Lt^T v = 0    (11)

As (11) shows, the vanish point v of Ls lies exactly on the vanish line of the planar target plane.

Derivation 2: The vanish points v of the light stripes on the planar target at different positions form the vanish line l of the light plane.
Proof: From the projective geometry point of view, a plane has a unique vanish line under a given perspective projection, and the vanish point of any line on the plane must lie on the plane's vanish line [9]. The light stripes on the planar target at different positions all belong to the light plane (see Fig. 3), so the vanish points of these light stripes form the vanish line of the light plane.

Figure 3. Perspective projections of the rectangle planar target at different positions and the formation of the vanish line of the light plane

As a result, there is no need to unify the coordinates of the vanish points of different light stripes; the points are simply fitted to a line directly.

3) Acquisition of the normal vector n
Based on Derivation 1 and Derivation 2, the vanish line LL of the light plane can be obtained from at least two known vanish points of the light stripes; then from (5) we obtain the normal vector n of the light plane, with K the same as in (4):

n = K^T LL    (12)
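As a sketch of Derivations 1-2 and Eq. (12) (not the authors' code), the snippet below computes the vanish point of each stripe as the intersection of the imaged stripe line with the target's vanish line (obtained, e.g., from the imaged parallel target edges as in Fig. 2), fits the vanish line of the light plane to these points, and recovers n = K^T L. Homogeneous line and point representations and all numeric values are assumed for illustration.

import numpy as np

def vanish_point_of_stripe(stripe_line, target_vanish_line):
    """Derivation 1: the stripe's vanish point is the intersection of the imaged
    stripe line with the target plane's vanish line (homogeneous 3-vectors)."""
    v = np.cross(stripe_line, target_vanish_line)
    return v / v[2]                       # normalize to pixel form (u, v, 1)

def light_plane_normal(vanish_points, K):
    """Derivation 2 + Eq. (12): fit the vanish line L of the light plane to the
    stripe vanish points from several target poses, then recover n = K^T L."""
    A = np.asarray(vanish_points)         # each row is a homogeneous vanish point
    _, _, Vt = np.linalg.svd(A)           # L satisfies A @ L = 0
    L = Vt[-1]
    n = K.T @ L
    return n / np.linalg.norm(n)          # unit normal [ac, bc, cc]

# Hypothetical usage with made-up lines and the intrinsics of Table II:
K = np.array([[1314.40, 0.0, 391.61],
              [0.0, 1312.64, 342.98],
              [0.0, 0.0, 1.0]])
stripe_lines = [np.array([0.002, -0.001, 1.0]), np.array([0.0015, -0.0008, 1.0])]
target_vlines = [np.array([0.0005, 0.0009, -1.0]), np.array([0.0004, 0.001, -1.0])]
vps = [vanish_point_of_stripe(s, t) for s, t in zip(stripe_lines, target_vlines)]
print(light_plane_normal(vps, K))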

B. Acquisition of the parameter dc

The normal vector nt of the planar target plane at any position can be obtained in the same way as the normal vector n of the light plane. Suppose the equation of the planar target plane is

at xc + bt yc + ct zc + dt = 0    (13)

where nt = [at, bt, ct]^T and (xc, yc, zc) is a point on the target plane. Next we determine dt. Note that the length of AB on the target is known. Let the coordinates of points A, B and of their images A', B' be A(xcA, ycA, zcA), B(xcB, ycB, zcB), A'(XdA', YdA'), B'(XdB', YdB') respectively. Since the image coordinates A'(XdA', YdA'), B'(XdB', YdB') and the normal nt = [at, bt, ct]^T are already known, by (2), (3) and (13) the space points A(xcA, ycA, zcA) and B(xcB, ycB, zcB) can be expressed in terms of the single unknown parameter dt, and the following equation obviously holds:

||A - B|| = L    (14)

Since L is precisely known, dt can be determined. Thus, for a feature point P0(xcp, ycp, zcp) on the light stripe lying on the planar target, with corresponding image coordinates p'0(Xdp', Ydp'), P0 can be recovered from p'0 based on (2), (3) and (13) once dt is known. Substituting P0 for P in (1) then yields dc.
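The following sketch mirrors Section III.B under the same assumptions as the previous snippets: two points A', B' with known 3D separation fix dt of the target plane via Eq. (14), and one stripe point lifted onto that plane gives dc via Eq. (1). Coordinates are in the metric image frame of Eq. (2); the sample values are invented.

import numpy as np

def ray_of(Xd, Yd, f, k1, k2):
    """Back-project a distorted image point to a viewing ray with zc = 1 (Eqs. 2-3)."""
    r2 = Xd**2 + Yd**2
    s = 1.0 + k1 * r2 + k2 * r2**2
    return np.array([Xd * s / f, Yd * s / f, 1.0])

def solve_dt(ray_a, ray_b, nt, length_ab):
    """Eq. (14): ||A - B|| = L with A = -dt*ray_a/(nt.ray_a), B = -dt*ray_b/(nt.ray_b);
    the magnitude of dt follows, and its sign keeps the target in front of the camera."""
    diff = ray_a / (nt @ ray_a) - ray_b / (nt @ ray_b)
    dt = length_ab / np.linalg.norm(diff)
    if -dt / (nt @ ray_a) < 0:            # zc of A must be positive
        dt = -dt
    return dt

def solve_dc(ray_p0, nt, dt, n):
    """Lift the stripe point P0 onto the target plane, then dc from Eq. (1): n.P0 + dc = 0."""
    z0 = -dt / (nt @ ray_p0)
    P0 = z0 * ray_p0
    return -(n @ P0)

# Hypothetical usage: two target points 147 mm apart, plus one stripe point
nt = np.array([0.05, -0.70, 0.71]); n = np.array([0.10, -0.80, 0.59])
f, k1, k2 = 12.0, -0.194, 0.114
dt = solve_dt(ray_of(2.1, -0.9, f, k1, k2), ray_of(-1.4, 1.1, f, k1, k2), nt, 147.0)
dc = solve_dc(ray_of(0.3, 0.2, f, k1, k2), nt, dt, n)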

IV. SIMULATIONS AND REAL EXPERIMENTS

A. Computer simulations
The camera focal length f is about 12 mm and the image resolution is 768×576. The camera working distance is about 400-500 mm, and the view field is about 300 mm×200 mm. The planar target is 147 mm×147 mm. Gaussian noise with zero mean and standard deviation of 0.01-1 pixel is added to both the target edge points and the light stripe points. At each noise level the noise is added 100 times, and the results are shown in Table I: the second column gives the angular deviation of the calibrated normal vector n from the true direction, the third column the error of the parameter dc, and the fourth column the overall distance measurement error of the vision system when using the calibrated parameters, the true value of the measured length being 147 mm.
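A Monte-Carlo loop of the kind used to produce Table I can be sketched as below; make_observations and calibrate stand for a hypothetical synthetic-data generator and an implementation of Section III, so this only illustrates how the RMS statistics at one noise level would be gathered.

import numpy as np

def rms(values):
    return float(np.sqrt(np.mean(np.square(values))))

def simulate_noise_level(sigma, true_n, true_dc, make_observations, calibrate, trials=100):
    """Add zero-mean Gaussian noise of std 'sigma' (pixels) to the synthetic image
    points, recalibrate, and collect RMS errors of the normal (deg) and of dc (mm)."""
    ang_errors, dc_errors = [], []
    for _ in range(trials):
        clean_points = make_observations()                    # exact edge / stripe image points
        noisy_points = [p + np.random.normal(0.0, sigma, p.shape) for p in clean_points]
        n_est, dc_est = calibrate(noisy_points)               # hypothetical Sec. III routine
        cos_angle = abs(n_est @ true_n) / (np.linalg.norm(n_est) * np.linalg.norm(true_n))
        ang_errors.append(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))
        dc_errors.append(abs(dc_est - true_dc))
    return rms(ang_errors), rms(dc_errors)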


TABLE I. PARAMETER ERRORS VS. THE NOISE LEVEL

Noise level (pixel) | n RMS error (º) | dc RMS error (mm) | Measurement RMS error (mm)
0.05 | 0.012 | 0.042 | 0.102
0.08 | 0.016 | 0.045 | 0.148
0.10 | 0.020 | 0.058 | 0.187
0.15 | 0.033 | 0.104 | 0.276
0.20 | 0.041 | 0.131 | 0.361
0.25 | 0.060 | 0.177 | 0.510
0.30 | 0.063 | 0.229 | 0.577
0.35 | 0.070 | 0.242 | 0.598
0.40 | 0.119 | 0.276 | 0.833
0.45 | 0.122 | 0.333 | 0.986
0.50 | 0.127 | 0.366 | 1.008

The second and third columns of Table I show that the bias of the normal vector n and of the parameter dc of the light plane increases with the noise level. In real applications an image-processing accuracy of 0.1 pixel is usually not difficult to reach, and at this noise level the biases of n and dc are 0.020º and 0.058 mm respectively, which are relatively small. The fourth column shows that at a noise level of 0.1 pixel the distance measurement error is 0.187 mm, which meets the general accuracy demands of real applications. Fig. 4 further plots the data in the fourth column.

Figure 4. Measurement error vs. image noise level

Understandably, if a group of parallel lines on the planar target contains more than the two lines used in this paper, the calibration accuracy should improve. In practice, however, more than two imaged parallel lines will not intersect exactly at one vanish point, so an optimization method must be used [9], for example as sketched below.
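One common least-squares choice (a sketch, not necessarily the exact optimization used in [9]) is to estimate the vanish point of a pencil of nearly parallel imaged lines as the homogeneous point minimizing the sum of squared incidences:

import numpy as np

def vanish_point_lsq(lines):
    """Least-squares vanish point of several imaged parallel lines: minimize
    sum_i (l_i^T v)^2 over unit homogeneous v, i.e. take the smallest right
    singular vector of the stacked line coefficients."""
    A = np.asarray(lines, dtype=float)        # each row: homogeneous line (a, b, c)
    _, _, Vt = np.linalg.svd(A)
    v = Vt[-1]
    return v / v[2] if abs(v[2]) > 1e-12 else v   # pixel form unless the point is at infinity

# Hypothetical example: three noisy images of parallel target lines
lines = [np.array([0.70, -0.70, 10.0]),
         np.array([0.69, -0.72, 12.0]),
         np.array([0.71, -0.69, 11.0])]
print(vanish_point_lsq(lines))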

B. Real experiments
Experiment conditions: the camera is a Mintron 468P with a resolution of 768 pixel×576 pixel; the view field is about 300 mm×200 mm; the distance between the camera and the planar rectangular target is within 400-500 mm. The intrinsic parameters of the camera are listed in Table II.

TABLE II. INTRINSIC PARAMETERS OF THE CAMERA

αx (pixel) | αy (pixel) | u0 (pixel) | v0 (pixel) | k1 | k2
1314.40 | 1312.64 | 391.61 | 342.98 | -0.194 | 0.114

The real planar rectangular metal target is 150.5 mm×140 mm with a machining accuracy of 0.1 mm, and is shown in Fig. 5 with a light stripe on it. In Fig. 5 the light stripe intersects two edges of the target at two points (called edge feature points here), and the distance between the two edge feature points is used to assess the measurement accuracy of the calibrated LSLVS.

Figure 5. Images of the planar rectangle target with light stripe.

Step 1: Obtain the true value of the distance. In Fig. 5, the edge containing an intersection with the light stripe has three obvious feature points: the intersection point (i.e. the edge feature point) and the two end points. The two image lines of the parallel edges determine a vanish point, which corresponds to the point at infinity in 3D space. Based on the invariance of the cross ratio involving the point at infinity, the true 3D coordinates of the two edge feature points can be obtained [6], and the distance between them is taken as the true value (see the cross-ratio sketch after Step 2).
Step 2: Accuracy assessment. The planar rectangular target is placed at 10 random positions, giving the 10 distances listed in Table III (dTru stands for the true value and dMea for the measured value). Compared with the simulation results in the fourth column of Table I, the measurement error of the real experiment matches the simulated error for image noise between 0.08 and 0.1 pixel; in practice, image-processing accuracy can easily reach 0.1 pixel.
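A small sketch of the cross-ratio computation in Step 1, assuming the two edge end points, the edge feature point, and the vanish point of that edge direction have been extracted as (approximately) collinear image points; all sample values are hypothetical:

import numpy as np

def cross_ratio(a, b, c, d):
    """Cross ratio {a, b; c, d} of four collinear image points, computed from
    signed 1-D coordinates along their common line."""
    direction = (b - a) / np.linalg.norm(b - a)
    t = lambda p: float((p - a) @ direction)
    ta, tb, tc, td = t(a), t(b), t(c), t(d)
    return ((tc - ta) * (td - tb)) / ((tc - tb) * (td - ta))

def edge_feature_position(p1_img, p2_img, q_img, v_img, edge_length):
    """Distance from end point P1 to the edge feature point Q on the target edge.
    Since the vanish point v images the point at infinity of the edge, the image
    cross ratio {p1, p2; q, v} equals (Q-P1)/(Q-P2) in signed edge coordinates."""
    k = cross_ratio(p1_img, p2_img, q_img, v_img)
    return k * edge_length / (k - 1.0)

# Hypothetical pixel coordinates of the two end points, the stripe intersection,
# and the vanish point of the edge direction:
p1, p2, q, v = map(np.array, ([100.0, 80.0], [520.0, 95.0], [260.0, 86.0], [2400.0, 160.0]))
print(edge_feature_position(p1, p2, q, v, 140.0))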

TABLE III. EXPERIMENT RESULTS ON DISTANCE

No. | dTru (mm) | dMea (mm) | err (mm)
1 | 155.300 | 155.337 | 0.037
2 | 140.154 | 140.365 | 0.211
3 | 141.142 | 140.948 | -0.194
4 | 145.447 | 145.479 | 0.032
5 | 174.916 | 175.053 | 0.137
6 | 146.921 | 147.160 | 0.239
7 | 150.211 | 150.327 | 0.116
8 | 144.541 | 144.563 | 0.022
9 | 146.076 | 146.192 | 0.116
10 | 164.188 | 164.081 | -0.107
RMS (mm): 0.141

In fact, since the target used in the experiment has a machining accuracy of only 0.1 mm and offers only two groups of two parallel lines each, the achieved accuracy of 0.141 mm is relatively high. If the target machining accuracy were improved and more parallel lines in more directions were available on the target, the RMS error would naturally decrease. For comparison, the measurement accuracy reported in [8] is 0.234 mm in distance RMS error.

V. CONCLUSIONS

The planar-target calibration method proposed in this paper is of practical use in 3D measurement. It uses a planar target moved randomly, without knowledge of its precise position and orientation, and relies on two proved properties of vanish points and lines to obtain the vanish points of the light stripes. It avoids the need for auxiliary facilities and for a special 3D channeled target, which improves calibration efficiency and convenience, and it also avoids the unification of the coordinates of the calibration feature points.

ACKNOWLEDGMENT

This work is supported by the National Natural Science Foundation of China (50875014) and the Program for New Century Excellent Talents in University of the China Education Ministry (NCET-07-0043).

REFERENCES

[1] Dewar R, "Self-generated targets for spatial calibration of structured light optical sectioning sensors with respect to an external coordinate system," Proc. of the Robots and Vision'88 Conference, Detroit, 1988, pp. 5-13.
[2] Chen C H, Kak A C, "Modeling and calibration of a structured light scanner for 3D robot vision," Proc. IEEE Conf. Robotics and Automation, 1987, pp. 807-815.
[3] Duan F, Liu F, Ye S, "A new accurate method for the calibration of line structured light sensor," Chinese J. Sci. Instr., 21(1), 2000, pp. 108-109.
[4] F. Zhou, G. Zhang, "Field calibration method for line structured light vision sensor," Chinese Journal of Mechanical Engineering, 40(6), 2004, pp. 169-173.
[5] Janne Heikkila, "Geometric camera calibration using circular control points," IEEE Trans. Pattern Anal. Mach. Intell., 22(10), 2000, pp. 1066-1073.
[6] Mei X.M., Liu Z.X., Higher Geometry, Higher Education Press, 2008, pp. 46-48.
[7] Xiao Hai, Luo Ming, "A line structured light 3D visual sensor calibration by vanishing point method," Opto-Electronic Engineering, 23(3), 1996, pp. 53-58.
[8] Yang Pei, Wu Ling, "A rapid calibration method for laser light vision sensor," Laser Journal, 27(4), 2006, pp. 35-36.
[9] Richard Hartley, Andrew Zisserman, Multiple View Geometry in Computer Vision, Cambridge University Press, 2000.