Multi-sensor Robot Navigation System


Multi-sensor Robot Navigation System

Stelian Persa*, Pieter Jonker
Pattern Recognition Group, Technical University Delft
Lorentzweg 1, 2628 CJ Delft, The Netherlands

ABSTRACT

Almost all robot navigation systems work indoors. Outdoor robot navigation systems offer the potential for new application areas. The biggest single obstacle to building effective robot navigation systems is the lack of accurate wide-area sensors for trackers that report the locations and orientations of objects in an environment. Active (sensor-emitter) tracking technologies require powered-device installation, limiting their use to prepared areas that are relatively free of natural or man-made interference sources. Our hybrid tracker combines rate gyros and accelerometers with a compass and tilt orientation sensor and a DGPS system. Sensor distortions, delays and drift require compensation to achieve good results. The measurements from the sensors are fused together to compensate for each other's limitations. Analysis and experimental results demonstrate the system's effectiveness. The paper presents a field experiment for a low-cost strapdown IMU (Inertial Measurement Unit)/DGPS combination, with data processing for the determination of the 2-D components of position (trajectory), velocity and heading. In the present approach we have neglected Earth rotation and gravity variations, because of the poor gyroscope sensitivities of our low-cost ISA (Inertial Sensor Assembly) and because of the relatively small area of the trajectory. The scope of this experiment was to test the feasibility of an integrated DGPS/IMU system of this type and to develop a field evaluation procedure for such a combination.

Keywords: outdoor navigation system, hybrid tracker system, sensor information fusion

1. INTRODUCTION

This paper presents a field experiment for a low-cost strapdown IMU (Inertial Measurement Unit)/GPS combination, with data processing for the determination of the 2-D components of position (trajectory), velocity and heading. In the present approach we have neglected Earth rotation and gravity variations, because of the poor gyroscope sensitivities of our low-cost ISA (Inertial Sensor Assembly) and because of the relatively small area of the trajectory. The scope of this experiment was to test the feasibility of an integrated GPS/IMU system of this type and to develop a field evaluation procedure for such a combination.

A fundamental requirement for an autonomous robot is its ability to localize itself with respect to its environment. Navigation on flat and horizontal ground only requires estimation of position and heading. However, in many cases the environment is not so well structured, and the angular orientation of the robot may change along its path. In this case, a real-time estimation of the vehicle's attitude is necessary. Some specific tasks, like those executed by robots for road construction, also require controlling the attitude of a tool. If the robot moves on an inclined plane, the orientation can be used to correct the position and heading information provided by odometry, or to interpret external sensor data (vision, laser range, ...) in order to build an accurate model of the environment.

With a plethora of different graphics applications that depend on motion-tracking technology for their existence, a wide range of interesting motion-tracking solutions have been invented. Surveys of magnetic, optical, acoustic, and mechanical tracking systems are available [1,2]. Examples of such systems use magnetic, optical, radio, and acoustic signals. Passive-target systems use ambient or naturally occurring signals.

* Correspondence: Stelian Persa. Other author information: S.P.: Email: [email protected]; WWW: http://www.ph.tn.tudelft.nl/~stelian; Telephone: +31 15 278 3727. P.J.: Email: [email protected].

Examples include compasses sensing the Earth's field and vision systems sensing intentionally placed fiducials (e.g., circles, squares) or natural features. Inertial systems are completely self-contained, sensing physical phenomena created by linear acceleration and angular motion. Many mobile robotic applications require only motion over a small region, and these traditional tracking approaches are usable, although there are still difficulties with interference, line-of-sight, jitter and latency [3,4].

The system presented by Behringer [5] can achieve a high output rate due to its simple image processing of silhouette matching, but it is not operable in an urban environment. The image processing system described by Harris [6] can track simple models such as a model Hornet aircraft, cylindrical objects and patches on the ground, but will fail in an urban environment, where multiple objects with complex wire-frame models are present in the image. Schmid and Zisserman [7] best present the problem of line matching, but unfortunately their approach uses stereo vision and is very computation-intensive.

Newly available gyroscopes and inertial systems represent a better solution for the tracking system. However, a drift-corrected inertial tracking system is only able to track 3-DOF orientation. Accelerometers and gyroscopes are very fast and accurate, but due to drift they have to be reset regularly, on the order of once per second, depending on the drift of the gyroscopes and accelerometers. To correct positional drift in a 6-DOF inertial tracking system, some type of range or bearing measurement to fiducial points in the environment is required.

Each tracking approach has limitations. Noise, calibration error, and the gravity field impart errors on these signals, producing accumulated position and orientation drift. Position requires double integration of linear acceleration, so the accumulated position drift grows as the square of elapsed time. Orientation only requires a single integration of rotation rate, so the drift accumulates linearly with elapsed time. Hybrid systems attempt to compensate for the shortcomings of each technology by using multiple measurements to produce robust results.

The paper is organized as follows. Section 2 describes our approach. Section 3 presents the system overview, sensor calibration, and sensor fusion and filtering. The results and conclusions are presented in Section 4.

2. APPROACH

Outdoor mobile robotic applications have rarely been attempted because building an effective outdoor mobile robotic system is much more difficult than building an indoor one. First, fewer resources are available outdoors: computation, sensors and power are limited to what a robot can reasonably carry. Second, we have little control over the environment outdoors. In an indoor system, one can carefully control the lighting conditions, select the objects in view, add strategically located fiducials to make the tracking easier, and so on. Modifying outdoor locations to that degree is unrealistic, so many existing mobile robotic tracking strategies are invalid outdoors. Finally, the range of operating conditions is greater outdoors.

No single tracking technology has the performance required to meet the stringent needs of outdoor mobile robotics. However, appropriately combining multiple sensors may lead to a viable solution faster than waiting for any single technology to solve the entire problem. The system described in this paper is our first step in this process. To simplify the problem, we assume the real-world objects are distant (e.g., 50+ meters), which allows the use of GPS for position tracking. We then focus on the largest remaining sources of registration error (misalignments between virtual and real): the dynamic errors caused by lag in the system and distortion in the sensors. Compensating for those errors means stabilizing the position and orientation against robot motion. We do this with a hybrid tracker, combining rate gyros with a compass and tilt orientation sensor.

The inertial data are processed in a strapdown mechanization [8,9,10], based on the following expression for a one-component specific force in a body reference system (see Figure 1, which illustrates the forces acting upon the seismic mass of the accelerometer):

Figure 1. Specific force along the x-axis of a reference system firmly attached to the moving body, as a function of the linear acceleration $a_x^b$, the apparent centripetal acceleration $a_{cf\_x}^b$ and the corresponding axial component of the static gravitational acceleration $g_x^b$ (the superscript b denotes vector components in the body reference system):

$$f_{m\_x} = a_x^b + a_{cf\_x}^b - g_x^b \tag{1}$$

The corresponding vectorial form (with the specific force vector now denoted by $a$, and the correction terms of centripetal and gravitational acceleration expressed in the body coordinate system) is:

$$a^b = a - \omega \times v^b + C_n^b\, g^n \tag{2}$$

with: $\omega$ the angular velocity vector, $v^b$ the velocity vector given in the body coordinate system b, and $C_n^b$ the rotation matrix from the local coordinate system n to the body coordinate system b. Roll-pitch-yaw angles ($\phi, \psi, \theta$) can be used to represent the attitude and heading of the robot. If the direction cosine matrix C, defining the attitude and heading of the robot, is given, the roll-pitch-yaw angles can be extracted as follows:

$$C = \begin{bmatrix} s_x & n_x & a_x \\ s_y & n_y & a_y \\ s_z & n_z & a_z \end{bmatrix},$$

$$\theta = \arctan\left(\frac{s_y}{s_x}\right) \pm k\pi, \qquad \psi = \arctan\left(\frac{-s_z}{\cos(\theta)\,s_x + \sin(\theta)\,s_y}\right), \qquad \phi = \arctan\left(\frac{\sin(\theta)\,a_x - \cos(\theta)\,a_y}{-\sin(\theta)\,n_x + \cos(\theta)\,n_y}\right). \tag{3}$$

The attitude can also be determined using gyrometric measurements. This method additionally allows estimation of the heading (yaw), which is not possible with the accelerometers or inclinometer. In this case, a differential equation relating the attitude to the instantaneous angular velocity has to be integrated. Roll, pitch and yaw angles are used as the output of the system to define the robot's attitude and heading, because they have a direct physical interpretation, but this representation is not used in the differential equation itself. We chose quaternions instead, mainly because they never lead to singularities. Using quaternions, the differential equation to be solved takes the form:

$$\dot{Q} = \frac{1}{2}\, Q\, \Omega, \quad \text{or} \quad \begin{bmatrix} \dot{Q}_0 \\ \dot{Q}_1 \\ \dot{Q}_2 \\ \dot{Q}_3 \end{bmatrix} = \frac{1}{2} \begin{bmatrix} 0 & -p & -q & -r \\ p & 0 & r & -q \\ q & -r & 0 & p \\ r & q & -p & 0 \end{bmatrix} \begin{bmatrix} Q_0 \\ Q_1 \\ Q_2 \\ Q_3 \end{bmatrix} \tag{4}$$

where $Q = Q_0 + Q_1 i + Q_2 j + Q_3 k$ is the quaternion associated with the attitude of the robot, and $\Omega = [p\; q\; r]^T$ is its instantaneous angular velocity. A numerical integration method must be used to solve this equation. We use the fourth-order Runge-Kutta integration algorithm, which performed best when compared with the rectangular and trapezoidal methods (a numerical sketch is given after Eq. (5)). The direction cosine matrix can be expressed in terms of the quaternion components by:

$$C = \begin{bmatrix} Q_0^2 + Q_1^2 - Q_2^2 - Q_3^2 & 2(Q_1 Q_2 - Q_0 Q_3) & 2(Q_1 Q_3 + Q_0 Q_2) \\ 2(Q_1 Q_2 + Q_0 Q_3) & Q_0^2 - Q_1^2 + Q_2^2 - Q_3^2 & 2(Q_2 Q_3 - Q_0 Q_1) \\ 2(Q_1 Q_3 - Q_0 Q_2) & 2(Q_0 Q_1 + Q_2 Q_3) & Q_0^2 - Q_1^2 - Q_2^2 + Q_3^2 \end{bmatrix} \tag{5}$$
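The following sketch (ours, not the authors' implementation; plain NumPy assumed) integrates Eq. (4) with the fourth-order Runge-Kutta scheme mentioned above and builds the direction cosine matrix of Eq. (5):

```python
import numpy as np

def omega_matrix(w):
    """4x4 skew matrix of Eq. (4) for angular velocity w = [p, q, r]."""
    p, q, r = w
    return np.array([[0.0, -p, -q, -r],
                     [p, 0.0, r, -q],
                     [q, -r, 0.0, p],
                     [r, q, -p, 0.0]])

def rk4_quaternion_step(Q, w, dt):
    """One fourth-order Runge-Kutta step of dQ/dt = 0.5 * Omega(w) * Q,
    assuming w is constant over the step; the result is renormalized."""
    f = lambda q: 0.5 * omega_matrix(w) @ q
    k1 = f(Q)
    k2 = f(Q + 0.5 * dt * k1)
    k3 = f(Q + 0.5 * dt * k2)
    k4 = f(Q + dt * k3)
    Q = Q + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
    return Q / np.linalg.norm(Q)

def dcm_from_quaternion(Q):
    """Direction cosine matrix of Eq. (5)."""
    q0, q1, q2, q3 = Q
    return np.array([
        [q0*q0 + q1*q1 - q2*q2 - q3*q3, 2*(q1*q2 - q0*q3), 2*(q1*q3 + q0*q2)],
        [2*(q1*q2 + q0*q3), q0*q0 - q1*q1 + q2*q2 - q3*q3, 2*(q2*q3 - q0*q1)],
        [2*(q1*q3 - q0*q2), 2*(q0*q1 + q2*q3), q0*q0 - q1*q1 - q2*q2 + q3*q3]])
```

Renormalizing after every step keeps the quaternion on the unit sphere; this, together with the absence of gimbal-lock singularities, is the practical advantage of the quaternion form over direct Euler-angle integration.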

The flow-chart of the strapdown navigation algorithm implementing the equations above is presented in Figure 2, and the result of running this system on simulation data is shown in Figure 5.

Figure 2. Flow-chart of the strapdown mechanization. (The diagram shows: accelerometer and gyroscope signal correction for scale factor, bias, drift, temperature and non-orthogonality; centrifugal-force and gravity corrections; rate integration producing the attitude and the rotation matrix $C_b^n$; acceleration integration to body velocity and then to position in the navigation frame; with DGPS position information and the inclinometer/magnetometer as aiding inputs.)

We neglected the g-variations and the Earth rotation rate because of the small dimensions of the test area, the relatively low, walking-speed velocities involved (about 1 m/s), and the limited rate sensitivity of the gyroscopes used. We also neglected the small Coriolis force acting on the moving mass as a consequence of the rotation of the inertial sensor case.
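For concreteness, one cycle of the mechanization of Figure 2, with these simplifications, can be sketched as follows (our illustrative code, reusing rk4_quaternion_step() and dcm_from_quaternion() from the sketch after Eq. (5); the calibration terms, gravity value and z-up sign convention are placeholder assumptions, not calibrated values from the paper):

```python
import numpy as np

G_N = np.array([0.0, 0.0, -9.81])  # gravity in the navigation frame
                                   # (assumed z-up convention, placeholder value)

def mechanization_step(state, accel_raw, gyro_raw, dt):
    """One cycle of the strapdown loop of Figure 2.

    state: dict with quaternion 'Q', body velocity 'v_b', navigation-frame
    position 's_n', and the sensor calibration terms.
    """
    # 1. sensor corrections (scale factor and bias only; the drift,
    #    temperature and non-orthogonality terms of Figure 2 are omitted)
    a_b = (accel_raw - state['accel_bias']) * state['accel_scale']
    w_b = (gyro_raw - state['gyro_bias']) * state['gyro_scale']

    # 2. attitude integration, Eq. (4)
    state['Q'] = rk4_quaternion_step(state['Q'], w_b, dt)
    C_bn = dcm_from_quaternion(state['Q'])        # body -> navigation

    # 3. centripetal and gravity corrections, Eq. (2)
    a_lin = a_b - np.cross(w_b, state['v_b']) + C_bn.T @ G_N

    # 4. velocity and position integration (rectangular rule for brevity;
    #    a DGPS fix would periodically reset s_n)
    state['v_b'] = state['v_b'] + a_lin * dt
    state['s_n'] = state['s_n'] + (C_bn @ state['v_b']) * dt
    return state
```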

3. SYSTEM

3.1 Overview

Figure 3 shows the system dataflow. Three sets of sensors are used: a Garmin GPS 25 LP receiver combined with an RDS OEM4000 system to form a DGPS unit, a Precision Navigation TCM2 compass and tilt sensor, and three rate gyroscopes (Murata) with three accelerometers (ADXL202) combined on one board, linked directly to the LART platform [11]. The LART platform contains a fast 8-channel 16-bit A/D converter to acquire synchronous data from the accelerometers, the gyros and, in the future, temperature sensors. Real-time temperature information can be useful to compensate for the temperature-sensitive drift components of the sensors.


Figure 3. System dataflow

The Garmin GPS provides outputs at 1 Hz, with a typical error of 10 meters, and 2-3 meters in the DGPS configuration. The TCM2 updates at 16 Hz and claims ±0.5 degrees of error in yaw. The gyros and accelerometers are analog devices, sampled at 100 Hz or higher by the A/D converter on the LART board. The other two sensors are read via serial lines. The LART board, developed at Delft University and based on an Intel StrongARM processor, reads all the sensors. The LART mobile low-power board is shown in Figure 4.
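Because the three sensor streams arrive at different rates (1 Hz, 16 Hz, 100 Hz) over different channels, each readout must carry a time tag before fusion (see the latency calibration in Section 3.2). A minimal sketch of such tagging with pyserial follows; the device nodes, baud rates and generic line protocol are our assumptions, not the actual TCM2/GPS message formats:

```python
import time
import serial  # pyserial

def read_tagged(port):
    """Read one ASCII line from a serial sensor and attach a host time tag;
    parsing of the actual TCM2 / GPS sentences is omitted."""
    line = port.readline().decode('ascii', errors='replace').strip()
    return time.time(), line

# hypothetical device nodes and baud rates for the two serial sensors
tcm2 = serial.Serial('/dev/ttyS0', 9600, timeout=0.1)
gps = serial.Serial('/dev/ttyS1', 4800, timeout=0.1)

t_compass, compass_sentence = read_tagged(tcm2)
t_gps, gps_sentence = read_tagged(gps)
```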

Figure 4. The LART board

Section 3.2 describes the sensor distortions and the calibration required. The DGPS sensor directly provides the position, but the other two sensor outputs are fused together to determine the orientation, as described in Section 3.3.

The robot location computed by the LART board is then passed to the robot to be used for navigation. The software that reads the data from the serial ports and fuses it is a near real-time set of threads and processes running under Linux.

3.2 Sensor Calibration

Compass Calibration: We found that the TCM2 had significant distortions in the heading output provided by the compass, requiring a substantial calibration effort. Besides the constant magnetic declination, the compass is affected by local distortions of the Earth's magnetic field. With a non-ferrous mechanical turntable it is possible to measure these errors. The distortions can have peak-to-peak values of about 2 degrees. Unfortunately, it is difficult to build a working mobile robot that does not place some sources of magnetic distortion in the general vicinity of the compass. In the real system, compass errors can have peak-to-peak values of 5 degrees [12]. Fortunately, the TCM2 has an internal calibration procedure, which can take into account a static distortion of the magnetic field. For dynamic distortions the TCM2 provides an alarm signal, which is active when such an error occurs; we can then ignore the compass measurement and rely only on the gyros.

Gyroscope Calibration: We measured the bias of each gyroscope by averaging several minutes of output while the gyros were kept still. For scale, we used the values specified in the manufacturer's test sheets for each gyro. Using the calibration data for the inertial sensor assembly (bias, linear scale factors, non-orthogonality of the gyroscope triad) delivered by the manufacturer, together with the supplementary calibration measurements made in our laboratory, the error model of the inertial sensors was validated. The most important measurements are: the evaluation of the noise behavior of the inertial data sets; static gyro calibrations, to determine the supplementary non-linear terms of the static transfer characteristics (considered only up to degree 2); and the establishment of the non-linear time and temperature behavior of the gyros' drift and scale factors and of the non-orthogonality of the gyro triad.

Sensor Latency Calibration: The gyro outputs change quickly in response to robot motion, and they are sampled at 100 Hz. In contrast, the TCM2 responds slowly and is read at 16 Hz over a serial line. Therefore, when TCM2 and gyro inputs are read simultaneously, there is some unknown difference between the times of the physical events they each represent. It is possible to determine the relative latency by integrating the gyro outputs and comparing them with the compass readouts, shifting one data stream in time until they best match. We take the relative latency into account by attaching a time tag to each sensor readout; these tags are then used in the fusion step.

3.3 Sensor Fusion and Filtering

The goal of sensor fusion is to estimate the angular position and rotation rate of the head from the input of the TCM2 and the three gyroscopes. This position is then extrapolated one frame into the future to estimate the head orientation at the time the image is shown on the see-through display. When the robot is not moving, we estimate roll and pitch from inclinometer and accelerometer measurements. We use the redundant information from the accelerometers in order to obtain better precision. Roll and pitch are computed from the components of gravity in the body frame, which are directly measured by the accelerometers. The expressions for the attitude angles as a function of gravity in the body frame are:

$$\psi = -\arcsin\left(\frac{g_x}{g}\right), \qquad \phi = \arcsin\left(\frac{g_y}{g\,\cos(\psi)}\right) \quad \text{or} \quad \phi = \arccos\left(\frac{g_z}{g\,\cos(\psi)}\right). \tag{6}$$
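A direct transcription of Eq. (6) (our sketch; it assumes the accelerometer triad measures only gravity while the robot is stationary, and follows the paper's symbols, $\psi$ for pitch and $\phi$ for roll):

```python
import numpy as np

def static_attitude(g_body):
    """Pitch (psi) and roll (phi) from the gravity vector measured in the
    body frame, Eq. (6); valid only while the robot is not accelerating."""
    gx, gy, gz = g_body
    g = np.linalg.norm(g_body)
    psi = -np.arcsin(gx / g)                  # pitch
    phi = np.arcsin(gy / (g * np.cos(psi)))   # roll (arcsin variant)
    return psi, phi
```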

To predict the head orientation one frame into the future, we use a linear motion model: we simply add to the current orientation the offset implied by the estimated rotational velocity. This is done by converting the orientation (the first three terms of the state vector x) to a quaternion and using quaternion multiplication to combine them. We will incorporate more sophisticated predictors in the future.
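The one-frame-ahead predictor can be sketched as follows (our illustration of the constant-rate model described above, not the authors' code; the Hamilton-product order dQ * Q is an assumed frame convention):

```python
import numpy as np

def predict_orientation(Q, w, dt):
    """Predict the orientation one frame ahead: convert the rotation w*dt
    to a small quaternion increment dQ and combine it with Q by
    quaternion multiplication."""
    rate = np.linalg.norm(w)
    angle = rate * dt
    if angle < 1e-12:
        return Q
    axis = w / rate
    dQ = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))
    # Hamilton product dQ * Q
    q0, q1, q2, q3 = Q
    p0, p1, p2, p3 = dQ
    return np.array([
        p0*q0 - p1*q1 - p2*q2 - p3*q3,
        p0*q1 + p1*q0 + p2*q3 - p3*q2,
        p0*q2 - p1*q3 + p2*q0 + p3*q1,
        p0*q3 + p1*q2 - p2*q1 + p3*q0])
```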

4. RESULTS AND CONCLUSIONS

For moderate head rotation rates (under ~100 degrees per second) the largest registration errors we usually observed were ~2 degrees, with average errors being much smaller. The biggest problem was the heading output of the compass sensor drifting with time. The output would drift by as much as 5 degrees over a few hours, requiring occasional recalibration to keep the registration errors under control. The magnetic environment could also influence the compass error; for short periods we can compensate for this by using only the gyro readings.

This paper presented preliminary results from a GPS-aided integrated trajectory solution for a low-cost strapdown-mechanized IMU. The precise DGPS reference trajectory will enable the elaboration of a post-processing field evaluation methodology for the low-cost strapdown IMU. The obtained results encourage more comprehensive investigations: drift modeling of the inertial sensors in the alignment procedure, and calibration of the inertial sensor error sources. Because we were primarily interested in establishing the feasibility of the integrated system, we have not modeled the actual inertial sensors very extensively. We intend to extend our analysis in order to achieve higher precision of the integrated solutions by using an extended Kalman filter (EKF). Accelerometer biases, gyroscope drifts and inertial sensor scale-factor errors could be included, together with appropriate stochastic models, in order to better compensate for the systematic sensor errors. Furthermore, an increase of the inertial data acquisition rate would permit a better approximation of the non-linear dynamic model by a linear one. Finally, for a complete dynamic model one could consider the g-variations and the influence of the Earth rotation, which would enable the application of this analysis to more accurate IMUs as well.
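As an illustration of the intended state augmentation (our notation; the paper does not specify the EKF design), the filter state could collect the navigation states and the dominant sensor-error terms, with each bias modeled, for example, as a first-order Gauss-Markov process:

$$x = \begin{bmatrix} p^n & v^n & Q & b_a & b_g & s_a & s_g \end{bmatrix}^T, \qquad \dot{b} = -\frac{1}{\tau}\, b + w,$$

where $p^n$ and $v^n$ are position and velocity in the navigation frame, $Q$ is the attitude quaternion, $b_a$ and $b_g$ are the accelerometer and gyro biases, $s_a$ and $s_g$ the scale-factor errors, $\tau$ a correlation time and $w$ white noise.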

ACKNOWLEDGMENTS

This work was done in the framework of the Ubiquitous Communications Program (www.ubicom.tudelft.nl).

REFERENCES

1. C. Youngblut, R. E. Johnson, S. H. Nash, R. A. Wienclaw, C. A. Will, "Review of Virtual Environment Interface Technology", Internal report P-3186, Institute for Defense Analyses (IDA), Alexandria, VA, 1996. Available: http://www.hitl.washington.edu/scivw/IDA/.
2. J. Borenstein, H. R. Everett, L. Feng, "Where am I? Sensors and Methods for Mobile Robot Positioning", Technical Report, University of Michigan, 1996. Available: http://www-personal.engin.umich.edu/~johannb/position.htm.
3. E. Foxlin, "Inertial Head-Tracker Sensor Fusion by a Complementary Separate-Bias Kalman Filter", Proceedings of VRAIS '96 (Santa Clara, CA, 30 March - 3 April 1996), pp. 185-194.
4. E. Foxlin, M. Harrington, G. Pfeiffer, "Constellation: A Wide-Range Wireless Motion-Tracking System for Augmented Reality and Virtual Set Applications", Proceedings of SIGGRAPH '98 (Orlando, FL, 19-24 July 1998), pp. 371-378.
5. R. Behringer, "Registration for outdoor augmented reality applications using computer vision techniques and hybrid sensors", Proc. IEEE Virtual Reality, pp. 244-251, 1999.
6. C. Harris, "Tracking with Rigid Models", in A. Blake, A. Yuille (Eds.), Active Vision, pp. 59-74, MIT Press, 1992.
7. C. Schmid, A. Zisserman, "The geometry and matching of lines and curves over multiple views", International Journal of Computer Vision, 40(3), pp. 199-234, 2000.
8. D. H. Titterton, J. L. Weston, "Strapdown Inertial Navigation Technology", IEE Books, Peter Peregrinus Ltd., UK, 1997.
9. J. A. Farrell, M. Barth, "The Global Positioning System & Inertial Navigation", McGraw-Hill, 1999.
10. R. Dorobantu, "Field Evaluation of a Low-Cost Strapdown IMU by means of GPS", Ortung und Navigation, 1/1999, DGON, Bonn.
11. J. D. Bakker, E. Mouw, M. Joosen, J. Pouwelse, "The LART Pages", Delft University of Technology, Faculty of Information Technology and Systems, 2000. Available: http://www.lart.tudelft.nl.
12. R. Azuma, B. Hoff, H. Neely III, R. Sarfaty, "A Motion-Stabilized Outdoor Augmented Reality System", Proceedings of IEEE VR '99 (Houston, TX, 13-17 March 1999), pp. 252-259.

Figure 5. Output of the strapdown mechanization system using simulation data