International Conference on Control, Automation and Systems 2010 Oct. 27-30, 2010 in KINTEX, Gyeonggi-do, Korea

A Position Tracking Experiment of Mobile Robot with Inertial Measurement Unit (IMU)

Olarn Wongwirat and Chutchai Chaiyarat

Faculty of Information Technology, King Mongkut's Institute of Technology Ladkrabang, Bangkok, Thailand
(Tel: +662-723-4900; E-mail: olarn@it.kmitl.ac.th, s2660736@kmitl.ac.th)

Abstract: Position tracking of mobile robots is becoming increasingly important. It can be used not only to track the moving path of a mobile robot in order to search for and identify its current position, but also to support remote operation control of the robot. The challenge in position tracking is to find the robot's location relative to its environment. Several tracking techniques use the x, y coordinates and heading direction to locate the position in a global coordinate system. However, these techniques lack position information in the z dimension. This paper proposes a method to track the position of a mobile robot by using an Inertial Measurement Unit (IMU), which can provide x, y, and z coordinates. Experiments were set up in this work to verify the IMU signals while tracking the position of the mobile robot. The results expose some limiting characteristics of a low-cost IMU sensor in providing accurate information. Therefore, additional methodology is required to improve the accuracy of the tracked position, as suggested in the future work.

Keywords: Localization, position tracking, mobile robot, inertial measurement unit (IMU)

1. INTRODUCTION

Localization of a mobile robot is a method to identify the robot's position while it moves or changes location in the environment. The method can track the moving path of the robot by observing the change of position over time. In navigation computing, there are several methods to determine the position and velocity of a mobile robot with respect to a known reference; the common reference is the Earth coordinate frame. Recently, various approaches to position estimation of mobile robots have been developed, providing different results depending on the environment. For instance, wheel encoders have been used widely for tracking the position of a mobile robot, but wheel slip or an uneven surface causes poor estimation from the wheel revolutions [1]. Localization based on sonar or GPS signals typically requires that the mobile robot be able to receive the signal in its environment, which is a restriction [2]. The use of inertial sensors to provide acceleration and heading for position and orientation estimation is another method applicable to the mobile robot arena. It can be found in various applications, e.g., aircraft, ships, automobiles, game consoles, and virtual reality. Basically, the method estimates the position based on a known starting point. There have been research works based on inertial navigation, with gyroscopes and accelerometers employed to estimate the position of a robot. In [3], Barshan and Durrant-Whyte integrated the acceleration twice to estimate the position, and integrated the angular velocity to determine the moving direction of the mobile robot body. However, integrating signals received from a noisy inertial sensor results in error accumulation over time, e.g., increasing drift and bias. As a consequence, the mobile robot has an accurate position only for a short period of time.

This paper proposes initial work that employs an inertial sensor for tracking the position of a mobile robot. The aim of this work is to verify the possibility of applying the inertial sensor, with its noise restriction, to estimate the tracked position of the mobile robot. The IMU (Inertial Measurement Unit) is used as the sensor in this work. It is a low-cost, small-sized sensor suitable for attaching to the body of the mobile robot. The IMU comprises three orthogonal accelerometers and three gyroscopes aligned on the three axes to estimate position, velocity, and attitude with respect to the environment.

The paper is organized as follows: Section 2 describes the background of the tracking methodology. The experimental setup to determine the tracking result of the mobile robot is presented in Section 3, including its analysis and discussion. Section 4 provides the conclusion.

2. TRACKING METHODOLOGY

This section describes the background of the tracking methodology used in this work. It includes the IMU sensor deployed on the mobile robot, the methods used to calculate the acceleration and the angular rate, and the model used for navigation.

978-89-93215-02-1 98560/10/$15 ©ICROS


2.1 Inertial measurement unit (IMU)

In this work, the 6-DOF (degree of freedom) IMU version 2.0 from SparkFun Electronics [4] is used for the experiment, as shown in Fig. 1. It comprises a three-axis accelerometer using the MMA7260Q chip from Freescale and three gyroscopes using ADXRS300 chips (±300 degrees/second) from Analog Devices. Thus, it can provide acceleration measurements in three dimensions and angular rates about the three axes, respectively. The IMU sensor is attached to the body of the mobile robot to measure the acceleration and angular rate. The output signals from the IMU sensor are converted by a 10-bit A/D (analog-to-digital) converter into raw values and transferred via Bluetooth.
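The conversion from raw 10-bit counts to physical units can be sketched as follows. This is a minimal illustration, not code from the paper: the reference voltage, zero-level offsets, and sensitivities below are assumed values for the sensitivity settings of chips like the MMA7260Q and ADXRS300, and would need calibration in practice.

```python
# Sketch of converting raw 10-bit ADC readings from the IMU into physical
# units. The zero-level offsets and sensitivities below are illustrative
# assumptions, not calibrated figures from the paper.

ADC_MAX = 1023              # 10-bit A/D converter full scale
VREF = 3.3                  # assumed ADC reference voltage (volts)

ACC_ZERO_G_V = 1.65         # assumed zero-g output voltage of the accelerometer
ACC_SENS_V_PER_G = 0.8      # assumed sensitivity, volts per g

GYRO_ZERO_RATE_V = 1.5      # assumed zero-rate output of the gyroscope
GYRO_SENS_V_PER_DPS = 0.005 # assumed sensitivity, volts per degree/second

G = 9.80665                 # standard gravitational constant, m/s^2

def counts_to_volts(raw: int) -> float:
    """Map a raw 10-bit ADC count to a voltage."""
    return raw * VREF / ADC_MAX

def accel_ms2(raw: int) -> float:
    """Raw accelerometer count -> acceleration in m/s^2."""
    return (counts_to_volts(raw) - ACC_ZERO_G_V) / ACC_SENS_V_PER_G * G

def gyro_dps(raw: int) -> float:
    """Raw gyroscope count -> angular rate in degrees/second."""
    return (counts_to_volts(raw) - GYRO_ZERO_RATE_V) / GYRO_SENS_V_PER_DPS
```

With these assumed constants, a mid-scale count maps to near-zero acceleration, and deviations from the zero-level voltage scale linearly into g's and degrees per second.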

Fig. 1 Sparkfun 6-DOF IMU ver. 2.0.

2.2 Instantaneous acceleration

The constant acceleration is the mean rate of velocity change over a short time interval. The change of velocity can be derived from the time integral of the acceleration a, as in (1).

v_2 - v_1 = \int_{v_1}^{v_2} dv = \int_{t_1}^{t_2} a \, dt        (1)

The total displacement Δx during the small time interval is given by (2). It describes the change of position x as the time integral of the velocity.

x_2 - x_1 = \int_{x_1}^{x_2} dx = \int_{t_1}^{t_2} v \, dt        (2)

The velocity and position can be calculated from the initial time 0 to t. If x_0 and v_0 are the initial position and velocity at time 0, respectively, the velocity and position can be combined as in (3).

v = v_0 + \int_0^t a \, dt = v_0 + a t
x = x_0 + \int_0^t v \, dt = x_0 + v_0 \int_0^t dt + a \int_0^t t \, dt = x_0 + v_0 t + \frac{1}{2} a t^2        (3)

2.3 Coordinate transformation

The coordinate transformation is used to transform acceleration vectors from the body frame of the mobile robot down to the navigation frame [5]. By this transformation, the rotation of the sensors about each axis can be tracked. The coordinate rotation matrix relating the accelerations to the navigation frame can be built from three rotation matrices using the Euler angles (roll, pitch, and yaw). In three-dimensional space, the coordinate transformation matrix C about the x, y, and z axes is obtained from the rotation matrices, as in (4),

C = R_z R_y R_x        (4)

where R is the rotation matrix about each axis. It can be expressed in terms of the DCM (direction cosine matrix) by using equation (5).

C_b^n = \begin{bmatrix} c\theta c\psi & -c\phi s\psi + s\phi s\theta c\psi & s\phi s\psi + c\phi s\theta c\psi \\ c\theta s\psi & c\phi c\psi + s\phi s\theta s\psi & -s\phi c\psi + c\phi s\theta s\psi \\ -s\theta & s\phi c\theta & c\phi c\theta \end{bmatrix}        (5)

The 's' and 'c' notations in (5) refer to sine and cosine, respectively. The φ, θ, and ψ notations are the roll, pitch, and yaw angles that describe the relative orientation of the two coordinate frames. Typically, to avoid duplicate sets of Euler angles representing the same attitude, the pitch angle θ is limited to the range of -90 ≤ θ ≤ 90 degrees.

The IMU sensor measures the specific force along the body-frame axes. However, to apply it in the velocity integration of the navigation equations, it must be resolved about the same axes as the velocity. The axes are transformed by applying the coordinate transformation matrix as in (6).

f_b^n(t) = C_b^n(t) f_b^b(t)        (6)

The f_b^n in (6) is the acceleration in the navigation frame, and f_b^b is measured by the accelerometers on the mobile robot body. To update the orientation of the mobile robot, the matrix C_b^n in (7) rotates vectors from the body frame to the navigation frame. It can be obtained by using the cross product with the skew-symmetric matrix formed from the angular rates in (8). This derivation uses a small-angle approximation to lower the computation.
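As an illustration, the Euler-angle rotation sequence of (4) and (5) can be sketched in code. This is a minimal example, not code from the paper, assuming angles in radians and the Z-Y-X rotation order of (4):

```python
import math

# Sketch of building the body-to-navigation DCM C_b^n from the Euler
# angles (roll phi, pitch theta, yaw psi), following C = Rz * Ry * Rx,
# i.e., the matrix written out element by element in (5).

def dcm_body_to_nav(phi: float, theta: float, psi: float):
    """Return C_b^n as a 3x3 nested list (angles in radians)."""
    s, c = math.sin, math.cos
    return [
        [c(theta) * c(psi),
         -c(phi) * s(psi) + s(phi) * s(theta) * c(psi),
         s(phi) * s(psi) + c(phi) * s(theta) * c(psi)],
        [c(theta) * s(psi),
         c(phi) * c(psi) + s(phi) * s(theta) * s(psi),
         -s(phi) * c(psi) + c(phi) * s(theta) * s(psi)],
        [-s(theta),
         s(phi) * c(theta),
         c(phi) * c(theta)],
    ]
```

With all angles zero the matrix reduces to the identity, and a 90-degree yaw maps the body x axis onto the navigation y axis, which is a quick sanity check on the element ordering.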

\dot{C}_b^n = C_b^n [\omega_{nb} \times]        (7)

[\omega_{nb} \times] = \begin{bmatrix} 0 & -\omega_z & \omega_y \\ \omega_z & 0 & -\omega_x \\ -\omega_y & \omega_x & 0 \end{bmatrix}        (8)

The ω_x, ω_y, and ω_z are the approximated angular rates measured by the gyroscopes, where the Earth rotation rate is neglected.

2.4 Inertial navigation system (INS)

The experiment considered in this paper is a dead-reckoning solution based on the measurement of inertial acceleration, also known as an INS (Inertial Navigation System). The INS technique uses the IMU, which consists of accelerometers and gyroscopes, to estimate the attitude, velocity, and position. The INS comprises four steps, i.e., attitude update, transformation of the specific-force resolving axes, velocity update, and position update. A brief summary of the navigation equations used in the system can be expressed by the block diagram of the INS [6], as shown in Fig. 2. As in the figure, the attitude is updated by integrating the angular rate measurement, the velocity is updated by integrating the acceleration, and the position is updated by integrating the velocity. The suffixes (-) and (+) denote the values at the beginning and at the end of a processing cycle of the navigation equations, respectively.

Fig. 2 Diagram of Inertial Navigation System [6].

3. EXPERIMENT AND RESULT ANALYSIS

The experiments were set up to estimate the position of the mobile robot with the IMU sensor. The mobile robot used in the experiment was the prototype of the ET (Embedded Technology) rescue robot available in the laboratory [7]. The IMU mounted on the mobile robot was aligned with the x, y, and z axes, respectively. The mobile robot was moved along a square path with turning angles of around 90 degrees. The measurement data from the IMU were transferred to a PC (personal computer) via Bluetooth for data logging. The data were then post-processed by converting and reformatting before being processed at the PC to obtain the angular rate and acceleration. Applying the tracking mechanism described in Section 2, MATLAB was employed to compute and analyze the results. Fig. 3 shows the experimental environment of the mobile robot in this work.

Fig. 3 The IMU sensor installed on the mobile robot.

The acceleration measurements from the IMU sensor attached to the mobile robot were scaled by the standard gravitational constant (g). The received data from the IMU contained considerable noise and some bias; thus, a low-pass filter from the MATLAB toolbox was applied. This process ensured that the data were suitable for the navigation equations. The results of the acceleration measurement are shown in Fig. 4. The angular rates measured by the gyroscopes about the sensor coordinate axes were low-pass filtered in a similar fashion. This was necessary not only because of the limited performance of the commercial gyroscopes inside, whose measured data included considerable noise, but also because the gyro drift caused an angle error that varied over time. The plots of the angular rate of the mobile robot in the coordinate system are shown in Fig. 5.
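One cycle of the dead-reckoning scheme in Section 2 can be sketched as follows. This is a minimal illustration, not the authors' MATLAB implementation: it assumes simple Euler integration and, matching the simplifications described in the paper, omits gravity removal and Earth-rate terms.

```python
# Minimal sketch of one INS dead-reckoning cycle: attitude update via the
# skew-symmetric angular-rate matrix of (7)-(8), specific-force
# transformation as in (6), then velocity and position integration.

def mat_mul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(A, v):
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def skew(wx, wy, wz):
    """The [w x] matrix of (8)."""
    return [[0.0, -wz, wy],
            [wz, 0.0, -wx],
            [-wy, wx, 0.0]]

def ins_step(C, v, x, f_b, w_b, dt):
    """Advance attitude C, velocity v, position x by one step of length dt.
    f_b: specific force in the body frame; w_b: body angular rates (rad/s)."""
    # Attitude update: C(+) ~ C(-) (I + [w x] dt), a small-angle approximation
    Omega = skew(*w_b)
    I = [[1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
    A = [[I[i][j] + Omega[i][j] * dt for j in range(3)] for i in range(3)]
    C = mat_mul(C, A)
    # Resolve specific force into the navigation frame: f^n = C_b^n f^b, as in (6)
    f_n = mat_vec(C, f_b)
    # Velocity and position updates by simple Euler integration
    v = [v[i] + f_n[i] * dt for i in range(3)]
    x = [x[i] + v[i] * dt for i in range(3)]
    return C, v, x
```

Starting at rest with identity attitude and a constant specific force along the body x axis, repeated calls accumulate velocity roughly as a·t and position roughly as ½·a·t², matching (3).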


Another factor affecting the results might come from the adjustment of the angular rate data by the scaling factor before processing. If the calibration of the scaling factor was not precise enough, it caused errors in the navigation algorithm. In addition, the gyroscope bias might need to be compensated, based on its correlation with the internal temperature.
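A common way to handle the bias and scale-factor issues discussed above is to estimate the constant bias from an initial stationary window and apply the scale factor afterwards. The sketch below illustrates that idea; the window length and scale factor are illustrative assumptions, not values from the paper.

```python
# Sketch of a simple gyro calibration: estimate the constant bias from an
# initial stationary window, then apply a scale factor to the remaining
# samples. Window length and scale are illustrative assumptions.

def calibrate_gyro(samples, stationary_count=200, scale=1.0):
    """samples: raw angular-rate readings; the first `stationary_count`
    are assumed to be taken while the robot is not moving."""
    if len(samples) <= stationary_count:
        raise ValueError("need samples beyond the stationary window")
    bias = sum(samples[:stationary_count]) / stationary_count
    return [(s - bias) * scale for s in samples[stationary_count:]]
```

This only removes the constant part of the bias; a temperature-correlated bias, as mentioned above, would additionally require a temperature model or online estimation.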

Fig. 4 Raw acceleration data (top) and filtered data (bottom).

Fig. 5 Raw data and filtered data of angular rate from the IMU sensor.

Fig. 6 shows the results of the Euler angles estimated by the DCM approach, and Fig. 7 depicts the estimated velocity on each axis. The experimental results did not apply any compensation to the measurement data, as the intention was to observe the behavior of the IMU sensor. As a consequence, the errors grew without bound. Fig. 8 shows the plots of the estimated position on each axis. As can be observed, the position errors increase rapidly due to the accumulated errors from the previous estimations.

Fig. 6 The estimated roll, pitch, and yaw by the DCM.

Fig. 7 The estimated velocity on x, y, z axes.

Fig. 8 The estimated position on x, y, z axes.

Fig. 9 shows the plot of the estimated tracking path of the mobile robot derived from the experiment. The plotted result reflects the robot movement along the square path with turning angles of around 90 degrees, as in the experiment. As can be seen, the moving path is not straight in each direction along the square path, as it should be. This results from the accumulated errors mentioned above. The error increases along the moving path as the robot moves. Thus, it is not applicable for tracking the robot position precisely, as expected.
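The unbounded error growth can be made concrete with a small numerical illustration, not drawn from the experiment itself: a constant accelerometer bias b, integrated twice by the navigation equations, produces a position error of roughly (1/2)·b·t².

```python
# Illustration of accumulated error: a constant accelerometer bias b,
# integrated twice (acceleration -> velocity -> position), yields a
# position error of about (1/2) * b * t^2. The bias value used below is
# illustrative, not a measured figure from the experiment.

def drift_from_bias(b, t, dt=0.001):
    """Doubly integrate a constant acceleration bias b over time t."""
    v = x = 0.0
    for _ in range(int(t / dt)):
        v += b * dt     # velocity error grows linearly in time
        x += v * dt     # position error grows quadratically in time
    return x
```

Even a small bias of 0.01 m/s² gives about 0.5 · 0.01 · 60² = 18 m of position error after 60 seconds, which swamps a square path only a few meters long and explains why the tracked path degrades so quickly without compensation.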

Fig. 9 The estimated tracking path of mobile robot.

4. CONCLUSIONS

This paper described an experiment to verify the characteristics and feasibility of an IMU sensor applied to a position tracking mechanism for a mobile robot. The coordinate transformation using the DCM and the INS were employed to estimate the tracked position by using MATLAB. The experimental results showed that applying the IMU directly to track the mobile robot position yielded inaccurate outcomes, in terms of position and attitude estimation errors. The errors might result from parameters that were neglected during data processing, e.g., Earth parameters, sensor bias, noise, and temperature-compensated scaling factors. We are currently working to overcome the errors by investigating appropriate methods, e.g., deploying our implemented Kalman filter to estimate some of the random errors, using the zero velocity update (ZUPT) by deliberately stopping the mobile robot from time to time, and applying a realistic method to model the navigation characteristics. These will be reported in our future work.

REFERENCES

[1] J. Yi, J. Zhang, D. Song, and S. Jayasuriya, "IMU-based localization and slip estimation for skid-steered mobile robots", Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Nov 2007, pp. 2845-2850.
[2] M. A. Zmuda, A. Elesev, and Y. Morton, "Robot localization using RF and inertial sensors", Proceedings of the Aerospace and Electronics Conference, Jul 2008, pp. 343-348.
[3] B. Barshan and H. F. Durrant-Whyte, "Inertial navigation systems for mobile robots", IEEE Transactions on Robotics and Automation, Vol. 11, No. 3, pp. 328-342, 1995.
[4] SparkFun Electronics, IMU 6-DOF v2, inertial sensor, http://www.robotshop.com/PDF/sparkfun6dof-v21-inertia-unit-manual.pdf, accessed Oct 2010.
[5] N. Phuong, H. Kang, Y. Suh, and Y. Ro, "A DCM based orientation estimation algorithm with an inertial measurement unit and a magnetic compass", Journal of Universal Computer Science, Vol. 15, No. 4, pp. 859-876, 2009.
[6] P. D. Groves, "Principles of GNSS, Inertial, and Multisensor Integrated Navigation Systems", Artech House, Boston, London, 2008.
[7] O. Wongwirat, S. Paelaong, and S. Homchoo, "ET rescue robot", Proceedings of the 18th IEEE International Symposium on Robot and Human Interactive Communication, Sep 2009, p. 226.
