ECE MS Defense: "Indoor and Outdoor Robot Localization using Sensor Data Fusion," Md Maruf Hasan

442 Dana

August 18, 2017 9:30 am

Abstract:
One of the major challenges of mobile robot navigation is determining the robot’s position and orientation with respect to its environment. This is known as robot localization. To obtain an accurate pose estimate, we cannot rely on measurements from a single sensor source alone. Such sensors are susceptible to noise and introduce errors into the measurement. The error accumulates over time, leading to an incorrect estimate of the robot’s pose. To overcome this problem, fusion of data from multiple sensors can be used to locate the robot in a known or unknown environment.

To calculate the pose of our mobile robot, we use sensor information from incremental wheel encoders, an Inertial Measurement Unit (IMU), and a GPS. The wheel encoders report how much each wheel has rotated, from which we extract odometry information. The IMU provides orientation, linear and angular velocity, and linear acceleration, while the GPS provides absolute position in longitude and latitude. These sensor measurements, along with their corresponding errors, are fused in an Extended Kalman Filter. The Extended Kalman Filter uses this information, weighing incoming measurements against its state prediction to produce a more accurate estimate of the robot’s current pose than any single sensor used in isolation.
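As a rough illustration of how such a filter weighs a prediction against a measurement, the sketch below shows one predict/update cycle of a planar Extended Kalman Filter in Python (using NumPy). The three-element state, the noise values, and the GPS-only measurement model are illustrative assumptions for this sketch, not the filter configuration used in the thesis.

```python
import numpy as np

def ekf_predict(x, P, u, Q, dt):
    """Propagate pose [x, y, theta] with odometry input u = [v, omega] (assumed model)."""
    v, w = u
    theta = x[2]
    # Nonlinear motion model: move forward by v*dt along the current heading
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    # Jacobian of the motion model with respect to the state
    F = np.array([[1.0, 0.0, -v * np.sin(theta) * dt],
                  [0.0, 1.0,  v * np.cos(theta) * dt],
                  [0.0, 0.0,  1.0]])
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def ekf_update(x, P, z, R, H):
    """Correct the prediction with a measurement z (e.g. GPS converted to local x, y)."""
    y = z - H @ x                   # innovation: measurement minus predicted measurement
    S = H @ P @ H.T + R             # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain: weighs measurement against prediction
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new

# One predict/update cycle with assumed noise values
x = np.zeros(3)                  # initial pose [x, y, theta]
P = np.eye(3) * 0.1              # initial state covariance
Q = np.diag([0.02, 0.02, 0.01])  # process noise (odometry/IMU uncertainty)
R = np.diag([1.0, 1.0])          # measurement noise (GPS uncertainty)
H = np.array([[1.0, 0.0, 0.0],   # GPS observes position only, not heading
              [0.0, 1.0, 0.0]])

x, P = ekf_predict(x, P, u=[0.5, 0.1], Q=Q, dt=0.1)
x, P = ekf_update(x, P, z=np.array([0.06, 0.01]), R=R, H=H)
print(x)
```

Because the GPS noise covariance R is larger than the odometry process noise Q in this example, the Kalman gain trusts the prediction more than the GPS fix; the opposite weighting would apply if the odometry were the less reliable source.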

  • Professor Taskin Padir (Advisor)
  • Professor Hanumant Singh
  • Professor Aatmesh Shrivastava