Real-time vehicle localization and pose tracking in high-resolution 3D maps
In this paper we introduce a novel approach for accurate self-localization and pose tracking of Lidar and GPS-equipped autonomous vehicles (AVs) in high-density (more than 5000 points/m²) 3D localization maps obtained through Mobile Laser Scanning (MLS). Our solution consists of two main steps. First, starting from an inaccurate GPS-based initial position, we estimate the 3DoF pose (planar position and yaw orientation) of the ego vehicle by aligning its sparse (50-500 points/m²) Lidar point cloud measurements to the MLS prior map, using a novel approach that matches static landmark objects in the scene. Second, to effectively deal with the lack of pairable objects in certain time frames (e.g. scene segments occluded by a large moving tram), we track the estimated 3DoF pose of the AV with a Kalman filter. Comparative tests are provided on roads with heavy traffic in downtown city areas with large (5-10 meters) GPS positioning errors. The proposed approach reduces the location error of the vehicle by one order of magnitude and keeps the yaw angle error around 1° over the entire trajectory without considerable drift, while running in real time (20-25 Hz).
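To illustrate the second step, the following is a minimal sketch of a Kalman filter that tracks the 3DoF pose between map-alignment updates. It assumes a constant-velocity motion model over the state (x, y, yaw, vx, vy, yaw rate); the state layout, noise values, and class/function names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np


def wrap_angle(a):
    """Wrap an angle to (-pi, pi]."""
    return (a + np.pi) % (2.0 * np.pi) - np.pi


class Pose3DoFKalmanFilter:
    """Constant-velocity Kalman filter over [x, y, yaw, vx, vy, yaw_rate].

    Measurements are the 3DoF poses produced by aligning the sparse
    Lidar scan to the MLS prior map (assumed interface).
    """

    def __init__(self, init_pose, q=0.1, r=0.5):
        self.x = np.zeros(6)
        self.x[:3] = init_pose                              # (x, y, yaw)
        self.P = np.eye(6)                                  # state covariance
        self.Q = q * np.eye(6)                              # process noise (assumed)
        self.R = r * np.eye(3)                              # measurement noise (assumed)
        self.H = np.hstack([np.eye(3), np.zeros((3, 3))])   # observe pose only

    def predict(self, dt):
        """Propagate the state; used alone in frames where no pairable
        landmark objects are found (e.g. occlusion by a passing tram)."""
        F = np.eye(6)
        F[0, 3] = F[1, 4] = F[2, 5] = dt
        self.x = F @ self.x
        self.x[2] = wrap_angle(self.x[2])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, measured_pose):
        """Fuse a 3DoF pose measurement from the map-alignment step."""
        z = np.asarray(measured_pose, dtype=float)
        y = z - self.H @ self.x                             # innovation
        y[2] = wrap_angle(y[2])                             # handle yaw wrap-around
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)            # Kalman gain
        self.x = self.x + K @ y
        self.x[2] = wrap_angle(self.x[2])
        self.P = (np.eye(6) - K @ self.H) @ self.P

    @property
    def pose(self):
        return self.x[:3]


# Example usage at a ~20 Hz frame rate (values are placeholders):
kf = Pose3DoFKalmanFilter(init_pose=[0.0, 0.0, 0.0])
kf.predict(dt=0.05)
kf.update([0.4, 0.1, 0.01])   # pose from map alignment, when available
print(kf.pose)
```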