Global Localization for Urban AR
Authors: Thomas Calloway, Dalila B. Megherbi, Hongsheng Zhang (2017)
Reliably localizing and tracking a display as it moves relative to content in the physical world is one of the primary technical challenges facing all augmented reality systems. While significant progress has been made in recent years, every approach still functions only in certain environments and situations. Attempts to improve generality with additional sensors (e.g., depth sensors, multiple cameras) add significant size, weight, and power to wearable solutions that are sensitive to all three. In this work, we propose an approach to tracking and localization using a single camera and inertial chip. Through a combination of visual-inertial navigation, point cloud mapping, and dynamic correlation of building faces and edges with sparse OpenStreetMap datasets, we achieved a typical global localization precision of less than 0.25 meters and 1 degree of heading relative to the map. All motion tracking calculations are performed on the local mobile device with less than 10 milliseconds of latency, while global localization and drift correction are performed remotely.
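The abstract implies a two-stage pose pipeline: low-latency visual-inertial poses are produced on the device, while a remote service periodically estimates a map-alignment correction by matching observed building edges against OpenStreetMap footprints. The sketch below illustrates that split under simplifying assumptions (2D ground-plane poses, known edge correspondences, coordinates already in a local ENU frame in meters); it is not the authors' implementation, and all names (se2, heading_correction, T_map_from_vio) are illustrative.

```python
# Minimal sketch of the local-tracking / remote-alignment split described
# in the abstract. Assumptions: planar (2D) poses, edge correspondences
# between the point cloud and OSM footprints are already established.
import numpy as np

def se2(theta, tx, ty):
    """3x3 homogeneous 2D rigid transform (ground-plane pose)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0.0, 0.0, 1.0]])

def heading_correction(obs_headings, map_headings):
    """Circular mean of heading residuals between observed building edges
    and their matched OSM footprint edges. Edges are undirected, so work
    with doubled angles (period pi) and halve the result."""
    r = 2.0 * (np.asarray(map_headings) - np.asarray(obs_headings))
    return 0.5 * np.arctan2(np.mean(np.sin(r)), np.mean(np.cos(r)))

def translation_correction(obs_pts, map_pts, dtheta):
    """Least-squares translation after rotating observed edge midpoints."""
    R = se2(dtheta, 0.0, 0.0)[:2, :2]
    return np.mean(np.asarray(map_pts) - (R @ np.asarray(obs_pts).T).T, axis=0)

# Remote side: matched edge midpoints (meters) and headings (radians)
# from the device's point cloud vs. the OSM building footprints.
obs_mid = [(10.2, 4.9), (22.8, 15.1)]
map_mid = [(10.0, 5.0), (22.5, 15.3)]
obs_hdg = [0.02, 1.59]
map_hdg = [0.00, 1.57]

dtheta = heading_correction(obs_hdg, map_hdg)
t = translation_correction(obs_mid, map_mid, dtheta)
T_map_from_vio = se2(dtheta, t[0], t[1])   # updated remotely, infrequently

# Device side: compose each fresh VIO pose (computed locally in <10 ms)
# with the most recent alignment to obtain a drift-corrected global pose.
vio_pose = se2(0.1, 3.0, 1.0)
global_pose = T_map_from_vio @ vio_pose
print(np.round(global_pose, 3))
```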