Onboard Real-time Dense Reconstruction of Large-scale Environments for UAV


In this paper, we propose a GPU-parallelized SLAM system capable of using photometric and inertial data together with depth data from an active RGB-D sensor to build accurate dense 3D maps of indoor environments.

September 24, 2017
International Conference on Intelligent Robots and Systems (IROS) 2017



Anurag Vempati (Disney Research)

Igor Gilitschenski (ETH Zurich)

Juan Nieto (ETH Zurich)

Paul Beardsley (Disney Research)

Roland Siegwart (ETH Zurich)



We describe several extensions to existing dense SLAM techniques that allow us to operate in real-time onboard memory-constrained robotic platforms. Our primary contribution is a memory management algorithm that scales to large scenes without being limited by GPU memory resources. Moreover, by integrating a visual-inertial odometry system, we robustly track the camera pose even on an agile platform such as a quadrotor UAV. Our camera tracking framework handles fast camera motions and varying environments by relying on depth, color, and inertial motion cues. Global consistency is achieved by regularly checking for loop closures in conjunction with a pose graph, which serves as the basis for corrective deformation of the 3D map. Our efficient SLAM system is capable of producing highly dense meshes at up to 5 mm resolution at rates close to 60 Hz, fully onboard a UAV. Experimental validation, both in simulation and on a real-world platform, shows that our approach is faster, more robust, and more memory efficient than state-of-the-art techniques, while obtaining better or comparable accuracy.
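The abstract does not detail the memory management algorithm itself; as a rough illustration of the general idea behind scaling dense reconstruction beyond GPU memory (hash-indexed voxel blocks allocated on demand, with inactive blocks streamed out to host memory), here is a minimal Python sketch. All names (`BlockMap`, `BLOCK_SIZE`, `GPU_BUDGET`) and the LRU eviction policy are illustrative assumptions, not the authors' actual method.

```python
# Hypothetical sketch of budget-limited voxel-block streaming.
# Blocks are indexed by integer coordinates in a hash map; when the
# "GPU" budget is exceeded, the least-recently-used block is swapped
# out to "host" memory and restored on the next access.
from collections import OrderedDict

BLOCK_SIZE = 8       # voxels per block side (assumed)
VOXEL_RES = 0.005    # 5 mm voxel resolution, as in the paper
GPU_BUDGET = 4       # max resident blocks (tiny, for illustration)

class BlockMap:
    def __init__(self):
        self.resident = OrderedDict()  # blocks "on GPU", LRU-ordered
        self.swapped = {}              # blocks streamed out to host

    def block_coord(self, p):
        # Map a world-space point to its integer block index.
        return tuple(int(c // (BLOCK_SIZE * VOXEL_RES)) for c in p)

    def touch(self, p):
        # Return the block containing point p, allocating or
        # swapping it in as needed.
        key = self.block_coord(p)
        if key in self.resident:
            self.resident.move_to_end(key)  # mark as recently used
        else:
            block = self.swapped.pop(key, None)
            if block is None:
                block = bytearray(BLOCK_SIZE ** 3)  # fresh allocation
            self.resident[key] = block
            while len(self.resident) > GPU_BUDGET:
                old_key, old_block = self.resident.popitem(last=False)
                self.swapped[old_key] = old_block  # stream out to host
        return self.resident[key]

m = BlockMap()
for x in range(6):
    m.touch((x * 0.1, 0.0, 0.0))  # integrate along a camera path
assert len(m.resident) <= GPU_BUDGET  # budget is never exceeded
```

In a real system the resident set would hold CUDA device allocations and the swap set pinned host memory, with asynchronous transfers hiding the streaming latency; the sketch only captures the bookkeeping.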
