
DSO: Direct Sparse Odometry

Jul 14, 2016
5:09

Jakob Engel, Vladlen Koltun, Daniel Cremers (July 2016)

DSO Paper: http://arxiv.org/abs/1607.02565
DSO Website: http://vision.in.tum.de/dso
DSO Code: https://github.com/JakobEngel/dso (released November 2016)
Dataset Paper: https://arxiv.org/abs/1607.02555
Dataset Website: https://vision.in.tum.de/mono-dataset
Dataset Code: https://github.com/tum-vision/mono_dataset_code

We propose a novel direct sparse visual odometry formulation. It combines a fully direct probabilistic model (minimizing a photometric error) with consistent, joint optimization of all model parameters, including geometry, represented as inverse depth in a reference frame, and camera motion. This is achieved in real time by omitting the smoothness prior used in other direct methods and instead sampling pixels evenly throughout the images. Since our method does not depend on keypoint detectors or descriptors, it can naturally sample pixels from across all image regions that have intensity gradient, including edges or smooth intensity variations on mostly white walls. The proposed model integrates a full photometric calibration, accounting for exposure time, lens vignetting, and non-linear response functions. We thoroughly evaluate our method on three different datasets comprising several hours of video. The experiments show that the presented approach significantly outperforms state-of-the-art direct and indirect methods in a variety of real-world settings, both in terms of tracking accuracy and robustness.
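The photometric error described above compares pixel intensities directly: a reference pixel is back-projected using its inverse depth, transformed by the relative camera motion, and re-projected into the target frame. The following is a minimal sketch of that residual for a single pixel; the function name and the nearest-neighbour lookup are illustrative simplifications (DSO optimizes this residual jointly over small pixel patterns, with sub-pixel interpolation and a robust norm).

```python
import numpy as np

def photometric_residual(I_ref, I_tgt, p, inv_depth, K, R, t):
    """Single-pixel photometric error (illustrative sketch, not DSO's API).
    p: pixel (u, v) in the reference frame; inv_depth: its inverse depth.
    K: camera intrinsics; R, t: relative motion from reference to target."""
    u, v = p
    # Back-project the reference pixel to a 3D point using inverse depth.
    x_ref = np.linalg.inv(K) @ np.array([u, v, 1.0]) / inv_depth
    # Transform into the target frame and project with the intrinsics.
    x_tgt = R @ x_ref + t
    proj = K @ x_tgt
    u2, v2 = proj[0] / proj[2], proj[1] / proj[2]
    # Nearest-neighbour intensity lookup (real code interpolates sub-pixel).
    return float(I_ref[int(v), int(u)] - I_tgt[int(round(v2)), int(round(u2))])
```

With identity motion and identical images the residual is zero, which is the sanity check a direct-method implementation typically starts from.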
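Because no keypoint detector is used, candidate pixels can be any location with intensity gradient, and the abstract stresses that they are sampled evenly over the image. One simple way to sketch that idea is to divide the image into blocks and keep the strongest-gradient pixel per block when it exceeds a threshold; the block size and threshold below are illustrative, not the paper's region-adaptive values.

```python
import numpy as np

def select_pixels(img, grad_thresh=10.0, cell=8):
    """Pick at most one high-gradient pixel per cell x cell block so that
    candidates are spread evenly over the image (simplified sketch of
    gradient-based, detector-free pixel selection)."""
    gy, gx = np.gradient(img.astype(np.float64))
    mag = np.hypot(gx, gy)  # gradient magnitude per pixel
    h, w = img.shape
    picks = []
    for r in range(0, h - cell + 1, cell):
        for c in range(0, w - cell + 1, cell):
            block = mag[r:r + cell, c:c + cell]
            i = np.unravel_index(np.argmax(block), block.shape)
            if block[i] > grad_thresh:
                picks.append((r + i[0], c + i[1]))
    return picks
```

On a flat image nothing is selected; on an image with a step edge, every block touching the edge contributes one pixel on it, so edges and weak gradients alike can feed the photometric error term.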
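The full photometric calibration mentioned above models a raw intensity as I = G(t * V(x) * B(x)), where G is the non-linear response function, t the exposure time, V(x) the lens vignette, and B(x) the scene irradiance. Inverting that chain recovers calibrated values to feed the photometric error. The sketch below assumes an 8-bit inverse-response lookup table; it illustrates the correction, it is not the released calibration code.

```python
import numpy as np

def photometrically_correct(I_raw, inv_response, vignette, exposure):
    """Undo the photometric chain I = G(t * V(x) * B(x)) to recover B(x).
    inv_response: 256-entry lookup table for G^-1 (assumed 8-bit input);
    vignette: per-pixel attenuation V(x) in (0, 1]; exposure: time t."""
    linear = inv_response[I_raw]           # G^-1(I): undo non-linear response
    return linear / (vignette * exposure)  # divide out vignetting and exposure
```

With an identity response curve, a uniform vignette of 0.5, and exposure 2.0, a raw value of 100 maps back to an irradiance of 100, since the two attenuations cancel.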

