RGB-D dense SLAM with keyframe-based method
Fu XY(付兴银)1,2,3,4; Zhu F(朱枫)1,3,4; Wu QX(吴清潇)1,3,4; Sun YL(孙云雷)1,2,3,4
Conference Name: International Symposium on Optoelectronic Technology and Application, OTA 2018
Conference Date: May 22-24, 2018
Conference Place: Beijing, China
Author of Source: Chinese Society for Optical Engineering (CSOE); Division of Information and Electronic Engineering of Chinese Academy of Engineering
Source Publication: Proceedings of SPIE 10845, Optical Sensing and Imaging Technologies and Applications
Publication Place: Bellingham, USA
Indexed By: EI; CPCI(ISTP)
EI Accession Number: 20185206309036
WOS ID: WOS:000455327800018
Contribution Rank: 1
Keywords: dense SLAM; RGB-D camera; GPU; keyframe; surfel
Abstract: Currently, feature-based visual Simultaneous Localization and Mapping (SLAM) has reached a mature stage. Feature-based visual SLAM systems usually calculate camera poses without producing a dense surface, even if a depth camera is provided. In contrast, dense SLAM systems simultaneously output camera poses as well as a dense surface of the reconstructed region. In this paper, we propose a new RGB-D dense SLAM system. First, the camera pose is calculated by minimizing a combination of the reprojection error and the dense geometric error. We construct a new type of edge in g2o, which adds the extra constraints built from the dense geometric error to the graph optimization. The cost function is minimized in a coarse-to-fine strategy on the GPU, which increases the system frame rate and improves convergence under large camera motion. Second, in order to generate dense surfaces and provide users with feedback on the scanned surfaces, we use the surfel model to fuse RGB-D streams and generate dense surface models in real time. After the system performs essential graph optimization and full Bundle Adjustment (BA), the surfels in the dense model are updated with an embedded deformation graph to keep them consistent with the optimized camera poses. Third, a better 3D model is achieved by re-merging the stream with the optimized camera poses when the user ends the reconstruction. We compare the accuracy of the generated camera trajectories and reconstructed surfaces with state-of-the-art systems on the TUM and ICL-NUIM RGB-D benchmark datasets. Experimental results show that the accuracy of the dense surfaces produced online is very close to that of the later re-fusion, and that our system produces more accurate camera trajectories than the state-of-the-art systems. © 2018 SPIE.
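The pose-estimation step described in the abstract, minimizing a combination of the reprojection error and the dense geometric error, can be illustrated with a minimal sketch. The point-to-plane form of the geometric term, the function names, and the weight `w_geom` are illustrative assumptions, not the paper's actual (GPU, coarse-to-fine, g2o-based) implementation:

```python
import numpy as np

def reprojection_error(R, t, pts3d, obs_px, K):
    """Pixel-space error of 3D map points projected into the current frame."""
    cam = (R @ pts3d.T).T + t                 # transform points into the camera frame
    proj = (K @ cam.T).T
    px = proj[:, :2] / proj[:, 2:3]           # perspective division
    return np.linalg.norm(px - obs_px, axis=1)

def geometric_error(R, t, src_pts, dst_pts, dst_normals):
    """Dense geometric (point-to-plane) residual between frame and model points."""
    warped = (R @ src_pts.T).T + t
    return np.abs(np.sum((warped - dst_pts) * dst_normals, axis=1))

def combined_cost(R, t, pts3d, obs_px, K,
                  src_pts, dst_pts, dst_normals, w_geom=1.0):
    """Combined cost: squared reprojection error plus weighted dense geometric error.

    w_geom is a hypothetical balancing weight between the two terms.
    """
    e_rep = reprojection_error(R, t, pts3d, obs_px, K)
    e_geo = geometric_error(R, t, src_pts, dst_pts, dst_normals)
    return np.sum(e_rep ** 2) + w_geom * np.sum(e_geo ** 2)
```

In the paper, a cost of this shape is minimized over the camera pose; in this sketch it is only evaluated for a given pose. At the true pose (and with perfect data) both terms vanish, and any pose perturbation increases the cost.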
Document Type: Conference Paper
Corresponding Author: Fu XY(付兴银)
Affiliations:
1. Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
2. University of Chinese Academy of Sciences, Beijing 100049, China
3. Key Laboratory of Opto-Electronic Information Processing, CAS, Shenyang 110016, China
4. Key Lab of Image Understanding and Computer Vision, Liaoning Province, Shenyang 110016, China
Recommended Citation
GB/T 7714
Fu XY,Zhu F,Wu QX,et al. RGB-D dense SLAM with keyframe-based method[C]//Chinese Society for Optical Engineering (CSOE), Division of Information and Electronic Engineering of Chinese Academy of Engineering. Bellingham, USA:SPIE,2018:1-11.
Files in This Item:
File Name/Size: RGB-D dense SLAM with keyframe-based method.pdf (6919KB)
DocType: Conference Paper
Access: Open Access, License: CC BY-NC-SA
Related Services
Google Scholar
Similar articles in Google Scholar
[Fu XY(付兴银)]'s Articles
[Zhu F(朱枫)]'s Articles
[Wu QX(吴清潇)]'s Articles
Baidu academic
Similar articles in Baidu academic
[Fu XY(付兴银)]'s Articles
[Zhu F(朱枫)]'s Articles
[Wu QX(吴清潇)]'s Articles
Bing Scholar
Similar articles in Bing Scholar
[Fu XY(付兴银)]'s Articles
[Zhu F(朱枫)]'s Articles
[Wu QX(吴清潇)]'s Articles
File name: RGB-D dense SLAM with keyframe-based method.pdf
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.