An Occlusion-Aware Framework for Real-Time 3D Pose Tracking
Fu ML(付明亮)1,2; Leng YQ(冷雨泉)3; Luo HT(骆海涛)1; Zhou WJ(周维佳)1
Department: 空间自动化技术研究室 (Space Automation Technology Research Laboratory)
Source Publication: Sensors (Switzerland)
ISSN: 1424-8220
Publication Year: 2018
Volume: 18, Issue: 8, Pages: 1-20
Indexed By: SCI; EI
EI Accession Number: 20183505743776
WOS ID: WOS:000445712400334
Contribution Rank: 1
Funding Organization: National Science Foundation of China
Keywords: Pose Tracking; Occlusion Handling; Online Rendering; Motion Compensation
Abstract

Random forest-based methods for 3D temporal tracking over an image sequence have gained increasing prominence in recent years. They do not require the object's texture and use only raw depth images and the previous pose as input, which makes them especially suitable for textureless objects. These methods learn built-in occlusion handling from predetermined occlusion patterns, which cannot always model real occlusions. Moreover, the input to the random forest contains more and more outliers as the occlusion deepens. In this paper, we propose an occlusion-aware framework capable of real-time and robust 3D pose tracking from RGB-D images. To this end, the proposed framework is anchored in a random forest-based learning strategy, referred to as RFtracker. We aim to enhance its performance in two respects: integrated local refinement of the random forest on the one hand, and online rendering-based occlusion handling on the other. To eliminate the inconsistency between learning and prediction in RFtracker, a local refinement step is embedded to guide the random forest towards the optimal regression. Furthermore, we present online rendering-based occlusion handling to improve robustness against dynamic occlusion. Meanwhile, a lightweight convolutional neural network-based motion compensation (CMC) module is designed to cope with fast motion and the inevitable physical delay caused by imaging frequency and data transmission. Finally, experiments show that the proposed framework copes better with heavily occluded scenes than RFtracker while preserving real-time performance.
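
The abstract describes the pipeline only at a high level. The sketch below (plain Python with NumPy, written for this record rather than taken from the paper) illustrates one plausible reading of a single occlusion-aware tracking step: a motion-compensation step extrapolates the previous pose, the object model is rendered online at that pose, an occlusion mask is built by comparing rendered and observed depth, and only unoccluded pixels are passed to the random-forest regressor. The callables render_depth, forest_regress, and motion_compensate, and the threshold occlusion_thresh, are hypothetical placeholders, not names or values from the paper.

    import numpy as np

    def track_frame(depth_obs, prev_poses,
                    render_depth,       # hypothetical: pose -> rendered depth map of the object model
                    forest_regress,     # hypothetical: (masked depth, init pose) -> refined pose
                    motion_compensate,  # hypothetical: pose history -> extrapolated pose for this frame
                    occlusion_thresh=0.02):
        """Illustrative occlusion-aware tracking step (not the authors' implementation).

        depth_obs  : (H, W) observed depth image in meters, 0 where there is no measurement.
        prev_poses : pose estimates from earlier frames, in whatever parameterization
                     the regressor expects.
        """
        # 1. Compensate for fast motion and imaging/transmission delay by
        #    extrapolating the pose before regression.
        init_pose = motion_compensate(prev_poses)

        # 2. Online rendering: render the object model at the extrapolated pose.
        depth_ren = render_depth(init_pose)          # (H, W), 0 where the object is not visible

        # 3. Occlusion mask: pixels where the observed surface is clearly closer
        #    to the camera than the rendered object are treated as occluders.
        on_object = depth_ren > 0
        valid = on_object & (depth_obs > 0)
        occluded = valid & (depth_ren - depth_obs > occlusion_thresh)
        inliers = valid & ~occluded

        # 4. Feed only unoccluded object pixels to the random-forest regressor,
        #    which refines the extrapolated pose.
        masked_depth = np.where(inliers, depth_obs, 0.0)
        new_pose = forest_regress(masked_depth, init_pose)

        occlusion_ratio = 1.0 - inliers.sum() / max(on_object.sum(), 1)
        return new_pose, occlusion_ratio

The occlusion ratio returned here is only one possible way to quantify how much of the object is covered; how the paper actually weights or discards occluded input, and how its local refinement and CMC modules are implemented, is not reproduced above.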

Language: English
WOS Subject: Chemistry, Analytical; Electrochemistry; Instruments & Instrumentation
WOS Keywords: OBJECT TRACKING; LIBRARY
WOS Research Area: Chemistry; Electrochemistry; Instruments & Instrumentation
Funding Project: National Science Foundation of China [51505470]
Document Type: Journal Article
Identifier: http://ir.sia.cn/handle/173321/22395
Collection: 空间自动化技术研究室 (Space Automation Technology Research Laboratory)
Corresponding Author: Fu ML (付明亮)
Affiliations:
1. State Key Laboratory of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang 110016, China
2. University of Chinese Academy of Sciences, Beijing 100049, China
3. Department of Mechanical and Energy Engineering, Southern University of Science and Technology, Shenzhen 518055, China
Recommended Citation
GB/T 7714: Fu ML, Leng YQ, Luo HT, et al. An Occlusion-Aware Framework for Real-Time 3D Pose Tracking[J]. Sensors (Switzerland), 2018, 18(8): 1-20.
APA: Fu ML, Leng YQ, Luo HT, & Zhou WJ. (2018). An Occlusion-Aware Framework for Real-Time 3D Pose Tracking. Sensors (Switzerland), 18(8), 1-20.
MLA: Fu ML, et al. "An Occlusion-Aware Framework for Real-Time 3D Pose Tracking". Sensors (Switzerland) 18.8 (2018): 1-20.
Files in This Item:
File Name: An Occlusion-Aware Framework for Real-Time 3D Pose Tracking.pdf (4208 KB)
DocType: Journal Article
Version: Published Version
Access: Open Access
License: CC BY-NC-SA
 

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.