SIA OpenIR
Online structured sparse learning with labeled information for robust object tracking
Fan BJ(范保杰); Cong Y(丛杨); Tang YD(唐延东)
Department: Robotics Laboratory (机器人学研究室)
Source Publication: Journal of Electronic Imaging
ISSN: 1017-9909
Year: 2017
Volume: 26  Issue: 1  Pages: 1-16
Indexed By: SCI; EI
EI Accession Number: 20170403276301
WOS ID: WOS:000397059800037
Contribution Rank: 2
Funding Organization: China Postdoctoral Science Foundation (Nos. 2015M571785, 2016T90484); NSFC (U1613214, 61673254, 61533015); Jiangsu Postdoctoral Science Foundation (No. 1402085C); State Key Laboratory of Robotics Open Project; Foundation for the Talent of Nanjing University of Posts and Telecommunications (No. NY215148); Open Fund of the Key Laboratory of Measurement and Control of Complex Systems of Engineering, Ministry of Education (No. MCCSE2015A05)
Keywords: Robust Object Tracking; Online Dictionary Learning and Updating; Robust Sparse Coding; Prior Information; Joint Decision Metric
Abstract: We formulate object tracking under the particle filter framework as a collaborative tracking problem. Prior information from the training data is exploited to learn, online, a dictionary that is simultaneously discriminative and reconstructive, without losing structural information. Specifically, the class labels and the semantic structure information are incorporated into the dictionary learning process as a classification error term and an ideal coding regularization term, respectively. Combined with the traditional reconstruction error, these terms yield a unified dictionary learning framework for robust object tracking. By minimizing the unified objective function under different mixed-norm constraints on the sparse coefficients, two robust optimization methods are developed that learn the high-quality dictionary and the optimal classifier simultaneously. The best candidate is selected by jointly minimizing the reconstruction error and the classification error. As tracking continues, the proposed algorithms alternate between robust sparse coding and dictionary updating. The proposed trackers are compared empirically with 14 state-of-the-art trackers on challenging video sequences. Both quantitative and qualitative comparisons demonstrate that the proposed algorithms perform well in terms of accuracy and robustness.
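A minimal sketch of the unified objective the abstract describes, assuming a label-consistent dictionary learning formulation (in the spirit of LC-KSVD); the symbols below (X for training features, D for the dictionary, A for sparse coefficients, Q for ideal discriminative codes with transform G, H for class labels with linear classifier W, and the weights alpha, beta, gamma, lambda) are illustrative notation, not taken from the paper:

    % Hedged sketch of the unified objective: reconstruction error
    % + ideal coding regularization + classification error, with a
    % mixed-norm sparsity penalty on the coefficients A.
    \min_{D, G, W, A} \; \|X - DA\|_F^2
        + \alpha \|Q - GA\|_F^2
        + \beta  \|H - WA\|_F^2
        + \lambda \|A\|_{p,q}

    % Joint decision metric (also a sketch): among particle-filter
    % candidates y_i with sparse codes a_i, select the candidate that
    % minimizes reconstruction and classification error together.
    \hat{y} = \arg\min_{i} \; \|y_i - D a_i\|_2^2
        + \gamma \, \ell\!\left(W a_i,\, \text{target label}\right)

Under this reading, the two proposed optimizers would correspond to different choices of the mixed norm on A (e.g., an elementwise versus a row-structured norm), and tracking alternates between coding the candidates against the current dictionary and updating D, G, and W with newly labeled samples.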
Language: English
Citation Statistics
Cited Times (WOS): 2
Document Type: Journal Article
Identifier: http://ir.sia.cn/handle/173321/19920
Collection: Robotics Laboratory (机器人学研究室)
Corresponding Author: Fan BJ (范保杰)
Affiliation:
1. Nanjing University of Posts and Telecommunications, Automation College, No. 9 Wenyuan Road, Nanjing 210023, China
2. Chinese Academy of Sciences, State Key Laboratory of Robotics, No. 114 Nanta Street, Shenyang 110016, China
Recommended Citation
GB/T 7714: Fan BJ, Cong Y, Tang YD. Online structured sparse learning with labeled information for robust object tracking[J]. Journal of Electronic Imaging, 2017, 26(1): 1-16.
APA: Fan BJ, Cong Y, & Tang YD. (2017). Online structured sparse learning with labeled information for robust object tracking. Journal of Electronic Imaging, 26(1), 1-16.
MLA: Fan BJ, et al. "Online structured sparse learning with labeled information for robust object tracking." Journal of Electronic Imaging 26.1 (2017): 1-16.
Files in This Item:
File Name: Online structured sparse learning with labeled information for robust object tracking.pdf (3053 KB)
DocType: Journal Article
Version: Author's Accepted Manuscript
Access: Open Access
License: ODC PDDL
File name: Online structured sparse learning with labeled information for robust object tracking.pdf
Format: Adobe PDF
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.