SIA OpenIR > Robotics Laboratory
Onboard monocular pedestrian detection by combining spatio-temporal hog with structure from motion algorithm
Hua CS(华春生); Makihara, Yasushi; Yagi, Yasushi; Iwasaki, Shun; Miyagawa, Keisuke; Li, Bo
Department: Robotics Laboratory
Source Publication: Machine Vision and Applications
ISSN: 0932-8092
Year: 2015
Volume: 26; Issue: 2-3; Pages: 161-183
Indexed By: SCI; EI
EI Accession Number: 20150200417160
WOS ID: WOS:000351462900002
Contribution Rank: 1
Keywords: Spatio-temporal HOG; Pedestrian Detection; Onboard Monocular Camera; Structure From Motion
Abstract: In this paper, we present a novel pedestrian detection framework for advanced driver assistance systems on a mobile platform in normal urban street environments. Unlike conventional systems, which focus on near-distance pedestrian detection by fusing multiple sensors (such as radar, laser, and infrared cameras), our system detects pedestrians at all distances (near, middle, and long) from a normally driven vehicle (1–40 km/h) using only a monocular camera in street scenes. Since pedestrians exhibit not only a characteristic human-like shape but also distinctive movements generated by their legs and arms, we use the spatio-temporal histogram of oriented gradients (STHOG) to describe both pedestrian appearance and motion. The shape and movement of a pedestrian are captured by a single feature vector produced by concatenating the spatial and temporal histograms. An STHOG detector trained with the AdaBoost algorithm is applied to images stabilized by a structure from motion (SfM) algorithm with a geometric ground constraint. The main contributions of this work are: (1) a ground constraint with a monocular camera that reduces computational cost and false alarms; (2) preprocessing that stabilizes the successive images captured by the mobile camera with the SfM algorithm; (3) long-distance (up to 100 m) pedestrian detection at various velocities (1–40 km/h). Extensive experiments in different city scenes demonstrate the effectiveness of our algorithm.
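The STHOG feature described in the abstract, a spatial orientation histogram concatenated with a temporal one, can be sketched as follows. This is a minimal illustration under stated assumptions, not the authors' implementation: the cell/block layout, multi-frame accumulation, SfM stabilization, and AdaBoost training are omitted, the temporal cue is approximated by frame-difference gradients, and all function names are hypothetical.

```python
import numpy as np

def oriented_histogram(gx, gy, n_bins=9):
    """Quantize unsigned gradient orientations into n_bins, weighted by magnitude."""
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)              # unsigned orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=mag.ravel(), minlength=n_bins)
    return hist / (np.linalg.norm(hist) + 1e-6)          # L2 normalization

def sthog_descriptor(prev_frame, curr_frame, n_bins=9):
    """Concatenate a spatial HOG (appearance) with a temporal HOG (motion)."""
    # Spatial part: orientation histogram of intensity gradients in the current frame.
    gy, gx = np.gradient(curr_frame.astype(float))
    spatial = oriented_histogram(gx, gy, n_bins)
    # Temporal part: orientation histogram of the frame-difference gradients,
    # a crude stand-in for the motion cue (assumes the frames are pre-stabilized).
    diff = curr_frame.astype(float) - prev_frame.astype(float)
    dgy, dgx = np.gradient(diff)
    temporal = oriented_histogram(dgx, dgy, n_bins)
    return np.concatenate([spatial, temporal])           # 2 * n_bins features

# Toy example on random "frames" standing in for a stabilized pedestrian window.
rng = np.random.default_rng(0)
f0 = rng.integers(0, 256, (64, 32))
f1 = rng.integers(0, 256, (64, 32))
desc = sthog_descriptor(f0, f1)
print(desc.shape)  # (18,)
```

In the paper's pipeline, descriptors like this (computed per cell and concatenated over a detection window) would feed the AdaBoost-trained classifier; here the two halves of the vector separate the appearance and motion cues exactly as the concatenation in the abstract describes.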
Language: English
Citation statistics
Cited Times (WOS): 4
Document Type: Journal Article
Identifier: http://ir.sia.cn/handle/173321/15663
Collection: Robotics Laboratory
Corresponding Author: Hua CS (华春生)
Affiliations:
1. The State Key Lab of Robotics, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, Liaoning, China
2. Department of Intelligent Multimedia, ISIR of Osaka University, Osaka, Japan
3. Honda R&D Co., Ltd, Wako, Japan
Recommended Citation
GB/T 7714: Hua CS, Makihara, Yasushi, Yagi, Yasushi, et al. Onboard monocular pedestrian detection by combining spatio-temporal hog with structure from motion algorithm[J]. Machine Vision and Applications, 2015, 26(2-3): 161-183.
APA: Hua CS, Makihara, Yasushi, Yagi, Yasushi, Iwasaki, Shun, Miyagawa, Keisuke, & Li, Bo. (2015). Onboard monocular pedestrian detection by combining spatio-temporal hog with structure from motion algorithm. Machine Vision and Applications, 26(2-3), 161-183.
MLA: Hua CS, et al. "Onboard monocular pedestrian detection by combining spatio-temporal hog with structure from motion algorithm". Machine Vision and Applications 26.2-3 (2015): 161-183.
Files in This Item:
File Name/Size: Onboard monocular pe (14225 KB)
DocType: Journal article (published version)
Access: Open access
License: ODC PDDL
Related Services
Google Scholar
Similar articles in Google Scholar
[Hua CS(华春生)]'s Articles
[Makihara, Yasushi]'s Articles
[Yagi, Yasushi]'s Articles
Baidu academic
Similar articles in Baidu academic
Bing Scholar
Similar articles in Bing Scholar
File name: Onboard monocular pedestrian detection by combining spatio-temporal hog with structure from motion algorithm.pdf
Format: Adobe PDF

Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.