Title: 基于深度学习的手势识别及空间人机交互应用研究
Alternative Title: Research on Hand Gesture Recognition Based on Deep Learning and Space Human-robot Interaction Application
Author: 高庆
Department: 空间自动化技术研究室
Thesis Advisor: 李杨民; 刘金国
Keywords: Human-Robot Interaction; Space Robot; Hand Gesture Recognition; Deep Learning
Pages: 128
Degree Discipline: Mechatronic Engineering
Degree Name: Doctor
Date: 2019-11-20
Degree Grantor: Shenyang Institute of Automation, Chinese Academy of Sciences
Place of Conferral: Shenyang
Abstract: This dissertation describes the research background and significance of hand gesture recognition and space human-robot interaction (HRI) technology. Building on the relevant theory, it studies current mainstream hand gesture recognition and HRI techniques in depth and proposes methods for astronaut hand detection, hand gesture recognition, and space HRI in the aerospace domain. Combining deep learning with multimodal fusion, the following work was carried out. For deep learning-based hand detection, hand detection and localization are combined with human skeleton tracking, and a method for two-hand detection and tracking is proposed, which solves the problem of efficiently identifying and distinguishing the astronaut's left and right hands under various gesture operations. For gesture recognition based on performance features, a multi-stream deep neural network architecture is proposed that fuses the color, depth, and electromyographic (EMG) features of gestures to recognize and classify multiple gestures more accurately. For gesture recognition based on pose estimation, combining gesture detection with hand pose estimation yields faster and more accurate hand pose estimation; the extracted skeleton features are then fused with color features to achieve more accurate gesture recognition. In addition, a space HRI platform based on an astronaut assistant robot is preliminarily built, and a set of space HRI gestures is designed so that astronauts can control robot operations through hand gestures. A conceptual sketch of the two-hand detection idea is given after this abstract. The specific work of this dissertation covers the following four aspects.

(1) Research on deep learning-based two-hand detection. Considering that when humans observe another person's hands they rely not only on the features of the hands themselves but also on the structural features of the human body, the way humans observe and detect hands is simulated by combining hand detection and localization based on hand features with hand detection and localization based on human body structure. A parallel dual-stream network is designed: the two parallel sub-networks use an improved SSD network and a human pose estimation method to extract features of the hand itself and of the body structure, respectively, and the detection results are then fused to obtain the final hand detection and localization result. To the best of our knowledge, this is the first time such an approach has been used for two-hand detection and localization.

(2) Research on gesture recognition based on performance features. The performance features of a gesture are the features it exhibits in images, such as its color, shape, and size. This dissertation proposes a new gesture recognition method based on a multi-stream deep fusion architecture, which improves recognition accuracy by fusing different performance features. Color and depth images of gestures are fused and combined with gesture detection to recognize multiple interaction gestures. In addition, a data-consistency method is proposed to fuse EMG signals and image data with different spatial structures, enabling the recognition of different gestures; this method is applied to the interactive control of manipulator grasping, realizing gesture-based control of the manipulator.

(3) Research on gesture recognition based on pose estimation. Estimating the hand pose provides finer-grained gesture features that aid recognition. This dissertation proposes a pose-estimation-based gesture recognition method: gesture detection is used to speed up hand pose estimation, pose features are combined with performance features, and a 3DCNN+ConvLSTM method is used to extract dynamic gesture features, improving recognition accuracy.

(4) Research on interactive control of a space robot driven by astronaut gestures. To meet the requirements of an interactive space-robot control system, an astronaut assistant robot for use inside a space-station cabin is designed, and a gesture-based HRI system platform is designed for controlling it. For this platform, the interaction gesture set is designed, the coordinate systems of the astronaut's hands, the astronaut assistant robot, and the vision sensor are established, and the kinematics and dynamics of the robot's motion are studied. In addition, to ensure the stability of the gesture-based HRI system, a dynamic mapping method based on a finite state machine is designed. Together, these realize the transformation from astronaut gesture operations to robot motion control.

The research in these four areas provides a theoretical basis and technical support for gesture recognition and human-robot interaction oriented toward space robots.
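To make the parallel dual-stream idea in aspect (1) concrete, the following is a minimal Python (PyTorch-style) sketch, not the author's implementation: one stream stands in for an SSD-style hand detector producing candidate boxes, the other for a body pose estimator producing wrist keypoints, and a simple fusion step keeps boxes that are consistent with the skeleton and labels them left or right. All function names, thresholds, and the fusion rule here are hypothetical illustrations of the general approach described in the abstract.

```python
# Hypothetical sketch of a parallel dual-stream two-hand detector.
# One stream mimics an SSD-style hand detector (boxes + scores),
# the other mimics a body pose estimator (left/right wrist keypoints);
# a fusion step matches boxes to wrists to label left vs. right hand.
import torch

def hand_detector_stream(rgb):
    """Stand-in for an improved-SSD hand detector.
    Returns N candidate boxes (x1, y1, x2, y2) and confidence scores."""
    n = 4
    boxes = torch.rand(n, 4) * 640
    scores = torch.rand(n)
    return boxes, scores

def pose_stream(rgb):
    """Stand-in for a body pose estimator.
    Returns the (x, y) image positions of the left and right wrists."""
    return {"left_wrist": torch.tensor([120.0, 300.0]),
            "right_wrist": torch.tensor([420.0, 310.0])}

def fuse_detections(boxes, scores, wrists, score_thr=0.3, dist_thr=80.0):
    """Keep hand boxes that are confident and close to a wrist keypoint,
    and label them 'left' or 'right' according to the nearest wrist."""
    results = []
    centers = (boxes[:, :2] + boxes[:, 2:]) / 2
    for box, score, center in zip(boxes, scores, centers):
        if score < score_thr:
            continue
        for side, wrist in wrists.items():
            if torch.norm(center - wrist) < dist_thr:
                results.append((side.split("_")[0], box, float(score)))
                break
    return results

rgb = torch.rand(3, 480, 640)            # dummy input frame
boxes, scores = hand_detector_stream(rgb)
wrists = pose_stream(rgb)
print(fuse_detections(boxes, scores, wrists))
```

In the thesis both streams are learned networks and the fusion is trained as part of the pipeline; here the streams are stubs so that the fusion logic can be read in isolation.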
Other Abstract: This paper describes the research background and significance of hand gesture recognition and space HRI. On the basis of relevant theoretical study and research, the current mainstream hand gesture recognition and HRI technologies are studied in depth, and methods for astronaut hand detection, hand gesture recognition, and space HRI for the aerospace field are proposed. Combining the theoretical knowledge of deep learning and multimodal fusion, the following research work was carried out: (a) For deep learning-based hand detection, hand detection is combined with human pose estimation and a method for two-hand detection and localization is proposed, which solves the efficient identification and differentiation of the astronaut's left and right hands under various hand gesture operations. (b) A multi-stream deep neural network framework is proposed for hand gesture recognition based on performance features. It combines the color, depth, and EMG features of hand gestures to achieve more accurate recognition and classification of multiple hand gestures. (c) For hand gesture recognition based on pose estimation, combining hand gesture detection with hand pose estimation achieves more accurate and faster hand pose estimation; the hand gestures are then recognized from the extracted skeleton features fused with color features, which yields more accurate recognition. (d) In addition, a space HRI platform based on an astronaut assistant robot is initially constructed, and a set of space HRI gestures is designed to enable astronauts to control the robot through hand gestures. The specific work of this paper includes the following four aspects.

(1) Research on the dual-hand detection method based on deep learning. Considering that when humans observe other people's hands they focus not only on the characteristics of the hands themselves but also on the structural features of the human body, we simulate the way humans perform hand detection and localization by combining the results of hand detection and human body pose estimation. A parallel, dual-stream network framework is therefore designed. The two parallel sub-networks apply an improved SSD and a human body pose estimation method to extract the characteristics of the hand itself and of the human body structure, respectively. The results are then fused to obtain the final hand detection and localization result. To the best of our knowledge, this is the first time such a method has been used to detect and localize dual hands.

(2) Research on the hand gesture recognition method based on performance features. The performance features of hand gestures are the features that hand gestures exhibit in images, such as their color, shape, and size. This paper proposes a new hand gesture recognition method that uses a multi-stream deep fusion framework, which improves recognition accuracy by fusing different performance features of hand gestures. Specifically, the color and depth images of the hand gestures are merged and combined with a hand detection method to recognize multiple HRI hand gestures. In addition, a data consistency method is proposed to fuse EMG signals and image data that have different spatial structures, and it is used to recognize different hand gestures. The method is applied to the interactive control of manipulator grasping through hand gestures.

(3) Research on the hand gesture recognition method based on pose estimation. Estimating the hand pose yields more detailed features of hand gestures, which helps to recognize them. This paper proposes a hand gesture recognition method based on pose estimation. It uses hand detection to improve the speed of pose estimation, and it combines the pose features and performance features of hand gestures. The 3DCNN+ConvLSTM method is then used to extract dynamic hand gesture features, which improves the accuracy of hand gesture recognition.

(4) Research on the space robot interactive control method based on astronauts' hand gestures. For the interactive control of space robots, an astronaut assistant robot for use inside a space station cabin is designed, and a hand gesture-based HRI system platform is designed for the robot's control. For this platform, the interactive hand gesture set is designed, the coordinate systems of the astronaut's hands, the astronaut assistant robot, and the visual sensor are established, and the dynamics and kinematics of the robot's motion are analyzed. In addition, for the stability of the hand gesture-based HRI system, a dynamic mapping method based on a finite state machine is designed. The process from astronaut hand gesture operation to robot motion control is thus realized in these respects. Conceptual sketches of aspects (3) and (4) are given after this abstract.

The above four aspects of research provide a theoretical basis and technical support for hand gesture recognition and human-robot interaction for space robots.
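For aspect (3), the abstract names a 3DCNN+ConvLSTM pipeline for dynamic gestures. The sketch below is a minimal Python (PyTorch) illustration of that general combination, assuming illustrative layer sizes and class counts; it is not the thesis architecture, and the `ConvLSTMCell` and `GestureNet` classes are hypothetical.

```python
# Hypothetical sketch of a 3D-CNN + ConvLSTM feature extractor for
# dynamic hand gestures: a clip of T frames -> gesture class scores.
import torch
import torch.nn as nn

class ConvLSTMCell(nn.Module):
    """Minimal ConvLSTM cell: all gates computed by a single convolution."""
    def __init__(self, in_ch, hid_ch, k=3):
        super().__init__()
        self.hid_ch = hid_ch
        self.conv = nn.Conv2d(in_ch + hid_ch, 4 * hid_ch, k, padding=k // 2)

    def forward(self, x, state):
        h, c = state
        gates = self.conv(torch.cat([x, h], dim=1))
        i, f, o, g = gates.chunk(4, dim=1)
        i, f, o, g = torch.sigmoid(i), torch.sigmoid(f), torch.sigmoid(o), torch.tanh(g)
        c = f * c + i * g
        h = o * torch.tanh(c)
        return h, c

class GestureNet(nn.Module):
    """3D CNN over short clips, ConvLSTM over time, then a classifier."""
    def __init__(self, num_classes=10, hid_ch=32):
        super().__init__()
        self.cnn3d = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=(3, 3, 3), padding=1),
            nn.ReLU(),
            nn.MaxPool3d((1, 2, 2)),
        )
        self.cell = ConvLSTMCell(16, hid_ch)
        self.head = nn.Linear(hid_ch, num_classes)

    def forward(self, clip):                  # clip: (B, 3, T, H, W)
        feat = self.cnn3d(clip)               # (B, 16, T, H/2, W/2)
        B, C, T, H, W = feat.shape
        h = feat.new_zeros(B, self.cell.hid_ch, H, W)
        c = feat.new_zeros(B, self.cell.hid_ch, H, W)
        for t in range(T):                    # ConvLSTM over the time axis
            h, c = self.cell(feat[:, :, t], (h, c))
        return self.head(h.mean(dim=(2, 3)))  # pool space, classify

logits = GestureNet()(torch.rand(2, 3, 8, 64, 64))
print(logits.shape)                           # torch.Size([2, 10])
```

For aspect (4), the abstract describes a finite-state-machine-based dynamic mapping used to keep the gesture-to-robot mapping stable. The following sketch illustrates one plausible reading of that idea, under the assumption that a gesture must be observed for several consecutive frames before a command is issued; the gesture set, state names, and commands are invented for illustration and are not the thesis design.

```python
# Hypothetical finite-state-machine sketch for stable gesture-to-command
# mapping: a per-frame recognition result only triggers a robot command
# after it has been observed for several consecutive frames.
from collections import deque

GESTURE_TO_COMMAND = {          # illustrative gesture set, not the thesis set
    "open_palm": "STOP",
    "fist": "HOLD_POSITION",
    "point_left": "MOVE_LEFT",
    "point_right": "MOVE_RIGHT",
}

class GestureFSM:
    def __init__(self, confirm_frames=5):
        self.state = "IDLE"                       # IDLE -> CONFIRMING -> command
        self.history = deque(maxlen=confirm_frames)

    def step(self, gesture_label):
        """Feed one per-frame recognition result; return a command or None."""
        self.history.append(gesture_label)
        stable = (len(self.history) == self.history.maxlen
                  and len(set(self.history)) == 1)
        if self.state == "IDLE" and gesture_label in GESTURE_TO_COMMAND:
            self.state = "CONFIRMING"
        elif self.state == "CONFIRMING" and stable:
            self.state = "IDLE"                   # issue the command, then reset
            return GESTURE_TO_COMMAND.get(gesture_label)
        elif gesture_label not in GESTURE_TO_COMMAND:
            self.state = "IDLE"                   # unknown gesture resets the FSM
        return None

fsm = GestureFSM()
for label in ["none", "fist", "fist", "fist", "fist", "fist"]:
    cmd = fsm.step(label)
    if cmd:
        print("send to robot:", cmd)              # -> send to robot: HOLD_POSITION
```

The confirmation window plays the role of debouncing: spurious single-frame misclassifications never reach the robot, which is the stability property the abstract attributes to the FSM-based mapping.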
Language: Chinese
Contribution Rank: 1
Document Type: Thesis (学位论文)
Identifier: http://ir.sia.cn/handle/173321/25939
Collection: 空间自动化技术研究室
Recommended Citation (GB/T 7714):
高庆. 基于深度学习的手势识别及空间人机交互应用研究[D]. 沈阳: 中国科学院沈阳自动化研究所, 2019.
Files in This Item:
File Name/Size: 基于深度学习的手势识别及空间人机交互应用 (5992 KB)
DocType: 学位论文 (Thesis)
Access: 开放获取 (Open Access)
License: CC BY-NC-SA
Items in the repository are protected by copyright, with all rights reserved, unless otherwise indicated.