SIA OpenIR > Optoelectronic Information Technology Research Laboratory
A method to enhance images based on human vision property
Cai TF(蔡铁峰); Hao YM(郝颖明); Wu QX(吴清潇); Zhu F(朱枫)
Department: Optoelectronic Information Technology Research Laboratory
Conference Name: Proceedings of IEEE 11th International Conference on Signal Processing (ICSP 2012)
Conference Date: October 21-25, 2012
Conference Venue: Beijing, China
Conference Organizer: IEEE
Proceedings Title: Proceedings of IEEE 11th International Conference on Signal Processing
Publisher: IEEE
Place of Publication: New York, USA
Publication Year: 2012
Pages: 952-955
Indexed By: EI
EI Accession Number: 20131716242988
Affiliation Order: 1
ISBN: 978-1-4673-2197-6
Keywords: Human Vision Property; Grayscale Value Difference; Perception Degree
Abstract:

Image information for target detection and recognition is determined by the grayscale value relationships between pixels. The human eye cannot perceive nonzero grayscale value differences that are too small, so it may not perceive all of the image information available for target detection and recognition. To make this information easier for human observers to detect and recognize, this paper proposes a method that nonlinearly maps the grayscale values of pixels in the original image to new grayscale values in the enhanced image, guided by human vision properties.
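The abstract does not give the authors' mapping function itself. As an illustration only, the general idea of a nonlinear grayscale remapping can be sketched with a gamma curve, a common perceptually motivated choice; the function name and the gamma value below are assumptions, not the paper's method:

```python
import numpy as np

def nonlinear_grayscale_map(image, gamma=0.5):
    """Illustrative nonlinear grayscale remapping (gamma curve).

    NOT the paper's mapping; it only demonstrates nonlinearly
    redistributing 8-bit grayscale values so that small differences
    in dark regions are stretched and become easier to perceive.
    """
    img = np.asarray(image, dtype=np.float64)
    normalized = img / 255.0            # scale to [0, 1]
    enhanced = normalized ** gamma      # nonlinear remapping
    return np.clip(enhanced * 255.0, 0, 255).astype(np.uint8)
```

With gamma below 1 the curve is steep near zero, so small differences between dark pixels map to larger output differences, while the mapping stays monotone and preserves the ordering of grayscale values.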

Language: English
Document Type: Conference Paper
Identifier: http://ir.sia.cn/handle/173321/10172
Collection: Optoelectronic Information Technology Research Laboratory
Affiliations:
1. Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, China
2. Graduate School of the Chinese Academy of Sciences, Beijing, China
3. Key Laboratory of Opto-Electronic Information Processing, Chinese Academy of Sciences, Shenyang, China
Recommended Citation (GB/T 7714):
Cai TF, Hao YM, Wu QX, et al. A method to enhance images based on human vision property[C]//IEEE. New York, USA: IEEE, 2012: 952-955.
Files in This Item:
A Method to Enhance (241KB) — Conference Paper, Open Access, License: CC BY-NC-SA
File Name: A Method to Enhance Images Based on Human Vision Property.pdf
Format: Adobe PDF

Unless otherwise stated, all content in this system is protected by copyright, with all rights reserved.