Joint group kernel sparse coding based multimodal material perception and identification method
HE Kongfei, XIONG Pengwen, TONG Xiaobao
School of Information Engineering, Nanchang University, Nanchang 330031, China
Abstract: Machine vision-based methods often struggle to distinguish materials whose appearances are highly similar, so information from other modalities must be integrated to overcome the limitations of vision alone. To solve this problem, a set of similarity evaluation methods is first introduced, each chosen according to the properties of its modality. Second, a joint group kernel sparse coding method is used to fuse the multimodal data that vision alone cannot separate, and the solution procedure for this model is described in detail. Finally, comparative experiments were conducted on a public dataset containing 184 materials. The results show that the fusion framework achieves coarse, medium, and fine classification accuracies of 90.8%, 76.6%, and 73.4%, respectively, a significant improvement over the vision-only method.
Keywords: pattern recognition and intelligent system; sparse joint group lasso; multi-modality fusion; material classification
2020, 46(12): 129-134 | Received: 2020-11-17; Revised manuscript received: 2020-12-02
Funding: National Natural Science Foundation of China (61903175, 61663027); Program for Academic and Technical Leaders of Major Disciplines in Jiangxi Province (20204BCJ23006); Jiangxi Provincial Special Fund for Graduate Student Innovation (YC2019-S011, YC2020-S101)
Author biography: HE Kongfei (1994-), male, from Tongling, Anhui Province; master's student specializing in intelligent robotics
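To make the fusion model described in the abstract concrete, the following is a minimal sketch of joint group sparse coding with a group-sparsity pattern shared across modalities, solved by proximal gradient descent. It is a simplified linear stand-in for the paper's joint group kernel sparse coding, written against assumptions rather than the authors' formulation: the function names (prox_group, joint_group_sparse_code, classify), the parameter choices (lam, n_iter), and the ISTA-style solver are all illustrative. A kernelized version would replace the inner products D_m^T D_m and D_m^T y_m with evaluations of the per-modality similarity (kernel) functions the paper selects for each modality.

import numpy as np

def prox_group(C, groups, t):
    # Block soft-thresholding: proximal operator of t * sum_g ||C[g, :]||_F.
    # Zeroing or shrinking a whole group at once is what couples the
    # modalities: a material class is kept or discarded for all of them.
    out = np.zeros_like(C)
    for g in groups:
        norm = np.linalg.norm(C[g, :])
        if norm > t:
            out[g, :] = (1.0 - t / norm) * C[g, :]
    return out

def joint_group_sparse_code(ys, Ds, groups, lam=0.05, n_iter=300):
    # Solve  min_C  sum_m ||y_m - D_m C[:, m]||^2 + lam * sum_g ||C[g, :]||_F
    # by proximal gradient descent (ISTA with a block proximal step).
    # ys: one observation vector per modality; Ds: one dictionary per
    # modality, with columns grouped by material class via `groups`.
    M = len(ys)
    C = np.zeros((Ds[0].shape[1], M))
    lip = 2.0 * max(np.linalg.norm(D, 2) ** 2 for D in Ds)  # gradient Lipschitz bound
    for _ in range(n_iter):
        G = np.column_stack([Ds[m].T @ (Ds[m] @ C[:, m] - ys[m]) for m in range(M)])
        C = prox_group(C - (2.0 / lip) * G, groups, lam / lip)
    return C

def classify(ys, Ds, groups, C):
    # Predict the class whose atoms reconstruct every modality best.
    errs = [sum(np.linalg.norm(ys[m] - Ds[m][:, g] @ C[g, m]) ** 2
                for m in range(len(ys))) for g in groups]
    return int(np.argmin(errs))

if __name__ == "__main__":
    # Toy example: two modalities (say, vision and haptics), three classes.
    rng = np.random.default_rng(0)
    Ds = [rng.standard_normal((20, 9)), rng.standard_normal((15, 9))]
    Ds = [D / np.linalg.norm(D, axis=0) for D in Ds]        # unit-norm atoms
    groups = [np.arange(0, 3), np.arange(3, 6), np.arange(6, 9)]
    ys = [D[:, 3:6] @ rng.standard_normal(3) for D in Ds]   # sample from class 1
    C = joint_group_sparse_code(ys, Ds, groups)
    print(classify(ys, Ds, groups, C))  # should recover class 1

The design point worth noting is the Frobenius-norm group penalty: because each group spans every modality's coefficients, a candidate material is selected or rejected jointly across modalities, which is what lets tactile or other non-visual channels rule out visually confusable classes.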