Nanotechnology and Microsystems Laboratory, Shijiazhuang Campus, Army Engineering University, Shijiazhuang 050003, Hebei, China
WANG Guang-long (1964-), male, born in Sishui, Shandong, professor and doctoral supervisor. He received his B.S. degree from Ordnance Engineering College in 1986, his M.S. degree from North China Institute of Technology in 1989, and his Ph.D. degree from Beijing Institute of Technology in 1999. His research interests include embedded systems and machine vision. E-mail: ysdmhj@163.com
TIAN Jie (1990-), female, born in Zhaodong, Heilongjiang, Ph.D. candidate. She received her B.S. degree from Harbin Institute of Technology in 2013 and her M.S. degree from Ordnance Engineering College in 2015. Her research interests include computer vision, object detection, and tracking. E-mail: 985459288@qq.com
Received: 2018-10-16
Accepted: 2018-12-14
Published in print: 2019-05-15
Guang-long WANG, Jie TIAN, Wen-jie ZHU, et al. Feature fusion and weight adaptive updating based motion blur object tracking[J]. Optics and Precision Engineering, 2019, 27(5): 1158-1166. DOI: 10.3788/OPE.20192705.1158.
Abstract: To address object blur caused by camera or object motion in tracking tasks, this study proposes a motion blur object tracking algorithm based on ECO_HC that combines the Histogram of Oriented Gradient and Hue Saturation (HOGHS) feature with Zernike moment features. First, HOGHS was constructed by integrating fHOG with color features, the properties of Zernike moments were introduced, and the object was represented by combining HOGHS and Zernike moments. Then, a novel response-map quality evaluation method that considers both positioning accuracy and robustness was proposed, and on its basis the weights of the HOGHS and Zernike moment features were fused adaptively. The proposed algorithm was compared with four state-of-the-art trackers on the motion-blur sequences of the OTB-100 benchmark. Its precision and success rate are 0.849 and 0.827, respectively, at a frame rate of 38.4 frame/s. Under the same conditions, it improves on ECO_HC, a top performer on VOT-2016, by 2.3% in Pre-20 and 2.4% in area under the curve (AUC). The experimental results show that the proposed algorithm can effectively accomplish motion blur object tracking.
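The adaptive fusion step described above can be sketched as follows. The paper's exact quality formula is not given in this abstract, so the sketch assumes a common surrogate: a score combining average peak-to-correlation energy (APCE, a robustness cue) with a peak-sharpness ratio (a positioning-accuracy cue), with the two feature channels' response maps fused using weights proportional to their scores. The names `response_quality` and `fuse_responses` are hypothetical, not from the paper.

```python
import numpy as np

def response_quality(r):
    """Hypothetical quality score for a correlation response map,
    combining APCE (robustness) with peak sharpness (accuracy)."""
    peak, floor = r.max(), r.min()
    # APCE: squared peak height over mean squared fluctuation.
    apce = (peak - floor) ** 2 / (np.mean((r - floor) ** 2) + 1e-12)
    # Sharpness: how far the peak stands above the map's mean level.
    sharpness = (peak - r.mean()) / (r.std() + 1e-12)
    return apce * sharpness

def fuse_responses(r_hoghs, r_zernike):
    """Weight each feature channel's response by its relative quality."""
    q1 = response_quality(r_hoghs)
    q2 = response_quality(r_zernike)
    w1 = q1 / (q1 + q2)
    return w1 * r_hoghs + (1.0 - w1) * r_zernike
```

A sharp, unimodal response map thus dominates the fused map, while a flat or noisy channel (e.g. a blurred frame degrading one feature) is automatically down-weighted.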
WU Y, LIM J, YANG M. Online object tracking: a benchmark[C]. IEEE Conference on Computer Vision and Pattern Recognition, 2013: 2411-2418.
LI Y, YANG D D, MAO N, et al. Response adaptive tracking based on convolution neural network[J]. Chinese Journal of Liquid Crystals and Displays, 2018, 33(7): 596-605. (in Chinese)
NAH S, KIM T H, LEE K M. Deep multi-scale convolutional neural network for dynamic scene deblurring[C]. IEEE Conference on Computer Vision and Pattern Recognition, 2017: 257-265.
SHI W, CABALLERO J, HUSZAR F, et al. Real-time single image and video super-resolution using an efficient sub-pixel convolutional neural network[C]. IEEE Conference on Computer Vision and Pattern Recognition, 2016: 1874-1883.
ZITA A, FLUSSER J, SUK T, et al. Feature selection on affine moment invariants in relation to known dependencies[C]. International Conference on Computer Analysis of Images and Patterns, 2017: 285-295.
LIU J M, GUO J W, SHI SH. Correlation filter tracking based on adaptive learning rate and location refiner[J]. Opt. Precision Eng., 2018, 26(8): 2100-2111. (in Chinese)
HENRIQUES J F, CASEIRO R, MARTINS P, et al. Exploiting the circulant structure of tracking-by-detection with kernels[J]. Lecture Notes in Computer Science, 2012, 7575(1): 702-715.
HENRIQUES J F, CASEIRO R, MARTINS P, et al. High-speed tracking with kernelized correlation filters[J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2015, 37(3): 583-596.
BERTINETTO L, VALMADRE J, GOLODETZ S, et al. Staple: complementary learners for real-time tracking[C]. Computer Vision and Pattern Recognition, 2016: 1401-1409.
VALMADRE J, BERTINETTO L, HENRIQUES J F, et al. End-to-end representation learning for correlation filter based tracking[C]. Computer Vision and Pattern Recognition, 2017: 5000-5008.
DANELLJAN M, KHAN F S, FELSBERG M, et al. Adaptive color attributes for real-time visual tracking[C]. IEEE Conference on Computer Vision and Pattern Recognition, 2014: 1090-1097.
DANELLJAN M, HAGER G, KHAN F S, et al. Discriminative scale space tracking[J]. IEEE Transactions on Pattern Analysis & Machine Intelligence, 2017, 39(8): 1561-1575.
DANELLJAN M, HAGER G, KHAN F S, et al. Adaptive decontamination of the training set: a unified formulation for discriminative visual tracking[C]. Computer Vision and Pattern Recognition, 2016: 1430-1438.
YANG D D, MAO N, YANG F C, et al. Improved SRDCF object tracking via the Best-Buddies similarity[J]. Opt. Precision Eng., 2018, 26(2): 492-502. (in Chinese)
DANELLJAN M, ROBINSON A, KHAN F S, et al. Beyond correlation filters: learning continuous convolution operators for visual tracking[C]. European Conference on Computer Vision, 2016: 472-488.
DANELLJAN M, BHAT G, KHAN F S, et al. ECO: efficient convolution operators for tracking[C]. Computer Vision and Pattern Recognition, 2017: 6931-6939.
WANG N, SHI J, YEUNG D Y, et al. Understanding and diagnosing visual tracking systems[C]. ICCV, 2015: 3101-3109.
WANG W, WANG CH P, LI J, et al. Correlation filter tracking based on feature fusing and model adaptive updating[J]. Opt. Precision Eng., 2016, 24(8): 2059-2066. (in Chinese)
FORSYTH D. Object detection with discriminatively trained part-based models[J]. Computer, 2014, 47(2): 6-7.
BHAT G, JOHNANDER J, DANELLJAN M, et al. Unveiling the power of deep tracking[J]. arXiv preprint arXiv: 1804.06833, 2018.
TAHMASBI A, SAKI F, SHOKOUHI S B. Classification of benign and malignant masses based on Zernike moments[J]. Computers in Biology and Medicine, 2011, 41(8): 726-735.
SAKI F, TAHMASBI A, SOLTANIAN-ZADEH H, et al. Fast opposite weight learning rules with application in breast cancer diagnosis[J]. Computers in Biology and Medicine, 2013, 43(1): 32-41.
HAO M, MA SH SH, HAO X D. Potato shape detection based on Zernike moments[J]. Transactions of the CSAE, 2010, 26(2): 347-350. (in Chinese)