This study presents a motion-blur object tracking method, built on ECO_HC, that combines a histogram of oriented gradients and hue-saturation (HOGHS) feature with Zernike moment features. The method addresses object blur caused by motion of the camera or of the object itself. HOGHS is constructed by integrating fHOG with color features, and the properties of Zernike moments are introduced; the object is represented by combining HOGHS and Zernike moments. Furthermore, a novel quality-evaluation method for the response map is proposed that considers both positioning accuracy and robustness. Based on this evaluation, an adaptive fusion strategy that exploits the complementary properties of HOGHS and Zernike moments is implemented. Experiments were performed on the motion-blur sequences of the OTB-100 dataset, and our method was compared with four state-of-the-art trackers. The precision and success rate are 0.849 and 0.827, respectively, and the frame rate is 38.4 frames/s. The proposed method outperforms ECO_HC, the top-performing tracker of VOT-2016, with relative gains of 2.3% in Pre-20 and 2.4% in area under the curve. These results demonstrate that the proposed method effectively tracks objects under motion blur.
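The Zernike moment features mentioned above can be illustrated with a minimal sketch. This is not the paper's implementation; it only shows the standard definition: a patch is mapped onto the unit disk and the rotation-invariant magnitude |A_nm| is computed from the radial polynomial R_nm.

```python
import numpy as np
from math import factorial

def zernike_moment(patch, n, m):
    """Rotation-invariant magnitude |A_nm| of a grayscale patch
    mapped onto the unit disk (unnormalized illustrative version)."""
    h, w = patch.shape
    y, x = np.mgrid[-1:1:h * 1j, -1:1:w * 1j]
    rho = np.hypot(x, y)
    theta = np.arctan2(y, x)
    mask = rho <= 1.0                      # keep pixels inside the unit disk
    # radial polynomial R_nm(rho)
    R = np.zeros_like(rho)
    for s in range((n - abs(m)) // 2 + 1):
        c = ((-1) ** s * factorial(n - s)
             / (factorial(s)
                * factorial((n + abs(m)) // 2 - s)
                * factorial((n - abs(m)) // 2 - s)))
        R += c * rho ** (n - 2 * s)
    V_conj = R * np.exp(-1j * m * theta)   # conjugate basis function V*_nm
    A = (n + 1) / np.pi * np.sum(patch[mask] * V_conj[mask])
    return abs(A)
```

Because only the magnitude is kept, the feature is insensitive to in-plane rotation of the patch, which is what makes Zernike moments complementary to gradient-based HOGHS under blur.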
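The response-map quality evaluation and adaptive fusion described above can be sketched as follows. The peak-times-APCE score and the normalized weighting used here are illustrative assumptions, not the paper's exact criterion:

```python
import numpy as np

def response_quality(resp):
    """Score a correlation response map: the peak value captures
    positioning accuracy, and an APCE-style sharpness term captures
    robustness (illustrative stand-in for the paper's criterion)."""
    peak = resp.max()
    apce = (peak - resp.min()) ** 2 / np.mean((resp - resp.min()) ** 2)
    return peak * apce

def fuse_responses(resp_hoghs, resp_zernike):
    """Adaptively weight the two feature channels' response maps
    by their quality scores before locating the target."""
    q1 = response_quality(resp_hoghs)
    q2 = response_quality(resp_zernike)
    w1 = q1 / (q1 + q2)                    # weights sum to 1
    return w1 * resp_hoghs + (1 - w1) * resp_zernike
```

A channel whose response map is sharp and unimodal receives a higher weight, so the fused map is dominated by whichever feature (HOGHS or Zernike) is more reliable on the current frame.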