Abstract

When an image contains multiple similar regions, the SIFT descriptors of those regions are similar, which results in many mismatches. To address this problem, an improved SIFT feature-matching algorithm that incorporates a global context vector is presented. First, feature points are detected in scale space. Then a descriptor is computed for each feature point, consisting of two parts: a SIFT descriptor that represents local properties and a global context vector that disambiguates locally similar features. Finally, the feature vectors are matched under the BBF (Best Bin First) search strategy, using Euclidean distance as the similarity metric. Experimental results show that the improved algorithm, which characterizes feature points over a larger region, reduces the mismatch probability on the test images from 19% to 11% and thus improves the matching results.