1. School of Instrumentation Science and Opto-electronics Engineering, Beihang University; 2. Department of Measurement and Control, School of Instrumentation Science and Opto-electronics Engineering, Beihang University; 3. Dean, School of Instrumentation Science and Opto-electronics Engineering, Beihang University
Received: 2010-12-27
Revised: 2011-01-27
Published online: 2011-10-21
Published in print: 2011-08-25
WEI Zhen-zhong, ZHANG Bo, ZHANG Guang-jun. Rapid hand-eye calibration in a dual-robot system [J]. Optics and Precision Engineering, 2011, 19(8): 0-0.
To solve the hand-eye calibration problem in a dual-robot vision simulation measurement system, a method is proposed that derives the hand-eye relationship from the flange pose solved by machine vision. The target robot is moved to a suitable pose and the vision robot's camera captures an image of its flange; the elliptical contour of the flange is extracted from the image, and the flange orientation and circle-center coordinates are computed in the camera coordinate system. The position constraint of the pin hole on the flange then gives the transformation H1 between the camera coordinate system and the target flange coordinate system. The transformations H2 and H4 between each robot's flange coordinate system and its base coordinate system are read from the robot controllers, and the transformation H3 between the two robots' base coordinate systems is obtained from single-axis rotations of the robots, so that the loop closes and the hand-eye relationship is HCG = H4·H3·H2·H1. Moving the flange to several coplanar positions and capturing an image at each one allows the calibration accuracy to be improved by image fusion. Experimental results show that the accuracies of single-position calibration and multi-position image-fusion calibration are 0.345° and 0.187°, respectively, which satisfies the accuracy requirements of the dual-robot vision simulation measurement system.
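To make the transformation chain concrete, the following is a minimal sketch of composing the four transforms into the hand-eye relationship HCG = H4·H3·H2·H1. It is not the authors' implementation: the 4x4 homogeneous-matrix representation, the frame-direction comments, and the placeholder numerical transforms are assumptions added for illustration.

import numpy as np

def make_transform(R, t):
    # Build a 4x4 homogeneous transform from a 3x3 rotation R and a 3-vector t.
    H = np.eye(4)
    H[:3, :3] = R
    H[:3, 3] = t
    return H

def hand_eye(H1, H2, H3, H4):
    # Compose the chain HCG = H4 * H3 * H2 * H1 described in the abstract.
    # H1: camera frame <-> target flange frame (from the ellipse / pin-hole fit)
    # H2: target flange frame <-> target robot base frame (controller reading)
    # H3: between the two robot base frames (from single-axis rotations)
    # H4: vision robot base frame <-> its flange frame (controller reading)
    # The exact direction of each transform follows the paper's own definitions.
    return H4 @ H3 @ H2 @ H1

def rot_z(angle):
    # Rotation about the z axis, used here only to build placeholder examples.
    c, s = np.cos(angle), np.sin(angle)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

if __name__ == "__main__":
    # Hypothetical placeholder transforms, purely to demonstrate the composition.
    H1 = make_transform(rot_z(0.10), [0.02, 0.00, 0.30])
    H2 = make_transform(rot_z(0.50), [0.40, 0.10, 0.60])
    H3 = make_transform(rot_z(0.00), [1.20, 0.00, 0.00])
    H4 = make_transform(rot_z(-0.30), [0.35, -0.05, 0.55])
    print(np.round(hand_eye(H1, H2, H3, H4), 4))

In practice, the per-position hand-eye estimates obtained at the coplanar flange positions would then be fused (for example, by combining the rotation and translation estimates across images) to reach the improved accuracy reported above.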