ORIGINAL ARTICLE

Hand-eye calibration and positioning for a robot drilling system

Qiang Zhan

Abstract: In practice, because of manufacturing error, the correspondence between an aircraft board and its mathematical model is unsatisfactory. A hand-eye vision system is therefore introduced to position the end effector, in order to improve the flexibility and robustness of a robot drilling system. This paper discusses the calibration and positioning of a hand-eye vision system for a robotic aircraft-board drilling system. Because the drill must be vertical to, and keep a fixed distance from, the aircraft board surface before drilling, the depth information of the hand-eye relationship is neglected, and by defining an intermediate scene coordinate system the hand-eye relationship between the robot coordinate system and the vision coordinate system is established. The position of a target point can then be described in the robot coordinate system using the calibrated hand-eye relationship, which provides the navigation information for the robot drilling system. Experimental results of the calibration and positioning of the hand-eye vision of a robot drilling system are provided, and the main factors that affect the positioning error are analyzed.

Keywords: Robot drilling system · Hand-eye vision · Calibration · Positioning

1 Introduction

With the development of industrial robot technology, industrial robots are being applied ever more widely [1]. Aircraft manufacturing also needs to upgrade its techniques by introducing robots [2]. The number of assembly holes in a large aircraft board can reach thousands, and the material (for example, titanium) is usually hard to machine, so drilling assembly holes is one of the key bottlenecks in aircraft manufacturing. Traditionally, assembly holes are drilled manually by workers, and the resulting long processing time and low precision greatly impair the efficiency, quality, and homogeneity of aircraft production.
As a new technique, a robot drilling system can remarkably improve drilling efficiency, quality, and homogeneity; it has been demonstrated that the efficiency of a robot drilling system is twice that of traditional manual drilling [3]. Therefore, robot drilling systems can play an important role in aircraft-board drilling. Usually, mathematical models of the manufactured object and the external environment must be built for the robot drilling system; because of manufacturing error, the correspondence between an aircraft board and its mathematical model is commonly unsatisfactory, and drilling accuracy cannot be ensured by depending on the mathematical model alone. To realize self-positioning and automatic drilling, a vision system is necessary for a robot drilling system. At present, aircraft manufacturers such as Boeing and Airbus already use flexible robot drilling systems for hole processing in aircraft assembly. The ONCE (One-Sided Cell End effector) robot drilling system has been used successfully in the manufacturing line of the F/A-18E aircraft. The ONCE system can precisely locate a work piece by using a laser vision system and many other expensive pieces of equipment, but its disadvantage is high cost [4]. If a camera-based vision system is used to assist drilling and positioning, the cost is greatly reduced, so robot drilling systems can be widely popularized. Hand-eye vision can play an important role in a robot drilling system, but the calibration of hand-eye vision is a difficulty in real applications. Therefore, a great deal of research on calibration methods for hand-eye vision has been done.

Q. Zhan (*) · X. Wang
Beihang University, Beijing, China
e-mail: qzhan

Int J Adv Manuf Technol (2012) 61:691–701
DOI 10.1007/s00170-011-3741-4
In early research, a three-dimensional calibration object with known structure was photographed from different directions by controlling the robot arm's movement; the constraint on the hand-eye relationship was then established, and the intrinsic and extrinsic parameters of the camera were obtained by solving equations [5, 6]. The limited production techniques of early lenses resulted in large lens distortion, so the errors of those early calibration methods could be large if lens distortion was not considered. Later research took lens radial distortion [7] and non-linear optimization [8, 9] into account, improving the efficiency and precision of calibration. Additionally, some studies obtained not only the hand-eye relation but also the relation between the robot base and the world coordinate system by combining traditional calibration methods with robot quaternion equations. However, those methods still cannot avoid computing a large number of homogeneous matrices of the form AX = XB [6]. Besides, some of those methods need expensive auxiliary equipment for calibration, and the calibration process is usually cumbersome. To this end, researchers began to study how to simplify the calculations and avoid expensive auxiliary equipment [10–15]. Self-calibration methods based on active vision can realize calibration without expensive auxiliary equipment; they are low cost and have a simple calibration process, but their accuracy is lower than that of traditional calibration methods [16–20]. In a word, the trade-off between computational complexity and calibration precision has not been well resolved until now. When a robot drilling system is working, the drill in the end effector must be perpendicular to the surface of the work piece, and the distance between the drill bit and the surface of the work piece is constant, so the depth information of vision is not necessary for the robot drilling system.
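The classical hand-eye constraint mentioned above (commonly written AX = XB, where A is a robot-gripper motion, B the camera motion it induces, and X the unknown hand-eye transform) can be illustrated with a short sketch. All numbers here are invented for illustration; this is not the calibration method proposed in this paper, only the constraint that traditional methods must solve repeatedly.

```python
import numpy as np

def planar_transform(theta, tx, ty):
    """Homogeneous 2-D transform: rotation by theta, then translation (tx, ty)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, tx],
                     [s,  c, ty],
                     [0,  0,  1]])

# Hypothetical hand-eye transform X (camera pose in the gripper frame).
X = planar_transform(0.3, 10.0, -5.0)

# A: a gripper motion measured by the robot; B: the camera motion it induces.
A = planar_transform(0.8, 25.0, 4.0)
B = np.linalg.inv(X) @ A @ X   # by construction, so the constraint holds

# The hand-eye constraint AX = XB is satisfied.
assert np.allclose(A @ X, X @ B)
```

Traditional calibration stacks many such (A, B) pairs and solves for X, which is where the heavy homogeneous-matrix computation comes from.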
Furthermore, the depth information of monocular vision is commonly difficult and time consuming to calculate. To meet the requirements of the robot drilling system, a highly efficient and accurate calibration method is important. Therefore, according to the characteristics of industrial robots, this paper presents a calibration method that omits the depth information of hand-eye vision, so the hand-eye relation can be simplified to a two-dimensional relationship. A scene coordinate system is defined as an intermediate coordinate system to set up the relationship between the camera imaging coordinate system and the robot end effector, namely the hand-eye relationship. Most calibration methods calculate the intrinsic and extrinsic parameters of a camera simultaneously, which requires complex matrix transformations among multiple coordinate systems. By contrast, this paper obtains only the extrinsic parameters of the camera, so the calculation is greatly simplified. This paper is organized as follows. The calibration method is presented in Section 2 and experimental results are given in Section 3. The primary error analysis of positioning is presented in Section 4, and a conclusion is given in Section 5.

2 The calibration and positioning method for the hand-eye relationship

Traditional calibration methods use coordinate transformation relationships to resolve the intrinsic and extrinsic parameters of a camera, which are then fed into the coordinate transformation again when locating the position of the robot's end effector [21, 22]. The complex and massive matrix computation of those methods leads to rounding error and low positioning accuracy.
To avoid the above shortcomings of those calibration methods, an approach unifying hand-eye relationship calibration and robot end-effector location is proposed; it realizes the positioning of the robot end effector with an indirect hand-eye relationship rather than a direct hand-eye coordinate transformation, so the calibration is greatly simplified.

2.1 Camera imaging model

The complexity of the optical camera imaging model is one of the key aspects that affect the complexity of a calibration method. The pinhole imaging model is widely used and its geometrical relationship is linear [23]. Due to lens production techniques, the actual image is affected by various non-ideal factors such as lens distortion. However, for a real robot drilling system the vision scope is small (30–40 mm) and the distance between the camera imaging plane and the work-piece plane is short (less than 200 mm); together with the well-controlled distortion of an industrial lens, the positioning accuracy is not much impacted by adopting the ideal pinhole imaging model. Furthermore, the aircraft board to be drilled is a large plane, and when the camera optical axis is vertical to it the change of plane depth is very small compared with the photographing distance, so a fixed depth value can be used and the perspective model is simplified to the weak perspective model [11].

2.2 Determining the hand-eye relationship

Figure 1 shows the coordinate systems of a robot drilling system. The work piece coordinate system is defined as OwXwYwZw, the end effector (tool) coordinate system as OEXEYEZE, the camera imaging coordinate system as OCXCYCZC, and the base coordinate system of the robot as ORXRYRZR. By calibration, the relationship between the tool coordinate system (i.e.,
the robot end effector coordinate system) and the work piece coordinate system, and the relationship between the camera imaging coordinate system and the work piece coordinate system, can both be obtained; the relationship between the tool coordinate system and the camera imaging coordinate system can then be obtained indirectly, and thus the hand-eye relationship is obtained. First, the position and pose of the tool coordinate system in the robot base coordinate system need to be attained. Commercial industrial robots usually adopt the "4-point" method to calibrate the tool coordinate system; it calibrates the drill tip of the end effector as the TCP (tool center point), which is the origin of the tool coordinate system [24]. Then, the relationship between the camera imaging coordinate system and the work piece coordinate system is obtained using a calibration template. The top-left vertex of the image is chosen as the origin of the camera imaging coordinate system. Here, only the extrinsic parameters of the camera, i.e., the hand-eye relationship, need to be calculated, so the pixel coordinate system does not need to be established, and the dimensions of a pixel unit and the angle between the pixel unit and the pixel coordinate system can be ignored. Equation 1 denotes the relationship between a point in the work piece coordinate system and the corresponding point in the imaging coordinate system:

\begin{bmatrix} x_c \\ y_c \\ z_c \\ 1 \end{bmatrix} = \begin{bmatrix} R & P \\ 0 & 1 \end{bmatrix} \begin{bmatrix} x_w \\ y_w \\ z_w \\ 1 \end{bmatrix} \quad (1)

where (x_w, y_w, z_w) are the coordinates of a point in the work piece coordinate system and (x_c, y_c, z_c) are the coordinates of the corresponding point in the image coordinate system; R and P are the rotation matrix and the translation vector between the work piece coordinate system and the imaging coordinate system, respectively. According to the drilling technique, the drill tip should be vertical to the work piece plane, so positioning is needed in only two dimensions.
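Equation 1 is a standard homogeneous rigid-body transform. A minimal numerical sketch (with R and P invented for illustration, not taken from the paper's experiments) shows how a work-piece point maps into the imaging frame:

```python
import numpy as np

# Hypothetical extrinsics: rotation R (90 degrees about Z) and translation P.
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0],
              [np.sin(theta),  np.cos(theta), 0],
              [0,              0,             1]])
P = np.array([5.0, -2.0, 100.0])

# Assemble the 4x4 homogeneous matrix [[R, P], [0, 1]] of Eq. 1.
T = np.eye(4)
T[:3, :3] = R
T[:3, 3] = P

pw = np.array([1.0, 0.0, 0.0, 1.0])   # point in the work piece frame
pc = T @ pw                           # the same point in the imaging frame
# pc is approximately [5.0, -1.0, 100.0, 1.0]
```

Because the paper later fixes the depth and makes the optical axis perpendicular to the board, this full 3-D transform collapses to the 2-D scale-and-rotate relationship used in the rest of Section 2.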
On the basis of the weak perspective model, if the optical axis of the camera is vertical to the work piece plane the rotation matrix can be simplified. Accordingly, Eq. 1 reduces to a two-dimensional relationship, which simplifies the calibration. Figure 2 is the schematic diagram of the calibration method for the hand-eye relationship on a two-dimensional plane. As shown in Fig. 2, point p is the TCP of the end effector and also the origin of the tool coordinate system. X'_W and Y'_W are two axes parallel to the X- and Y-axes of the work piece coordinate system, respectively. According to the pinhole imaging principle and the weak perspective model, the dimensions of the photographed region are fixed when the photographing distance of the camera is fixed. Therefore, a scene coordinate system X_sO_sY_s can be established whose origin resides on the work piece and whose two axes are parallel to those of the camera imaging coordinate system.

Fig. 1 Coordinate systems of a robot drilling system

In Fig. 2, P_1S and P_2S are two points of the scene coordinate system located on the photographed plane; suppose their coordinates are (x_1S, y_1S) and (x_2S, y_2S), respectively. Corresponding to P_1S and P_2S, P_1C and P_2C are two points in the camera imaging coordinate system with coordinates (x_1C, y_1C) and (x_2C, y_2C), respectively. In the camera imaging coordinate system, the angle between line P_1C P_2C and the X_c axis is

\gamma = \tan^{-1}\frac{y_{1C} - y_{2C}}{x_{1C} - x_{2C}}.

Because of the linear and parallel relationship between the imaging coordinate system and the scene coordinate system, the angle between line P_1S P_2S and the X_s axis is also γ. Points P_1S and P_2S can be touched by the TCP through robot teaching; then the distance between P_1S and P_2S and the angle θ between line P_1S P_2S and the X'_W axis can both be obtained from the coordinate transformation and calculation of the industrial robot, so the rotation angle between the scene coordinate system and the work piece coordinate system is φ = θ − γ. Simultaneously, the proportion coefficient at this photographing distance is k = ‖P_1S P_2S‖ / ‖P_1C P_2C‖, where ‖P_1C P_2C‖ is the pixel distance between the two points in the imaging coordinate system and ‖P_1S P_2S‖ is the distance between the two points in the scene coordinate system. Upon that, the coordinates of P_1S and P_2S in the scene coordinate system can be calculated with

\begin{bmatrix} x_S \\ y_S \end{bmatrix} = k \begin{bmatrix} x_C \\ y_C \end{bmatrix}.

When photographing and touching a point with the TCP, the position of the TCP can be obtained from the industrial robot system, so at shooting time the distance ‖pP_1S‖ between the TCP and the touched point, as well as the angle δ between line pP_1S and the X'_W axis, can be obtained. Because the angle φ between the X-axis of the scene coordinate system and that of the work piece coordinate system has been calculated, the angle between line pP_1S and the X-axis of the scene coordinate system can also be calculated with ψ = δ − φ. Then D = ‖pP_1S‖·cos ψ and H = ‖pP_1S‖·sin ψ can be obtained by decomposing ‖pP_1S‖ along the two axes of the scene coordinate system. After the coordinates (x_1S, y_1S) of P_1S in the scene coordinate system are obtained, the position of the TCP (i.e., the coordinates of point p shown in Fig. 2) in the scene coordinate system can be calculated:

\begin{bmatrix} x_p \\ y_p \end{bmatrix} = \begin{bmatrix} d \\ h \end{bmatrix} = \begin{bmatrix} x_{1S} - D \\ H + y_{1S} \end{bmatrix} = \begin{bmatrix} k x_{1C} - \|pP_{1S}\| \cos(\delta + \gamma - \theta) \\ \|pP_{1S}\| \sin(\delta + \gamma - \theta) + k y_{1C} \end{bmatrix} \quad (2)

Because the linear proportional parameter k has been obtained above, the relationship between the TCP and the camera imaging coordinate system is obtained indirectly through the scene coordinate system; in other words, the relative hand-eye relationship is obtained.

2.3 Positioning of the robot end effector

Different from general calibration methods that obtain the absolute hand-eye relationship, the calibration method proposed in this paper obtains the relative relationship between the TCP and the camera.

Fig. 2 The hand-eye relationship on a plane

In order to guarantee precise positioning, the distance and relative pose between the camera and the photographed plane (including the calibration board and the work piece) should be the same in both positioning and calibration. When drilling, the end effector should be perpendicular to the work piece plane, so if the camera is perpendicular to the calibration board plane when calibrating, the pose between the camera and the photographed plane can be the same in both calibration and positioning. The condition that the end effector is perpendicular to the work piece plane can be ensured according to the pose of the work piece obtained through the "3-point" calibration method of the industrial robot system. Proper assembly can ensure that the pose of the camera coincides with that of the end effector. If the conditions on pose and photographing distance are satisfied, the end effector can realize positioning with the calibrated data. The positioning method is shown in Fig. 3. Provided the pose and photographing distance are kept the same as in calibration, the proportion coefficient k, the relative pose between the work piece coordinate system and the scene coordinate system, and the position coordinates of the TCP in the scene coordinate system all remain fixed. In Fig. 3, X'_W and Y'_W are the two axes parallel to those of the work piece coordinate system. When the camera photographs point A, the coordinates (x_AC, y_AC) of the corresponding point in the camera imaging coordinate system can be obtained; then the coordinates of point A in the scene coordinate system are obtained by (x_AS, y_AS) = k·(x_AC, y_AC). Suppose point P is the projection of the TCP onto the scene coordinate plane, whose coordinates in the scene coordinate system are (d, h) (see Eq. 2), as shown in Fig. 3.
Then the projections of the line connecting point A and the TCP onto the two axes of the scene coordinate system are H = x_AS − d and D = h − y_AS. According to the geometric relationship, the distance from point A to the TCP and the angle between the X-axis of the scene coordinate system and the line from A to the TCP can both be attained: ‖PA‖ = \sqrt{H^2 + D^2}, ψ = \tan^{-1}(H/D). Then the angle between the X'_W axis of the work piece coordinate system and the line from A to the TCP is δ = ψ + φ, and the offsets along the two axes can be calculated by decomposing ‖PA‖ as ‖PA‖·cos δ and ‖PA‖·sin δ, so the coordinates of point A in the work piece coordinate system are the sum of the current coordinates of the TCP and the offsets. It can be seen that any point on the work piece plane can be located by using this method.
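The positioning step described above can be sketched numerically. This is only a sketch under the paper's stated assumptions (fixed photographing distance, camera perpendicular to the board); the function name `locate_point` and every numeric value below are invented for illustration, and the axis-sign conventions follow the formulas as reconstructed from the text.

```python
import numpy as np

def locate_point(xa_c, ya_c, k, phi, d, h, tcp_work):
    """Map image point A to work piece coordinates using the calibrated data:
    k     - scene-units-per-pixel scale from calibration
    phi   - rotation between scene and work piece frames
    (d,h) - TCP position in the scene frame (Eq. 2)
    tcp_work - current TCP position in the work piece frame."""
    # Image -> scene coordinates (weak perspective: a pure scaling).
    xa_s, ya_s = k * xa_c, k * ya_c
    # Projections of the line from A to the TCP on the scene axes.
    H = xa_s - d
    D = h - ya_s
    dist = np.hypot(H, D)
    psi = np.arctan2(H, D)      # paper: psi = arctan(H / D)
    delta = psi + phi           # angle w.r.t. the work piece X' axis
    offset = dist * np.array([np.cos(delta), np.sin(delta)])
    return np.asarray(tcp_work) + offset

# With invented calibration results, an image point maps to work coordinates:
a_work = locate_point(xa_c=320, ya_c=240, k=0.05, phi=0.1,
                      d=12.0, h=20.0, tcp_work=(500.0, 250.0))
```

As a sanity check, an image point whose scene coordinates coincide with the TCP's (i.e., k·x_AC = d and k·y_AC = h) yields zero offset, so the function returns the TCP's current work-piece coordinates unchanged.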
