ORIGINAL ARTICLE

Hand-eye calibration and positioning for a robot drilling system

Qiang Zhan, Xiang Wang

Received: 7 September 2010 / Accepted: 3 November 2011 / Published online: 22 November 2011
© Springer-Verlag London Limited 2011

Abstract
In aircraft manufacturing, drilling the large number of assembly holes in an aircraft board is one of the key bottlenecks of production efficiency. To improve the efficiency and quality of assembly-hole manufacturing, replacing manual operation with a robot drilling system is becoming increasingly urgent. Normally, a robot system needs accurate mathematical models of the manufactured object and the environment when it is working; in practice, because of manufacturing error, the aircraft board rarely matches its mathematical model closely. A hand-eye vision system is therefore introduced to realize the positioning of the end effector and to improve the flexibility and robustness of a robot drilling system. This paper discusses the calibration and positioning of a hand-eye vision system for a robotic aircraft-board drilling system. Because the drill must be perpendicular to the aircraft board surface and keep a fixed distance from it before drilling, the depth information of the hand-eye relationship is neglected, and by defining an intermediate scene coordinate system the hand-eye relationship between the robot coordinate system and the vision coordinate system is established. The position of a target point can then be described in the robot coordinate system by using the calibrated hand-eye relationship, providing navigation information for the robot drilling system. Experimental results of the calibration and positioning of the hand-eye vision of a robot drilling system are provided, and the main factors that affect the positioning error are analyzed.

Keywords: Robot drilling system · Hand-eye vision · Calibration · Positioning

1 Introduction

With the development of industrial robot technology, the application of industrial robots is spreading widely [1].
Aircraft manufacturing also needs to upgrade its techniques by introducing robots [2]. The number of assembly holes in a large aircraft board can reach thousands, and the material (for example, titanium) is usually hard to machine, so drilling assembly holes is one of the key bottlenecks in aircraft manufacturing. Traditionally, assembly holes are drilled manually by workers, and the resulting long processing time and low precision greatly impair the efficiency, quality, and homogeneity of aircraft production. As a newer technique, a robot drilling system can remarkably improve drilling efficiency, quality, and homogeneity; it has been demonstrated that the efficiency of a robot drilling system is twice that of traditional manual drilling [3]. Robot drilling systems can therefore play an important role in aircraft board drilling. Usually, mathematical models of the manufactured object and the external environment must be built for the robot drilling system; because of manufacturing error, the aircraft board commonly deviates from its mathematical model, and drilling accuracy cannot be ensured by depending on the mathematical model alone. In order to realize self-positioning and automatic drilling, a vision system is necessary for a robot drilling system. At present, aircraft manufacturers such as Boeing and Airbus already use flexible robot drilling systems for hole processing in aircraft assembly. The ONCE (One Sided Cell End effector) robot drilling system has been used successfully in the manufacturing line of the F/A-18E aircraft. The ONCE system can precisely locate a work piece by using a laser vision system and many other expensive pieces of equipment, but its disadvantage is high cost [4].

Q. Zhan, X. Wang, Beihang University, Beijing, China. Int J Adv Manuf Technol (2012) 61:691-701. DOI 10.1007/s00170-011-3741-4
If a camera-based vision system is used to assist drilling and positioning, the cost will be reduced greatly, so that the robot drilling system can be widely popularized. Hand-eye vision can play an important role in the robot drilling system, but the calibration of hand-eye vision is a difficulty in real applications, and a great deal of research on hand-eye calibration methods has therefore been done. In early research, by controlling the robot arm's movement, a three-dimensional calibration object with known structure is photographed from different directions; the constrained hand-eye relationship is then established, and the intrinsic and extrinsic parameters of the camera are obtained by solving equations [5, 6]. The limited production techniques of early lenses resulted in large lens distortion, so the calibration errors of those early methods may be large if lens distortion is not considered. Later work considered lens radial distortion [7] and non-linear optimization [8, 9], improving the efficiency and precision of calibration. Additionally, some studies obtain not only the hand-eye relation but also the relation between the robot base and the world coordinate system by combining traditional calibration methods with robot quaternion equations. However, those methods still cannot avoid computing a large number of homogeneous matrices of the form AX=XB [6]. Besides, some of those methods need expensive auxiliary equipment for calibration, and the calibration process is usually cumbersome. To this end, researchers began to study how to simplify the calculations and avoid using expensive auxiliary equipment [10-15]. Self-calibration methods based on active vision can realize calibration without expensive auxiliary equipment, at low cost and with a simple calibration process, but their calibration accuracy is lower than that of traditional methods [16-20]. In a word, the conflict between computational complexity and calibration precision has not yet been well resolved.

When a robot drilling system is working, the drill in the end effector must be perpendicular to the surface of the work piece and the distance between the drill bit and the surface of the work piece is constant, so the depth information of vision is not necessary for the robot drilling system. Furthermore, the depth information of monocular vision is commonly difficult and time-consuming to compute. To meet the requirements of the robot drilling system, a highly efficient and accurate calibration method is important. Therefore, according to the characteristics of the industrial robot, this paper presents a calibration method which omits the depth information of hand-eye vision, so that the hand-eye relation can be simplified to a two-dimensional relationship. As an intermediate coordinate system, a scene coordinate system is defined to set up the relationship between the camera imaging coordinate system and the robot end effector, namely the hand-eye relationship. Most calibration methods calculate the intrinsic and extrinsic parameters of a camera simultaneously, which requires complex matrix transformations among multiple coordinate systems. By contrast, this paper obtains only the extrinsic parameters of the camera, so the calculation is greatly simplified.

This paper is organized as follows. The calibration method is presented in Section 2 and experimental results are given in Section 3. Section 4 analyzes the primary positioning errors, and Section 5 concludes.

2 The calibration and positioning method for the hand-eye relationship

Traditional calibration methods use coordinate transformation relationships to resolve the intrinsic and extrinsic parameters of a camera, which are then taken into the coordinate transformation again when locating the position of a robot's end effector [21, 22].
The complex and massive matrix computation of those methods leads to rounding error and low positioning accuracy. To avoid these shortcomings, an approach unifying the hand-eye relationship calibration and the robot end effector location is proposed, which realizes the positioning of a robot end effector with an indirect hand-eye relationship rather than a direct hand-eye coordinate transformation, so that the calibration is greatly simplified.

2.1 Camera imaging model

The complexity of an optical camera imaging model is one of the key aspects that affect the complexity of a calibration method. The pinhole imaging model is widely used and its geometrical relationship is linear [23]. Due to lens production techniques, the actual image is affected by various non-ideal factors such as lens distortion. However, for a real robot drilling system the vision scope is small (30 × 40 mm) and the distance between the camera imaging plane and the work piece plane is short (less than 200 mm); together with the well-controlled distortion of an industrial lens, the positioning accuracy is therefore not much affected by assuming the ideal pinhole imaging model. Furthermore, the aircraft board to be drilled is a large plane, and when the camera optical axis is perpendicular to it the change of the plane depth is very small compared with the photographing distance, so a fixed depth value can be used and the perspective model is simplified to the weak perspective model [11].

2.2 Determining the hand-eye relationship

Figure 1 shows the coordinate systems of a robot drilling system. The work piece coordinate system is defined as Ow-XwYwZw, the end effector (tool) coordinate system as OE-XEYEZE, the camera imaging coordinate system as OC-XCYCZC, and the base coordinate system of the robot as OR-XRYRZR. By calibration, the relationship between the tool coordinate system (i.e., the robot end effector coordinate system) and the work piece coordinate system, and the relationship between the camera imaging coordinate system and the work piece coordinate system, can both be obtained; the relationship between the tool coordinate system and the camera imaging coordinate system, i.e., the hand-eye relationship, is then obtained indirectly.

First, the position and pose of the tool coordinate system in the robot base coordinate system need to be attained. Commercial industrial robots usually adopt the "4-point" method to calibrate the tool coordinate system, which calibrates the drill tip of the end effector as the TCP (tool center point), the origin of the tool coordinate system [24].

Then, the relationship between the camera imaging coordinate system and the work piece coordinate system is obtained by using a calibration template. The top-left vertex of the image is chosen as the origin of the camera imaging coordinate system. Here only the extrinsic parameters of the camera, i.e., the hand-eye relationship, need to be calculated, so the pixel coordinate system does not need to be established, and the dimension of the pixel unit and the angle between the pixel unit and the pixel coordinate system can be ignored. Equation 1 relates a point in the work piece coordinate system to the corresponding point in the imaging coordinate system:

[xc, yc, zc, 1]^T = [[R, P], [0, 1]] · [xw, yw, zw, 1]^T    (1)

where (xw, yw, zw) are the coordinates of a point in the work piece coordinate system, (xc, yc, zc) are the coordinates of the corresponding point in the image coordinate system, and R and P are the rotation matrix and translation vector between the work piece coordinate system and the imaging coordinate system, respectively.

According to the drilling techniques, the drill tip should be perpendicular to the work piece plane, so positioning is needed in only two dimensions.
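Eq. 1 is a standard homogeneous rigid-body transform. As a minimal sketch (the rotation angle and translation below are illustrative values, not calibrated ones), it can be written as:

```python
import numpy as np

# Sketch of Eq. 1: mapping a point from the work piece frame to the camera
# imaging frame with a rotation R and translation P in homogeneous form.
def work_to_camera(p_w, R, P):
    """Transform a 3-D point from work piece to camera coordinates."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = P
    return (T @ np.append(p_w, 1.0))[:3]

# Under the weak-perspective assumption the plane depth is fixed and the
# optical axis is perpendicular to the board, so R reduces to a rotation
# about the optical (Z) axis.
phi = np.deg2rad(30.0)                      # example in-plane rotation
R = np.array([[np.cos(phi), -np.sin(phi), 0.0],
              [np.sin(phi),  np.cos(phi), 0.0],
              [0.0,          0.0,         1.0]])
P = np.array([10.0, -5.0, 186.0])           # example translation (mm)

p_cam = work_to_camera(np.array([100.0, 50.0, 0.0]), R, P)
```

Because the board is flat and the photographing distance is fixed, the third component stays constant and only the in-plane part of the transform matters, which is what motivates the two-dimensional treatment below.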
On the basis of the weak perspective model, if the optical axis of the camera is perpendicular to the work piece plane, the rotation matrix can be simplified; accordingly, Eq. 1 reduces to a two-dimensional relation, which simplifies the calibration.

Figure 2 is the schematic diagram of the calibration method for the hand-eye relationship on a two-dimensional plane. As shown in Fig. 2, point p is the TCP at the end effector and also the origin of the tool coordinate system. X'w and Y'w are two axes parallel with the X- and Y-axes of the work piece coordinate system, respectively. According to the pinhole imaging principle and the weak perspective model, the dimensions of the photographed region are fixed when the photographing distance of the camera is fixed. Therefore, we can establish a scene coordinate system Xs-Os-Ys whose origin resides on the work piece and whose two axes are parallel with those of the camera imaging coordinate system. In Fig. 2, P1s and P2s are two points of the scene coordinate system located on the photographed plane, with coordinates (x1s, y1s) and (x2s, y2s), respectively.

Fig. 1 Coordinate systems of a robot drilling system

Corresponding to P1s and P2s, P1C and P2C are the two points in the camera imaging coordinate system, with coordinates (x1C, y1C) and (x2C, y2C), respectively. In the camera imaging coordinate system, the angle between the line p1C p2C and the Xc axis is γ = tan⁻¹[(y1C - y2C)/(x1C - x2C)]. Because of the linear and parallel relationship between the imaging coordinate system and the scene coordinate system, the angle between the line p1S p2S and the Xs axis is also γ. Points P1s and P2s can be touched by the TCP through robot teaching; the distance between P1s and P2s and the angle θ between the line p1S p2S and the X'W axis can then both be obtained from the coordinate transformation and calculation of the industrial robot, so the rotation angle between the scene coordinate system and the work piece coordinate system is φ = θ - γ.
Simultaneously, the proportion coefficient at this photographing distance is k = ||p1C p2C|| / ||p1S p2S||, where ||p1C p2C|| is the pixel distance between the two points in the imaging coordinate system and ||p1S p2S|| is the distance between the two points in the scene coordinate system. Upon that, the coordinates of P1s and P2s in the scene coordinate system can be calculated with [xS; yS] = k·[xC; yC]. When photographing and point touching with the TCP, the position of the TCP can be obtained from the industrial robot system, so at shooting time the distance ||p p1S|| between the TCP and the touched point, as well as the angle δ between the line p p1S and the X'W axis, can be obtained. Because the angle between the X-axis of the scene coordinate system and that of the work piece coordinate system has been calculated, the angle between the line p p1S and the X-axis of the scene coordinate system can also be calculated as ω = δ - φ. Then D = ||p p1S||·cos ω and H = ||p p1S||·sin ω are obtained by decomposing ||p p1S|| along the two axes of the scene coordinate system.

After the coordinates (x1s, y1s) of P1s in the scene coordinate system are obtained, the position of the TCP (i.e., the coordinates of point p shown in Fig. 2) in the scene coordinate system can be calculated:

[xp; yp] = [d; h] = [x1S - D; H + y1S] = [k·x1C - ||p p1S||·cos(δ + γ - θ); ||p p1S||·sin(δ + γ - θ) + k·y1C]    (2)

Because the linear proportional parameter k has been obtained above, the relationship between the TCP and the camera imaging coordinate system is obtained indirectly through the scene coordinate system; in other words, the relative hand-eye relationship is obtained.

2.3 Positioning of the robot end effector

Different from general calibration methods that obtain the absolute hand-eye relationship, the calibration method proposed in this paper obtains the relative relationship between the TCP and the camera.

Fig. 2 The hand-eye relationship on a plane

In order to guarantee precise positioning, the distance and relative pose between the camera and the photographed plane (including the calibration board and the work piece) should be the same in both positioning and calibration. When drilling, the end effector should be perpendicular to the work piece plane, so if the camera is perpendicular to the calibration board plane when calibrating, the pose between the camera and the photographed plane can be the same in both calibration and positioning. The condition that the end effector is perpendicular to the work piece plane can be ensured according to the pose of the work piece obtained through the "3-point" calibration method of the industrial robot system. Proper assembly can ensure that the pose of the camera coincides with that of the end effector. If the conditions on pose and photographing distance are satisfied, the end effector can realize positioning with the calibrated data.

The positioning method is shown in Fig. 3. Provided the pose and photographing distance are kept the same as when calibrating, the proportion coefficient k, the relative pose between the work piece coordinate system and the scene coordinate system, and the position coordinates of the TCP in the scene coordinate system all remain fixed. In Fig. 3, X'W and Y'W are the two axes parallel with those of the work piece coordinate system.

When the camera photographs point A, the coordinates (xAC, yAC) of point A' (corresponding to A) in the camera imaging coordinate system can be obtained; the coordinates of point A in the scene coordinate system are then obtained by (xAS, yAS) = k·(xAC, yAC). Suppose point P is the projection of the TCP onto the scene coordinate system, whose coordinates in the scene coordinate system are (d, h) (see Eq. 2), as shown in Fig. 3. Then the projections of the line connecting point A and the TCP on the two axes of the scene coordinate system are H = xAS - d and D = h - yAS. According to the geometric relationship, the distance from point A to the TCP and the angle between the X-axis of the scene coordinate system and the line from A to the TCP can both be attained: ||PA|| = √(H² + D²), ω = tan⁻¹(H/D). The angle between the X'W axis of the work piece coordinate system and the line from A to the TCP is then δ = ω + φ, and the offsets along the two axes can be calculated by decomposing ||PA||: ||PA||·cos δ and ||PA||·sin δ. The coordinates of point A in the work piece coordinate system are the sum of the current coordinates of the TCP and the offsets. Any point on the work piece plane can thus be located by this method, and it can guide the drilling operation of the robot end effector.

3 Experiments and results

3.1 Experiments in the laboratory

The robot used in the laboratory hand-eye vision calibration and positioning experiments is an ABB IRB1410 6-DOF industrial robot, whose payload is 5 kg and repeatable positioning accuracy is 0.05 mm. The camera system used in the experiments is composed of a two-megapixel industrial camera from Point Grey and a lens with 25 mm focal length and 0.01% distortion from Myutron.

Fig. 3 Schematic diagram of positioning

The calibration template and the reference points for positioning were designed with CAD software and printed on professional paper by a high-precision (1,200 dpi) laser printer. As shown in Fig. 4, black points with a diameter of 5 mm, with two cross lines of 0.1 mm width at each point center, are used for accurate positioning. A steel stick with a sharp tip is used as the calibrating bar and fixed to the end of the robot, as shown in Fig. 5. The sharp tip of the calibrating bar can be calibrated as the TCP of the tool coordinate system.
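Before the experimental results, the calibration (Section 2.2, Eq. 2) and positioning (Section 2.3) steps can be sketched in code. This is a sign-consistent vector restatement under an assumed right-handed scene frame (the paper fixes the signs of D and H via the configuration of Fig. 2), and every numeric input in the usage below is a made-up stand-in for a quantity obtained by robot teaching or image processing:

```python
import numpy as np

def rot(a):
    """2-D rotation matrix for angle a (rad)."""
    return np.array([[np.cos(a), -np.sin(a)],
                     [np.sin(a),  np.cos(a)]])

def calibrate(p1_img, p2_img, p1p2_mm, theta, pp1_mm, delta):
    """Return (scale in mm/px, rotation phi, TCP position in the scene frame).

    p1_img, p2_img : pixel coordinates of the two touched points P1, P2
    p1p2_mm        : |P1 P2| measured by the robot (mm)
    theta          : work-frame angle of the directed line P2 -> P1 (rad)
    pp1_mm, delta  : |P1 p| (mm) and work-frame angle of the vector P1 -> p
    """
    dx, dy = np.subtract(p1_img, p2_img)
    gamma = np.arctan2(dy, dx)           # angle of P2 -> P1 in the image frame
    phi = theta - gamma                  # scene-to-work rotation (phi = theta - gamma)
    scale = p1p2_mm / np.hypot(dx, dy)   # mm per pixel at this photographing distance
    omega = delta - phi                  # scene-frame angle of P1 -> p (omega = delta - phi)
    p1_scene = scale * np.asarray(p1_img, float)
    tcp_scene = p1_scene + pp1_mm * np.array([np.cos(omega), np.sin(omega)])
    return scale, phi, tcp_scene

def locate(a_img, scale, phi, tcp_scene, tcp_work):
    """Work piece coordinates of a target point photographed at pixel a_img."""
    a_scene = scale * np.asarray(a_img, float)
    offset_work = rot(phi) @ (a_scene - tcp_scene)  # rotate scene offset into work frame
    return np.asarray(tcp_work, float) + offset_work
```

Here the scale is expressed in millimetres per pixel, i.e., the reciprocal of the magnification k quoted in Section 4, which keeps the conversion from image to scene coordinates dimensionally explicit.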
In different tool coordinate systems based on different poses of planes, the comparison between the positioning results and the real coordinates is shown in Table 1. The deviations between the real coordinates and the positioning coordinates along the X- and Y-axes are defined as Δx and Δy, respectively. The positioning error is calculated as E = √(Δx² + Δy²). It can be concluded from Table 1 that the positioning error of the calibration method is less than 0.3 mm, which is sufficient for the experiments of the robot drilling system in the hole processing of the aircraft board.

3.2 Experiments for the robot drilling system

Because the calibration method in this paper is intended for practical application, the vision system used to examine and verify the calibration algorithm was fixed to the robot end effector and many experiments were carried out. The robot drilling system uses an ABB IRB6640 industrial 6-DOF robot with a payload of 235 kg and repeatable accuracy of 0.075 mm. The master controller of the entire robot drilling system is based on an industrial control computer, and the structure of the system is shown in Fig. 6. The work piece for the drilling experiment is a 3-mm-thick titanium board. In order to test the positioning accuracy, holes were first drilled according to predetermined positions. That is to say, the initial position on the titanium board is a reference hole drilled manually, which is combined with the CAD model to guide the positioning of the holes to be drilled. The positions of those manufactured holes were then measured by a high-accuracy laser tracker from Leica (LTD600, 10 μm/m) so that the real coordinates of those holes were obtained. Last, by comparing the positioning results from vision with the measured positions from the laser tracker, the positioning error can be obtained.
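The error metric E used in Tables 1 and 2 is a plain Euclidean norm of the per-axis deviations; for example, for the first two rows of Table 1:

```python
import math

# E = sqrt(dx^2 + dy^2): the positioning error combining the per-axis
# deviations between real and vision-positioned coordinates (mm).
def positioning_error(real, measured):
    dx, dy = (abs(r - m) for r, m in zip(real, measured))
    return math.hypot(dx, dy)

e1 = positioning_error((1168.24, 5.19), (1168.25, 5.16))    # row 1, ~0.03 mm
e2 = positioning_error((1176.87, 81.80), (1176.99, 81.77))  # row 2, ~0.12 mm
```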
Figure 7 shows the robot drilling system used in the experiments.

The positioning results are shown in Table 2, where the deviations between the real coordinates and the positioning coordinates along the X- and Y-axes are defined as Δx and Δy, respectively, and the positioning error is calculated as E = √(Δx² + Δy²). From the results of Table 2, we can see that the positioning accuracy, which includes the positioning errors of the robot and the end effector, is less than 0.4 mm. The required positioning accuracy of the robot drilling system is 0.5 mm, so the hand-eye vision system can be used for aircraft board drilling. In the experiments it takes less than 200 ms to finish the vision processing, including data transmission, and because the positioning algorithm consists only of basic mathematical computations, the positioning operation is fast enough to ensure the high efficiency of the robot drilling system.

4 Analysis of positioning error

Many factors affect the positioning accuracy of the robot drilling system. Besides the positioning accuracy of the robot and the end effector, man-made calibration error can also have an important effect on the final positioning accuracy. According to the experimental results, we find that the main source of positioning error is the non-perpendicularity of the camera optical axis to the work piece surface. In the experiments, we deliberately made the work piece not perpendicular to the camera. In this case, calibration gives the magnification k = 37.7, and the depth between the camera and the work piece surface is z = 186 mm.

Fig. 4 The vision calibration modeling template
Fig. 5 Robotic vision positioning experiment system
The coordinates of the TCP in the scene coordinate system are (8.61, 282.46), and the separation angle between the camera imaging coordinate system and the work piece coordinate system is φ = 269.47°. With these parameters, we conducted positioning experiments on the same target point while changing the camera's position; the results are shown in Table 3. In Table 3, the image coordinates are obtained by moving the robot end effector and photographing the same target point; Δx and Δy are the deviations along the two axes, respectively. First, by moving the camera, the target point is relatively moved approximately 8 mm along the X-axis and then 5 mm along the Y-axis in the scene coordinate system. As shown in Table 3, the image coordinates of the target point change as the robot end effector moves; Δx changes by about 0.6 mm and Δy by about 0.35 mm each time. Although the angle between the camera optical axis and the work piece surface is not exactly known, the positioning error it causes is remarkable.

Fig. 6 Structure of the robot drilling system

Table 1 Experimental results of positioning

No.  Real coordinates      Positioning coordinates  Δx, mm  Δy, mm  Error, mm
1    (1,168.24, 5.19)      (1,168.25, 5.16)         0.01    0.03    0.03
2    (1,176.87, 81.80)     (1,176.99, 81.77)        0.12    0.03    0.12
3    (1,132.69, 35.58)     (1,132.87, 35.44)        0.18    0.14    0.23
4    (1,086.87, 8.79)      (1,087.01, 8.68)         0.14    0.11    0.18
5    (1,135.85, 66.08)     (1,135.98, 66.00)        0.13    0.08    0.15
6    (1,117.18, 40.99)     (1,117.04, 40.90)        0.14    0.09    0.17
7    (1,095.93, 36.56)     (1,095.81, 36.38)        0.12    0.18    0.22
8    (1,126.31, 28.24)     (1,126.41, 28.11)        0.10    0.13    0.16

In order to make better use of this calibration method and decrease the positioning error, the influence on positioning error of the angle between the camera optical axis and the normal of the work piece surface is analyzed.
Because the actual surface of the work piece is not an ideal plane, the plane of the work piece coordinate system determined by the "3-point" method is not parallel with the actual surface of the work piece. In this case, the robot end effector and the camera optical axis are hardly perpendicular to the surface of the work piece, so positioning error occurs. If the camera optical axis is not perpendicular to the photographed plane, the coordinates of the target point in the camera imaging coordinate system will differ from those obtained in the ideal condition. For a better analysis, we refer to the basic principle of non-perpendicularity studied in [25]; the model is shown in Fig. 8.

In Fig. 8, O is the optical center, OA is the camera optical axis, AC is the ideal plane to be photographed, and EF is the imaging plane. Ideally the photographed plane is parallel with the imaging plane, that is, AC ∥ EF. In fact, the actual plane is not perpendicular to the camera optical axis OA: the actual plane AC′ shown in Fig. 8 is rotated by an angle θ from the ideal plane. The length of the object is not changed, so AC′ = AC. As shown in Fig. 8, because the camera optical axis is not perpendicular to the photographed plane, there is a deviation of the target's position in the imaging plane, FF′ = EF′ - EF. Mark OE = f, OA = z, AC′ = AC = s; then EF′ = (f/z)·AC, EF = (f/z)·AD, and FF′ = (f/z)·(AC - AD) = (f/z)·DC. Because AB = AC′·cos θ and triangle BDC′ is similar to triangle ADO, DB = (AD/OA)·AC′·sin θ, so we obtain

AD = AB - DB = AC′·cos θ - (AD/OA)·AC′·sin θ = s·cos θ - (AD/z)·s·sin θ    (3)

Solving Eq. 3 gives AD = z·s·cos θ / (z + s·sin θ), so DC = AC - AD = s·[1 - z·cos θ / (z + s·sin θ)]; at last we get

FF′ = (f·s/z)·[1 - z·cos θ / (z + s·sin θ)]    (4)

Fig. 7 The robot drilling system in the experiment

Table 2 Positioning results

No.  Real coordinates          Positioning coordinates   Δx, mm  Δy, mm  Error, mm
1    (330.821, 1,531.757)      (331.144, 1,531.702)      0.323   0.055   0.33
2    (330.799, 1,663.674)      (330.953, 1,663.639)      0.154   0.035   0.16
3    (311.658, 231.142)        (311.42, 231.225)         0.238   0.083   0.25
4    (331.377, 252.995)        (331.635, 253.026)        0.258   0.031   0.26
5    (1,693.837, 1,655.602)    (1,693.933, 1,655.627)    0.096   0.025   0.1
6    (1,703.69, 1,717.52)      (1,703.929, 1,717.833)    0.239   0.313   0.39
7    (1,724.423, 1,903.001)    (1,724.438, 1,903.483)    0.225   0.265   0.35
8    (1,777.525, 2,035.865)    (1,777.3, 2,036.13)       0.334   0.129   0.36
9    (2,102.538, 2,118.027)    (2,102.872, 2,118.156)    0.223   0.254   0.34
10   (2,128.753, 1,549.558)    (2,128.976, 1,549.812)    0.098   0.167   0.19

Table 3 Positioning error values

No.  Image coordinates     Δx, mm  Δy, mm
1    (162.36, 102.54)      1.28    0.83
2    (460.11, 111.34)      0.66    0.86
3    (736.58, 107.85)      0.06    0.84
4    (1,040.76, 106.15)    0.68    0.88
5    (1,042.55, 309.03)    0.71    0.65
6    (1,045.13, 497.52)    0.73    0.28
7    (1,041.62, 696.89)    0.69    0.03
8    (1,041.37, 992.68)    0.68    0.41

Nowadays, the absolute error of an industrial robot can reach 1 mm [26]. The deviation angle between the calibrated plane and the actual plane can then be estimated as α = sin⁻¹(e/d) ≈ e/d, where e and d represent the error of work piece calibration and the distance spanned by the calibrated region, respectively. So if the region to be calibrated is within 0.5 m when using the "3-point" calibration method, the calibrated plane of the work piece will deviate only about 0.12° from the actual plane. Furthermore, the field of view is limited (30 × 40 mm) and the distance s between the target point and the optical axis is much less than the photographing distance. Thus Eq. 4 can be simplified to:

FF′ = (f·s/z)·(1 - cos θ)    (5)

From Eq. 5, the deviation in the imaging coordinate system is approximately linear in cos θ and in s, the offset distance between the target point and the camera optical axis [25].
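Eqs. 4 and 5 can be evaluated side by side; the sketch below uses illustrative magnitudes of f, z, and s in the range discussed in the text, not measured data:

```python
import math

# Imaging-plane deviation caused by a tilt angle theta between the camera
# optical axis and the work piece normal.
def deviation_exact(f, z, s, theta):
    """Eq. 4: FF' = (f*s/z) * (1 - z*cos(theta) / (z + s*sin(theta)))."""
    return f * s / z * (1.0 - z * math.cos(theta) / (z + s * math.sin(theta)))

def deviation_approx(f, z, s, theta):
    """Eq. 5: for a small field of view, FF' ~= (f*s/z) * (1 - cos(theta))."""
    return f * s / z * (1.0 - math.cos(theta))

f, z = 25.0, 150.0                      # focal length and depth (mm)
d5 = deviation_exact(f, z, 5.0, math.radians(0.2))
```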
This deviation and the linear relationship in the imaging coordinate system can be introduced into the positioning algorithm, and the influence on the positioning error can then be analyzed. Although the positioning error can be divided into components along the X- and Y-axes, the influence of the non-perpendicularity of the camera optical axis to the work piece plane is the same along both axes, so analyzing the positioning error along either axis shows the relationship between the non-perpendicularity and the positioning error.

As Fig. 9 shows, point p is the TCP, the actual position of the target point is point A, and the position with positioning error is point B. A′B′ is the deviation in the camera imaging coordinate system, corresponding to AB in the scene coordinate system. From Eq. 5 we get AB = k·A′B′ = k·(f·s/z)·(1 - cos θ), where f, s, z, and θ are the same as above. The distance between point B and point p can then be calculated:

||pB|| = √{H² + [D + k·(f·s/z)·(1 - cos θ)]²}    (6)

Fig. 8 Model of imaging error
Fig. 9 Schematic diagram for analysis of positioning error

Choosing suitable parameters D = 100 mm, H = 200 mm, k = 30, f = 25 mm, z = 150 mm, we can calculate from Eq. 6 the positioning errors caused by the offset distance between the target point and the camera optical axis. Figure 10 shows how the positioning error changes with the offset distance when θ = 0.1° and θ = 0.2°, respectively, and Fig. 11 shows the positioning error versus the angle θ when s = 5 mm and 10 mm, respectively. From the experiments and the above analysis, we can see that the positioning error changes with the angle θ and the offset distance s.
Thus, the angle between the camera optical axis and the normal of the plane is the most important factor influencing the positioning error. Therefore, in order to improve positioning accuracy, we can proceed from two aspects: on one hand, using a fine-tuning device, the camera optical axis can be adjusted to be as perpendicular to the work piece plane as possible; on the other hand, the target point can be placed at the center of the image when photographing, so that the offset distance between the camera optical axis and the target point is small and the positioning error is reduced. Furthermore, through many experiments a linear relationship can be found between the positioning error and the offset distance, according to which the positioning accuracy can be improved by introducing error compensation.

5 Conclusion

Addressing the application of robot drilling systems in aircraft manufacturing, this paper presents a calibration and positioning method of robot hand-eye vision for drilling flat aircraft boards. The method neglects the depth information and obtains the relative hand-eye relationship indirectly by utilizing the relationship between the camera imaging coordinate system and the scene coordinate system; it thereby avoids complex matrix computation and can be executed with less computational effort and higher efficiency. Experiments demonstrate that the method is simple and practical and can achieve high positioning accuracy, within 0.4 mm (including the positioning errors of the robot and the drilling end effector), without expensive auxiliary calibration devices. Thus, this method is applicable to robot drilling systems in aircraft manufacturing. Finally, in order to achieve higher positioning accuracy, this paper analyzed the main factors that affect positioning error according to the experimental results and proposed corresponding solutions.

Acknowledgments  This work was supported b