Manipulator robot foreign-language translation: Intuitive Control of Robotic Manipulators [Chinese translation, 3,910 characters] [PDF + Chinese Word]
Intuitive control of robotic manipulators

David Rusbarsky (a), Jeremy Gray (b), Douglas Peters (a)
(a) RE2, Inc., 4925 Harrison St, Pittsburgh, PA USA
(b) U.S. Army TARDEC, 6501 E. 11 Mile Rd., Warren, MI USA

Published in Unmanned Systems Technology XIII, edited by Douglas W. Gage, Charles M. Shoemaker, Robert E. Karlsen, Grant R. Gerhart, Proc. of SPIE Vol. 8045, 80450C (2011 SPIE). CCC code: 0277-786X/11/$18. doi: 10.1117/12.886223

ABSTRACT
Under a research effort sponsored by the U.S. Army Tank Automotive Research, Development, and Engineering Center (TARDEC), we are exploring technologies that can be used to provide an operator with the ability to more intuitively control high-degree-of-freedom arms while providing the operator with haptic feedback to more effectively interact with the environment. This paper highlights the results of the research as well as early test results on a number of prototype systems currently in development. We will demonstrate advantages and disadvantages of some of the leading approaches to intuitive control and haptic feedback.

KEY WORDS
Robotic Manipulation, Haptic Feedback, Human System Integration, Computer Vision, GUI

1. INTRODUCTION
In the field of unmanned ground vehicles with dexterous manipulators, current control systems require a high cognitive load and training to properly position the manipulator and have it effectively interact with its environment. As robotic manipulators grow more capable through additional degrees of freedom, and as Explosive Ordnance Disposal (EOD) robots are developed that take advantage of multiple manipulators on the same platform, the demand for more intuitive control and enhanced situational awareness will also increase.
The Modular Intelligent Manipulation system with Intuitive Controls project seeks to research, design, and develop technologies that will allow a user to intuitively control multiple-degree-of-freedom robotic arms and maintain better awareness of the operating environment through haptic feedback. In addition to reporting resistance, haptic feedback can help make operators feel like they are actually there with the robot. Coupled with intuitive controls and advanced video feedback, the system will provide users with the sensation that robots are an extension of their bodies.
There were three main technical objectives for the Phase I project:
1. Determine the feasibility of using various control input devices with integrated feedback to more intuitively and effectively control robotic arms.
2. Characterize the control fidelity of commonly fielded platforms and investigate the practicality of countering coarse-control manipulation via dynamic modeling techniques.
3. Demonstrate the practicality of using a dexterous end-effector with embedded force feedback sensing, improved visual feedback, multiple fingers, and wrist compliance on a representative robotic arm for the purpose of performing complex maneuvers such as cutting wires.
This paper will concentrate on describing our findings for the first technical objective. In addition to these technical objectives, we conducted an in-depth market survey with the Allegheny County Bomb Squad to better understand the operators' needs and the situational constraints for using robotic manipulators. We also asked bomb squad technicians about the tricks and techniques that they employ to be more effective in coordinating the control of robotic manipulators.

1.1. Subject Matter Expert Interviews
Central to the system design and engineering philosophy is ensuring that the time is taken to engage with subject matter experts to fully understand the nature of the problem, current techniques, and the challenges that operators face.
During the Phase I effort, project team members composed a list of questions regarding complex manipulation tasks. We discussed these questions with members of the Allegheny County Bomb Squad. Officers of the Allegheny County Bomb Squad are quite progressive and very eager to try and test new technologies, and will continue to be a valuable resource during future phases of this program. With the input from these expert end-users, we were able to identify specific problems as well as factors to consider in developing a solution.

1.2. Problem
Current Operator Control Units (OCUs) often utilize small hand-held controllers such as a game controller or similar devices. This trend has evolved from previous controls, which were dial based and could only control one of the robot's joints at a time. While this evolutionary jump has helped to solve the burdens of slow and often frustrating controls, the hand-held controllers have introduced a new problem. Since the hand-held controllers control only the end of the arm (also known as "flying the end effector"), there is no way to control the elbows or the rest of the arm. If the arm must maneuver through small openings or weave through different obstacles, the operator can encounter problems, since moving the tip of the arm may cause the robot's elbows to bump into obstacles. To solve this issue, additional controls must be incorporated to control the rest of the arm. These controls should not add to the burden of the operator, but should dramatically improve on the control systems that are currently in the field.
The current feedback solutions usually include visual feedback in the form of one or more cameras placed at different locations throughout the robot in order to give operators the view that is necessary to complete their missions. While providing critical feedback on operations, visual feedback can have several shortcomings: the distance to the target is often hard to judge with a single camera, it can be very difficult to judge how tightly the gripper is gripping an object, and cameras might not always be positioned the way the operator needs them to be positioned. The data provided by the visual feedback systems are often crucial, so additional sensors are needed to help make up for these shortcomings. Through the use of other sensors such as laser range finders, pressure sensors, and various other sensors, it is possible to detect the distance to a target, how tightly the robot is gripping an object, and when the robot has bumped its elbow on something. Further, all this data can be transmitted, analyzed, and fed back to the operator in a fast and intuitive manner.

1.3. What to look for in a solution
The features of a control system help define what is and is not possible with the robot. To control the next generation of robotic manipulators with dexterous behaviors and more degrees of freedom, a new generation of controls is needed in order to overcome the shortcomings of the current controllers. The next generation of controllers should have the following features:
- The system should be highly intuitive. Operators should be able to be rapidly trained. How users control the system and how information is fed back to the operator should be as natural as possible to minimize or possibly eliminate the training required to use the system. This will help to increase the number of qualified operators, reduce the cost of training the operators, and reduce the risk of mistakes due to human error while operating the robot.
- The operator should have full control over all the degrees of freedom that the robot has to offer. This control should not be a burden on the operator; instead, it should be intuitive and feel natural for the operator to control each joint.
- Proper, easy to understand feedback should be given to the operator. Proper feedback refers to the type of feedback, the intensity of the feedback, and the appropriateness of the feedback. For example, if the robot is digging in sand looking for landmines, the operator will probably not want to feel the sand; however, contact with solid objects will be important to detect. The operator would also want to be able to tell the difference between objects such as a landmine versus a formed plastic box. The location of the feedback is important so the operator knows the difference between the robot hitting a solid object in the sand and the robot bumping its elbow on a rock.
- The system should be usable in most environmental conditions, some of which include extreme heat/cold, rain, and darkness.
- The system should allow for both fine and broad movements with little to no lag or delay in the system. Lag in the system could confuse the operator or cause them to react incorrectly to what the arm is doing.
- Cameras should be placed in several locations to allow the operator to choose which view best suits the situation.

1.4. Example situations and what aspects of controls and feedback are important:
- Reaching under a bench to pick up an object / reaching up to pick up something on a bench
  o Elbow position is important
  o Angle of hand is important
  o Camera location is important
- Unscrewing a blast cap
  o Precision
  o Two-arm system might be required
- Using common tools (screwdriver, hammer, drill)
  o Dexterity
  o Precision

2. INDIVIDUAL DEVICES
Research was performed to see what potential devices are currently available to help control robotic arms, or to otherwise provide a piece of a final system that would deliver all the additional features that the next generation of controllers will demand.

2.1. Device categories
Game Controllers
In general, game controllers tend to be low cost and well understood by the majority of the robot-using population. They can be common game controllers that are primarily used for console games, or they can be devices that look similar to console game controllers but have been ruggedized to be useful in outdoor and extreme conditions.
Puppet Arms
Puppet arms provide an interesting solution since haptic feedback can be built into the arm, and the arm can be created with the same degrees of freedom as the manipulator to accurately represent the robot arm's position and orientation. A puppet arm includes a base, arm segments that are connected to each other with sensors (and possibly motors) in the joints, and often buttons or some sort of interface at the end of the arm to allow for additional input. This system may or may not look like a smaller version of the robot that the user is controlling.
Computer Vision
A very interesting alternative is the use of computer vision to evaluate the position of a user's hand and arm and have the robotic arm mimic the user's movements. A challenge to this approach is providing haptic feedback to the user, so this approach would need to be combined with another solution.
Gloves
Gloves allow accurate and detailed manipulation of complex end-effectors. Haptic feedback devices can also be embedded in gloves to provide grasping feedback to the user.
Joysticks
Joysticks provide control of a manipulator and can also provide haptic feedback by altering the resistance of the joystick's movement.
3D Mice
3D mice record movement in three dimensions. These could be combined with other technologies, such as accelerometers, to also measure rotational movement.
Tracking
Motion tracking is often used for creating realistic animation movements for movies and video games. Motion tracking technology records head, hand, and body movements and translates them on-screen into virtual character movement. This technology provides an interesting angle for tracking arm and hand movements for the system. Again, we would need to supplement this technology with haptic feedback.
Miscellaneous
Sometimes in our study we discovered a technology that didn't fit neatly into one of the above categories but was interesting enough to include in the research. It is unlikely that these non-conventional technologies will provide a useful solution in the system.
Feedback Devices
Many of the technologies above provide solutions to the problem of intuitive movement but do not provide haptic feedback to the user. This category focused on technologies capable of providing the appropriate feedback to the user.
Each of the above-mentioned categories of controllers provides some level of benefit, but none adequately covers the range of system requirements mentioned above. Combining certain devices can allow them to work together and utilize each device's strengths.

2.2. Devices that did not make the cut
In this section, we discuss some individual devices that offer a unique or advanced approach to the problem, but for one reason or another are not usable for this project or are not developed enough to be viable solution components at the time of this writing. These devices are mentioned in the hope that further development will continue in these areas, as they have good potential.
Wii Remote
While this game controller offers fewer buttons than the Xbox controller, it provides some additional sensing capabilities, such as accelerometers, which can be utilized in a more intuitive manner to control the movements of the robotic arm, the gripper, or the steering of the base robot. Unfortunately, the Wiimote does not currently record yaw because of its dependence on the sensor bar.
3D Holograms with acoustic radiation
This prototype device created at the University of Tokyo offers 3D holographic images that the user can interact with through cameras that monitor the user's hands. Haptic feedback is given in the form of acoustic radiation. This unique system allows the user to see 3D objects that appear to be in the user's hand and also offers the sense of touch to further trick the human mind into thinking there is an object there that they can interact with.
Electromyographic (EMG) signals
Electromyography is a technique for evaluating and recording the electrical activity produced by skeletal muscles (Kamen, 2004). These signals can then be interpreted and fed into a robotic arm to control its movements.
Electrorheological fluid (ERF) feedback
Electro-rheological fluids (ERFs) are fluids that experience dramatic changes in rheological properties, such as viscosity, in the presence of an electric field (Pfeiffer, Mavroidis, Bar-Cohen, & Dolgin, 1999). This fluid can be utilized to provide haptic feedback to the user. Contained in a glove or sleeve, the fluid would be able to provide resistance to the user's joints when, for example, the operator tries to move the robotic arm but it is stuck or caught on an obstacle.

2.3. Data collected for input and haptic devices
We recorded the following information for each candidate technology to assist in the evaluation and further research:
- Technology name
- State of development
- Developer
- Degrees of freedom the technology can control
- Type of feedback provided to the user
- Advantages of this technology
- Disadvantages of the technology
- Ruggedness and reliability issues
- General notes
- Websites for additional information

2.4. Analysis of individual devices
A general analysis of individual devices was made based on the data collected in section 2.3. This analysis allowed us to determine which parts of the overall system each individual device left uncovered. This information then allowed us to start pairing components in an easy and logical manner.
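As a side illustration (not from the paper), the record below sketches one way the per-technology survey data listed in section 2.3 could be held in software so that candidate devices can be compared and paired; all field names and the example entry are our own assumptions.

```python
# Illustrative sketch only: a container for the per-technology survey data
# from section 2.3. The field names mirror that list; nothing here is from the paper.
from dataclasses import dataclass, field
from typing import List


@dataclass
class CandidateTechnology:
    name: str
    state_of_development: str           # e.g. "prototype" or "COTS"
    developer: str
    controllable_dof: int                # degrees of freedom the technology can control
    feedback_type: str                   # type of feedback provided to the user, or "none"
    advantages: List[str] = field(default_factory=list)
    disadvantages: List[str] = field(default_factory=list)
    ruggedness_notes: str = ""
    general_notes: str = ""
    websites: List[str] = field(default_factory=list)


# Hypothetical entry, purely for illustration of how records would be filled in.
example = CandidateTechnology(
    name="Example motion glove",
    state_of_development="prototype",
    developer="Hypothetical vendor",
    controllable_dof=5,
    feedback_type="vibration",
    disadvantages=["no positional data without an external tracker"],
)
print(example.name, example.controllable_dof)
```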
3. COMBINED COMPONENT OPTIONS
Not all the candidate technologies researched covered both control of high-DOF arms and haptic feedback. Based on the relative viability of each component researched, coupled with the relative need each component fulfills, we developed 10 possible system solutions that would provide the user with simple and powerful control along with meaningful haptic feedback. In addition to the 10 system options, we included an Xbox controller as a baseline case. We then used a panel of engineers to evaluate each candidate solution based on metrics developed for this program.

3.1. Baseline for comparison
To properly determine whether a device option is better than what is currently out in the field, we need to evaluate the best of what is currently fielded. The device chosen as a baseline is the Xbox game controller utilizing vibrating motors, more specifically a "rumble pack" that can be attached to the controller. The Xbox controller is used in the same manner as if the user were playing a video game. The operator holds the controller with both hands, pressing buttons with both thumbs and fingers. Operators also use the controller's joysticks via their thumbs.

3.2. Combining Components into System Solutions
With the analysis of each individual component, we were able to pair up components to make viable system solutions that would fit the criteria for an intuitive, easy to use control device with haptic feedback. While hundreds of thousands of combinations could be formed with all the devices that were researched, the following 10 solutions were picked as the most likely to achieve the desired results as well as to offer a wide range of solutions to compare against each other.

3.2.1. Single camera control utilizing piezoelectric feedback
Figure 1. Single camera with piezoelectric feedback
The single camera can be a small, low cost, COTS web camera that provides streaming video, or practically any other type of streaming video camera. The idea is for the camera to detect and track the operator's hand and arm through the use of software. Through this tracking, translations can be calculated to allow full control of a robotic hand and arm. A piezoelectric feedback device is usually thin and flat and can be obtained in a variety of sizes and shapes. When an electric current is applied to the device, it is capable of bending in a curling or expanding fashion. This change in shape can be felt by the operator. Piezoelectric devices can be swapped with, or used in conjunction with, vibrating motors.
The camera could be placed in several different locations, depending on the angle from which you wish to track the operator's hand and arm. Some possible locations include:
1) Setting the camera in front of the user, possibly mounted on the OCU, with the camera aimed at the operator. This configuration is easy to set up and tear down. One downside to this configuration is that it can be difficult to detect range from the camera, which could lead to less accurate control in that axis.
2) Setting the camera in front of the user, looking down at the operator's hand and arm. This would require more hardware to suspend a camera above the operator, but it would provide a better view than the first option and make depth perception easier.
3) Having the camera sitting on or looking over the operator's shoulder. This would help to make the system more portable, as the OCU could be routed to an eye display and enable the operator to be mobile. An over-the-shoulder view could potentially introduce occlusion issues as the elbow or other parts of the arm get in the way of the camera viewing the operator's hand.
The piezoelectric devices could be placed on the user's arm and hand at each of the joints (elbow, wrist, etc.). When the user attempts to bend his arm, the device would either allow it to bend or would start to apply resistance by stiffening up after a certain resistance is detected by the robotic arm. This stiffening would hinder or stop the bending of the operator's arm. Devices could also be placed in non-joint locations to help the user feel whether other parts of the robot are coming in contact with obstacles, walls, or anything else, provided the robot has the proper sensors to provide this feedback. The piezoelectric devices could be attached to the user via straps, sewn into existing clothing, or built into a sleeve/glove that the user could put on or remove easily.
A single camera system is a small, low cost, COTS solution. Software modifications could be easily applied as different algorithms and techniques are discovered and implemented. Piezoelectric devices offer a slightly different sense of feedback than, for example, a vibrating motor. By being able to control which way the piezoelectric device bends, the user can get a better sense of what information the robotic arm is trying to convey back to its operator. Placing these devices at different locations on the operator's arm will also help to make the feedback more intuitive by providing feedback on the operator's arm and hand in the same location where the force is being applied to the robotic arm.
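As an illustration of the joint-located feedback idea in this option, the sketch below (ours, with assumed joint names, thresholds, and drive interface) maps per-joint resistance reported by the robot into drive levels for piezoelectric actuators worn at the matching joints of the operator's arm.

```python
# Illustrative sketch only: maps per-joint resistance reported by the robot
# to drive levels for piezoelectric actuators worn at the matching joints.
# Joint names, the threshold, and the 0..1 drive interface are assumptions.

JOINTS = ("shoulder", "elbow", "wrist")
RESISTANCE_THRESHOLD = 0.1   # below this normalized resistance, no feedback
MAX_DRIVE = 1.0              # full actuator drive


def piezo_drive_levels(joint_resistance: dict) -> dict:
    """Convert normalized joint resistance (0..1) into actuator drive levels."""
    levels = {}
    for joint in JOINTS:
        r = max(0.0, min(1.0, joint_resistance.get(joint, 0.0)))
        if r < RESISTANCE_THRESHOLD:
            levels[joint] = 0.0          # free movement: actuator stays relaxed
        else:
            # scale linearly from the threshold up to full stiffening
            levels[joint] = MAX_DRIVE * (r - RESISTANCE_THRESHOLD) / (1.0 - RESISTANCE_THRESHOLD)
    return levels


# The elbow actuator stiffens to roughly half drive; the others stay relaxed.
print(piezo_drive_levels({"elbow": 0.55, "wrist": 0.05}))
```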
3.2.2. Motion glove and 3D laser camera controls utilizing piezoelectric feedback
Figure 2. Motion glove with 3D laser and piezoelectric feedback
Motion gloves often offer positional data for the user's hand as well as finger joint state (how bent each finger is, if it is bent at all). The CyberTouch glove by CyberGlove Systems offers finger joint state and provides vibro-tactile actuators for each finger and the palm. This glove, however, does not provide positional data. The positional data can be obtained by means of a 3D laser camera. A 3D laser camera is a sensor which can provide 3D data through the use of laser range finders that scan up/down and left/right. These cameras can be compared to stereoscopic cameras, which use two lenses to record video and compare the two resulting video feeds to calculate the distance to different objects in the field of view. Stereoscopic cameras generally require more processing time to calculate the distance to an object compared to 3D laser cameras; however, 3D laser cameras typically offer only range data, often resulting in more costly computations when detecting, classifying, or visualizing objects in the field of view compared to normal or stereoscopic vision. The CamCube 2.0 by PMD Technologies offers both 3D laser data and a grayscale image in one device. The software can then pick and choose which data to use for different calculations in order to optimize performance. See the description for piezoelectric devices under Device Option 1 (Single camera control utilizing piezoelectric feedback) for an in-depth description of the device.
The CyberGlove would be worn on the user's hand as per the instructions of the device. The 3D laser camera would be placed in a location that would allow it to see the operator's arm and hand so that it would be able to track them properly. The piezoelectric devices could be placed on the user's arm in addition to the vibro-tactile feedback already provided to the hand by the CyberTouch glove. These devices can be placed at each of the joints of the arm (elbow, wrist, and possibly shoulder) or in between joints to offer feedback that the arm has hit an object, if the arm is capable of sensing such events.
The combination of a 3D laser camera and a glove system allows the two devices to complement each other. The 3D laser camera might not pick up all the fine detail movements that can be performed by the hand, but it is capable of detecting and tracking the movements of larger body parts such as a human arm. The glove, on the other hand, can pick up the finer movements of the hand. The piezoelectric devices offer additional feedback in conjunction with the vibro-tactile feedback that is built into the glove.

3.2.3. Motion glove control utilizing built-in feedback
Figure 3. Motion glove with built-in force feedback
The CyberForce device by CyberGlove Systems is a combination of two technologies: the CyberGrasp glove combined with a puppet arm. The CyberGrasp glove offers finger joint state in addition to an exoskeleton for the hand. This exoskeleton produces force feedback through a network of tendons which hinder the movement of the operator's fingers when activated. The puppet arm attaches to the glove and provides location and rotation data in addition to force feedback through resisting movement of the puppet arm's joints. The device would be placed on a table in front of or next to the operator.
The CyberForce device offers force feedback with many features already built into the hardware. The CyberGlove would be able to provide force feedback to the hands and fingers, which could potentially offer more intuitive feedback to the operator and can better represent the current state of the robotic arm. The puppet arm portion of the CyberForce device would need to be placed near the operator due to range constraints (i.e. work area), although the puppet arm offers additional force feedback that would prevent the operator from moving his or her arm in certain directions, again potentially offering more intuitive feedback to the operator.

3.2.4. Puppet control with built-in force feedback utilizing vibrating motors
Figure 4. Puppet arm with built-in and vibrating motor feedback
The Mini-master puppet device allows the operator to move and shape a robotic arm by manipulating a miniature robotic arm that has similar degrees of freedom but is roughly the size of a joystick. The device also has several programmable buttons that can execute a single command or a series of commands back to back. Vibrating motors provide feedback to the operator through the vibrations of one or more small motors. This technology is similar in concept and implementation to the vibrating feature found on most modern cell phones. The Mini-master would be placed in front of or to the side of the operator, most likely on a table or flat surface, and the device could potentially sit in the operator's lap. When the device receives input from the robotic arm, the motors vibrate to varying degrees based on the values of the input. Motors could also be placed in non-joint locations to help the user feel whether other parts of the robot are coming in contact with obstacles, walls, or anything else, provided the robot has the proper sensors to provide this feedback. The motors could be attached to the user via straps, sewn into existing clothing, or built into a sleeve/glove that the user could put on or remove easily.
By having a miniature robotic arm to manipulate, an operator could potentially operate the real robotic arm in a more intuitive manner while controlling all the joints and the manipulator's positions and rotations. The vibrating motors could offer a more intuitive way of not only knowing that the robotic arm is hitting something, but also of knowing where the robotic arm is hitting something. Likewise, these motors can inform the operator how hard the robotic arm is gripping an object, or whether the object being held is slipping out of the robot's grip.

3.2.5. Two camera control utilizing piezoelectric feedback
Figure 5. Two cameras with piezoelectric feedback
A two camera system is capable of determining the distance to an object through various computer vision processing algorithms. Having two cameras also slightly reduces the problem of obscuring the view of the camera that can be experienced in a single camera solution. See the description for piezoelectric devices under Device Option 1 (Single camera control utilizing piezoelectric feedback) for an in-depth description of the device.
The two cameras could be placed in front of, or slightly above, the user pointing at an angle down toward the user, or above the user pointing straight down. The cameras would need to be within a few feet of each other, pointing in the same direction, to make the computer vision algorithms useful. The piezoelectric devices could be placed on the user's hands and arms to help provide feedback. These devices can be placed at each of the joints of the arm (elbow, wrist, and possibly shoulder) or in between joints to offer feedback that the arm has hit an object, or in the palm to help convey how tightly the robotic arm is gripping an object.
Another configuration for a two camera solution would be to have one camera looking down at the user in the same configuration as the single camera solution. The second camera would look at the user from the side. With these two perspectives, there is less of a chance that the view of the user's arm would be obscured in both views. This configuration of cameras would offer better depth perception compared to the single camera solution, since there would be two sources of data coming from different angles. This data can be compared and a more accurate value can be derived.
By utilizing a two camera system, all the object tracking and recognition that a single camera solution can perform can also be applied here. In addition, the depth calculations that can be performed offer useful information about the location of the operator's arm and hand. The piezoelectric devices help to provide force feedback to the user in an intuitive way. The placement of these devices also contributes to the intuitiveness of the system by placing the feedback in the place where the robotic arm is experiencing these sensations as well.

3.2.6. Puppet control utilizing built-in feedback
Figure 6. Phantom puppet arm with built-in force feedback
A puppet arm such as the Phantom by SensAble Technologies offers a reliable, accurate way of sensing position and movement on the operator side. Often these devices can provide feedback in the form of force resistance applied at the joints of the puppet arm, which will prevent or hinder the movement of the operator's arm and hand. More recent advances offer a wide variety of handles to further enhance the experience of using the puppet arm. These custom handles can be made to look and act like various tools such as scissors or pinchers.
The device would be placed on a table in front of or next to the operator due to range constraints (i.e. work area). The puppet arm offers additional force feedback that would prevent the operator from moving his or her arm in certain directions, potentially offering more intuitive feedback to the operator. As an optional add-on, the custom handles could offer additional ways to interact with the robot's surroundings. If the robotic arm were capable of changing end effectors, the custom handles could be swapped out to match the current end effector on the robotic arm. In addition, changing the handle on the puppet arm could potentially send a signal to the robotic arm to change to the matching end effector. For example, if the operator switches to a scissor handle, the robotic arm could change tools to the wire cutting end effector.

3.2.7. Motion glove and 3D mouse controls utilizing vibrating motors
Figure 7. 3D finger mouse combined with a motion capture glove with vibrating motor feedback
The Acceleglove by AnthroTronix is equipped with sensors to track the motions of the operator's hand and fingers. A USB 3D optical mouse can be utilized to control the movements of the robotic elbow while the Acceleglove controls the hand and end effector of the robotic arm. Vibrating motors would then provide feedback to the operator through the vibrations of one or more small motors. This technology is similar in concept and implementation to the vibrating feature found on most modern cell phones.
The Acceleglove would be worn and operated per the instructions given with the device. While the glove was designed to operate on a user's fingers, it could potentially be placed on the user's arm near the elbow to make controlling the elbow of the robotic arm more intuitive. The 3D optical mouse could be worn on the user's finger as per its intended location, with minor changes to the software setup. Vibrating motors could be placed on the user's arm and hand at each of the joints (elbow, wrist, etc.). When the device receives input from the robotic arm, the motors vibrate to varying degrees based on the values of the input. Devices could also be placed in non-joint locations to help the user feel whether other parts of the robot are coming in contact with obstacles, walls, or anything else, provided the robot has the proper sensors to provide this feedback. The motors could be attached to the user via straps, sewn into existing clothing, or built into a sleeve/glove that the user could put on or remove easily.

3.2.8. 3D Joystick with built-in force feedback
Figure 8. Maglev 200 joystick with built-in force feedback
The Maglev 200 is a joystick with built-in force feedback. The stick portion of the joystick is magnetically levitated inside a basin and can detect 6 degrees of movement as the user moves the stick around. This negates any unwanted mechanical resistance or friction that might occur through normal use of the device. Feedback is achieved by increasing the magnetic force within the basin to push the stick in a particular direction based on the input from the robotic arm. At the end of the stick there are buttons which could be utilized to open and close the gripper. Similar to the Phantom, this handle can be replaced with other handles of varying designs.
With 6 degrees of movement, the user could fly the end effector as well as specify the orientation of the end effector. Assuming the wrist movement is dictated through some other means, such as pressing a button to switch control modes, the location and orientation of the elbow could be calculated based on the location and orientation of the end effector. The Maglev would need to be placed in front of or next to the user on a table. In addition to the basin, an accompanying computer and power supply need to be placed nearby.

3.2.9. 3D Mouse utilizing vibrating motors
Figure 9. SpacePilot Pro with vibrating motors
The SpacePilot Pro is a 3D mouse that is commonly used in conjunction with 3D design software such as SolidWorks. Offering 6 degrees of freedom, this mouse has a puck that is moved, pushed, pulled, and rotated to achieve the desired actions. This device also has several programmable buttons available that would allow the user to customize different actions or scripted routines. While this device does not provide any haptic feedback, vibrating motors could be used to provide the appropriate feedback in a similar manner to Device Option 4 (Puppet control with built-in force feedback utilizing vibrating motors). With the puck interface, this device needs very little workspace and small amounts of power, and it offers a wide range of options.

3.2.10. Motion tracking utilizing piezoelectric feedback
Figure 10. MotionStar (wireless) motion capture device with piezoelectric feedback
The MotionStar (wireless) device is a motion capture device that is worn like a vest. Out of the vest, several accelerometers are placed at different locations on the user's arms, hands, and other body parts that you wish to track. This type of device is often used in the film industry to capture the movements of an actor so that artists can control the movements of an animated character. Piezoelectric devices can be added to this system to achieve haptic feedback in a similar manner as Device Option 1 (Single camera control utilizing piezoelectric feedback). This motion capture vest could be built right into existing clothing or a uniform, which would reduce setup and breakdown times as well as make the system more portable. A separate computer processes the signals from the MotionStar device and translates them into the appropriate resulting actions.

4. ANALYSIS AND WEIGHTING OF MISSION NEEDS
To help evaluate the candidate technologies, we developed categories and metrics based on user needs and the required control. We then produced a weighting for each category and each individual metric within a category. Table 1 shows an example of this weighting for a few categories. These weightings are stored in a spreadsheet to allow them to be modified as appropriate, but together they identify the most important aspects of a given option. Any modifications to weights or categories will impact all downstream metric evaluations.

Table 1. Sample table of categories, metrics, and weights to evaluate system options
Ruggedness (category weight 4):
  Physical durability of system - 10
  Physical durability of interaction point - 7
  Connection ports - 4
Usability (category weight 5):
  External conditions - 10
  Heads up control - 8
  Impact of tether (wired/wireless) - 7
  Workspace - 5
  Expandable - 2
  Adaptable to mission - 6
  Battery life - 4
  Degrees of freedom - 3
  Update rate - 5
  Resolution of movement - 7
Risk (category weight 3):
  State of development - 10
  Ease of integration - 5
  Additional development needed - 3

The full list of categories (Measures of Effectiveness) includes Ruggedness, Impact to Warfighter, Intuitiveness, Cost, Power, Usability, and Risk, which are all described in more detail below. Each of these categories is given a relative weight of its importance compared with the other categories (e.g. a weight of 4 is twice as important as a weight of 2). The rows of the table represent the measurable metrics (Measures of Performance).
As with the categories, each metric is given a weight to show its relative importance with respect to other metrics within a category. We developed these weights by first identifying which metrics are most relevant to the evaluation of a category and giving those metrics a weight of 10. Each other metric affecting a category was then evaluated for its importance relative to the highest scoring metrics. To the extent possible, we selected metrics and categories that were not correlated, to prevent artificially overweighting elements of the evaluation space.

4.1. Criteria for device option evaluation
In order to evaluate the device options, criteria needed to be put together that represented what a device option should be able to offer to the operator. These criteria needed to be more specific than general terms such as "help the operator complete a mission" or "it has to be rugged," but at the same time more generic than terms such as "it needs to be able to allow the operator to unplug a power cord from a wall socket." The following categories and sub-criteria were created:

4.1.1. Usability
Usability refers to how simple or straightforward the system is to use. This is different from intuitiveness; intuitiveness refers to how easily an untrained operator is able to use the system, while usability refers to how efficient and effective a trained operator is on the system.
External conditions - How sensitive the system is to light, rain, sand, snow, heat/cold, and other environmental conditions. For example, laser-based systems may not work well under bright light, and vision-based systems may require light to work.
Heads up control - Is the user able to look around and be aware of his or her surroundings while also operating the robot? Does the Warfighter need undivided attention to control the UGV?
Impact of tether (wired/wireless) - Will any wires impede the movements of the user?
Workspace - The amount of space that is required for a user to use the device.
Expandable - The ability to expand to have one or more users control multiple arms. Will this device be able to control current and future robotic arms?
Adaptable to mission - The ability for a device to be flexible enough to handle multiple missions with varying objectives.
Degrees of freedom - Can a user control all the degrees of freedom that the robotic arm has to offer?
Update rate - The rate at which data is collected, processed, and fed to the robot as commands from the control device, or how quickly a haptic device reacts to the robot touching an object.
Resolution of movement - Can the user control the robotic arm with both fine and broad movements?

4.1.2. Intuitiveness
The best overall solution will be effective because it performs in an intuitive way. In other words, the system should require a minimum of training for a user to be effective.
Intuitive - A subjective evaluation of how close the solution is to natural human movement or to known, established systems.
Feedback intensity - How strong of a feedback sensation is provided? Will the user feel the feedback in stressful situations? Will the feedback be too strong and prevent the user from effectively using the control portion?
Feedback type - How appropriate is the feedback?
For example, a vibration in the hand might mean the robot is holding something, or it might mean the object is slipping out of the robot's grasp; it is not intuitive which of these the vibration is trying to convey to the user.
Location of feedback - Where the feedback is applied (finger, wrist, arm, etc.) has an effect on how intuitive that feedback is to the user.

4.1.3. Impact to Warfighter
The Impact to Warfighter category focuses on detrimental impacts on a Warfighter's ability to complete the overall mission.
Dedicated use of hands - Does the device require the user's hands to be dedicated to the device, or are they capable of doing other things if needed? To switch to another task, does the user need to put down the device or take it off, or can they simply walk away?
Setup/teardown - The amount of time it would take to set up and tear down the system, preferably in a hurry.
Signature - Anything that produces a detectable signature, such as light, sound, or RF, that might give away the Warfighter's position.
Posture constraints - Required range of motion. Whether the Warfighter can be prone and use subtle movements to control, or whether the Warfighter needs full range of motion for arm movement.
System size - The room needed for all of the hardware involved in the system.
Ease of transport (awkwardness) - How easy it is to transport the system.
Weight - The total weight of the entire system.

4.1.4. Ruggedness
Ruggedness attempts to measure the solution's durability and its appropriateness to withstand the harsh conditions of a battlefield environment.
Physical durability of the system - Overall, how durable is the device option?
Physical durability of the interaction point - The point of interaction will likely receive the most use and thus needs to be at least as durable as the rest of the system.
Connection ports - A combination of how resistant the connection ports are to the elements and vibrations, and how easy they are to disconnect.

4.1.5. Cost
Cost is a consideration for this stage of analysis, but receives a low category weight because it is early in the research and development phase. By including cost in the considerations, the relative cost of solutions is monitored and can be weighted more heavily as we move to fielded solutions. However, the main focus of this early research is on the technology and mission appropriateness.
Cost - The overall system cost.

4.1.6. Power
Because we are not sure of the final location where this system will be used (i.e. on board a vehicle or remote from a vehicle), we wanted to include an evaluation category that looked at the power required to run the solution. As with cost, this category is given a fairly low weight for this stage of analysis, since the primary focus is on the technology's appropriateness for fielded military use, but we included it so the overall impact of power can be evaluated as the final concept of operation is more clearly defined. The evaluation of these metrics was often fairly subjective, since this is early in the system design and many of the component technologies can be modified to accommodate different batteries and can be optimized for available power sources.
Battery life - Related to battery capacity and power consumption.
Battery type - Is it a commonly available battery?
Power consumption - The overall power consumption of the system.

4.1.7. Risk
Risk is the category we use to evaluate the overall system risk.
This includes risk elements involved with the state of development of individual components included in the solution, as well as risk involved with any middleware or glue required to integrate different components into an overall system. The low weight on the risk component is not an indicator that this is not an important criterion for fielded systems; rather, it reflects the fact that these are prospective solutions that are a long way from fielding, so our primary focus is on evaluating the merits of the technologies. Additional hardening and risk reduction techniques can be used later in the development process to reduce overall program risk.
State of development - A COTS device that is widely used and proven is ideal.
Ease of integration - Will this device be easy to integrate with the rest of the system, including software, electrical, and mechanical aspects?
Additional development needed - Is there additional development that is needed before the device can be productized?

4.2. Analysis of combined components
To evaluate the relative value of each of the potential solutions, we had a team of internal engineers independently research the various options proposed and score each of the potential solutions against the metrics described in the previous section. For each solution, the engineers ascribed a score between 0 and 10 for each metric (10 being "good"). The spreadsheet application then applied the weights for each metric and each category to the scores given and normalized the responses into an overall score between 0 and 100, where 100 would represent a perfect solution. The final order of ranked solutions is shown in Table 2.

Table 2. Final ordered rank of solutions (controls / feedback, with score in %)
1. Single cam / piezoelectric - 92.0
2. Two cams / piezoelectric - 91.6
3. CyberTouch, 3D cam / piezoelectric - 88.7
4. Mini-master / vibrating motors, Mini-master - 87.9
5. CyberForce / CyberForce - 74.6
6. Acceleglove, 3D finger mouse / vibrating motors - 67.9
7. Phantom / Phantom - 67.7
8. Xbox controller / vibrating motors - 61.4
9. MotionStar (wireless) / piezoelectric - 60.0
10. SpacePilot Pro / vibrating motors - 56.3
11. Maglev 200 / Maglev 200 - 51.9

4.2.1. Recommended system solutions
Based on the evaluation of the individual components as well as the set of potential system solutions that we've systematically evaluated, moving forward with two systems for further evaluation is recommended. The CyberTouch glove with the 3D cam combined with piezoelectric feedback, and the high resolution camera (both single and double) with piezoelectric feedback, have excellent benefit and should be explored further before a final decision is made. Exploration of the major components of these approaches has begun during Phase I experimentation. This experimentation is discussed in the following section. Conducting a more detailed hands-on evaluation of the CyberForce and Mini-master technologies to better understand their capabilities and limitations is also recommended. This detailed research into the leading solutions from Phase I will be conducted as part of the Phase II effort.
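For concreteness, the following sketch shows one way the weighted scoring described in section 4.2 could be implemented. The structure mirrors Table 1 (metric scores of 0-10, metric weights, category weights, normalization to 0-100), but the code and the subset of metrics shown are our own illustration rather than the authors' actual spreadsheet.

```python
# Illustrative sketch of the weighted scoring in section 4.2 (not the authors' spreadsheet).
# Each metric score is 0-10; metric and category weights follow the pattern of Table 1,
# using only a subset of the metrics for brevity.

CATEGORIES = {
    # category: (category weight, {metric: metric weight})
    "Ruggedness": (4, {"Physical durability of system": 10,
                       "Physical durability of interaction point": 7,
                       "Connection ports": 4}),
    "Usability":  (5, {"External conditions": 10, "Heads up control": 8, "Workspace": 5}),
    "Risk":       (3, {"State of development": 10, "Ease of integration": 5}),
}


def overall_score(metric_scores: dict) -> float:
    """Weight each 0-10 metric score, roll up per category, and normalize to 0-100."""
    weighted_total = 0.0
    max_possible = 0.0
    for cat_weight, metrics in CATEGORIES.values():
        for metric, m_weight in metrics.items():
            score = metric_scores.get(metric, 0)
            weighted_total += cat_weight * m_weight * score
            max_possible += cat_weight * m_weight * 10
    return 100.0 * weighted_total / max_possible


# A candidate that scores 8 on every listed metric lands at 80 out of 100.
print(overall_score({m: 8 for _, ms in CATEGORIES.values() for m in ms}))
```

Normalizing by the maximum attainable weighted total keeps the result on a 0-100 scale regardless of how many metrics or categories are added.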
5. CONTROL INPUT DEVICE EXPERIMENTATION
A control input device experiment was performed to help reduce the risk of the project while exploring a variation on one of the potential solutions mentioned in the previous section. Along with a description of the experiment, the feedback system is described below.

5.1. Video processing arm control
There were several challenges we set out to address with video processing. The first was determining how accurately we could control the manipulator with only one camera. Another critical aspect was determining how intuitive this type of control can be for an operator, especially with haptic feedback. A third element of the experiment was determining whether a glove was needed for accurate control.
Figure 11 shows the basic setup for our experiment. In this setup, we placed a single web camera on a stand looking directly down on a white background. The user places his hand in the camera's view and performs a short (less than 30 seconds on average) calibration process. Since the background is white, the software can easily detect and track the user's hand; future versions of the software can potentially find a user's hand in cluttered environments. With the software properly tracking the user's hand, the calibration routine asks the user to move his or her hand to a position that maps to a downward movement of the robotic arm at full speed. The user then moves his or her hand up towards the camera to a position that will be considered an upward movement of the robotic arm at full speed. During this calibration process, the software records both the length and width (in pixels) of the user's hand. With a maximum downward movement value and a maximum upward movement value, the software can calculate a minimum downward movement value and a minimum upward movement value, which gives the user a dead zone in the work area where the robot arm will not be commanded to move. These minimum values are the points at which the robot arm goes from not moving at all to moving at its slowest speed. This concept is similar to the one used for the left, right, forward, and backward movements explained later on.
Figure 11. Video processing experiment used to explore intuitive manipulator control.
Once calibrated, the user can move his or her hand up, down, left, right, forward, and backward. The software detects the user's hand and tracks the movements, translates the movements into velocity commands, and signals the tip of the robotic arm to move in the direction of the operator's hand. Future versions of the software will allow for a positional mode in addition to the current velocity mode. The distance away from the camera is calculated by comparing the width of the user's hand against the values obtained in the calibration routine. Since the maximum downward movement and maximum upward movement positions are relatively close to each other from the perspective of the camera, there is no need for elaborate calculations to determine how far the user's hand is from the camera. A simple linear relationship between the perceived width of the user's hand (in pixels) and the distance away from the camera is sufficient to control the robotic arm with fairly good accuracy. Also, there is no need to translate these values into real-world units such as inches or millimeters; the process is perfectly content to use pixels as its units throughout all the calculations.
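As an illustration of the calibration-based mapping just described, the sketch below (ours; the calibration values and the linear form are assumptions consistent with the text) converts the perceived hand width in pixels into an up/down velocity command with a dead zone, staying in pixel units throughout.

```python
# Illustrative sketch of the dead-zone velocity mapping described above.
# Calibration records the hand width (in pixels) at the full-speed-down and
# full-speed-up positions; everything stays in pixel units, as in the experiment.

WIDTH_FULL_DOWN_PX = 60.0   # hand far from the camera -> arm moves down at full speed
WIDTH_FULL_UP_PX = 140.0    # hand near the camera -> arm moves up at full speed
DEAD_ZONE_FRACTION = 0.15   # central band where no command is sent
MAX_SPEED = 1.0             # normalized arm speed


def vertical_velocity(width_px: float) -> float:
    """Map perceived hand width (pixels) to an up/down velocity command."""
    center = (WIDTH_FULL_DOWN_PX + WIDTH_FULL_UP_PX) / 2.0
    half_range = (WIDTH_FULL_UP_PX - WIDTH_FULL_DOWN_PX) / 2.0
    offset = (width_px - center) / half_range          # -1 (full down) .. +1 (full up)
    offset = max(-1.0, min(1.0, offset))
    if abs(offset) < DEAD_ZONE_FRACTION:
        return 0.0                                     # inside the dead zone: arm holds still
    # ramp from zero at the dead-zone edge up to MAX_SPEED at the calibrated extremes
    sign = 1.0 if offset > 0 else -1.0
    return sign * MAX_SPEED * (abs(offset) - DEAD_ZONE_FRACTION) / (1.0 - DEAD_ZONE_FRACTION)


print(vertical_velocity(100.0))              # 0.0: hand in the dead zone
print(round(vertical_velocity(140.0), 2))    # 1.0: full-speed upward command
```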
The tracking software is also capable of detecting when the user's hand is open or closed and can command the robotic arm to open and close the gripper accordingly. This is accomplished by looking at the ratio of the length of the user's hand (fingertip to wrist) to the width of the user's hand. During the calibration routine, the user's hand is kept in the open position, as seen in Figure 12, and this ratio of length to width is recorded. During normal use of the software, the ratio of the length of the user's hand to its width is checked against this calibrated value. If the user's hand is clenched in a fist, the ratio of length to width drops closer to a 1:1 ratio, and the hand is classified as being closed, thus commanding the robot hand to close as well.
Figure 12 shows the computer interface developed for visualizing the critical elements of the video processing software. The computer interface offers information about where the application thinks the user's hand is located and what values are being sent to the robotic arm. A gray box is drawn at the center of the screen to indicate where the center or zero point is located, which results in the robotic arm not moving. Likewise, a second, larger box represents the maximum speed at which the robotic arm can move in the forward/backward and left/right directions. A third box is drawn around the user's hand to help identify where tracking is occurring. If the user moves his hand slightly outside the first, inner no-movement box, the robotic arm will start to move very slowly. As the user's hand is moved further away from the inner box and towards the outer box, the robotic arm increases speed until it reaches its maximum speed at the point where the user's hand reaches the outer box. We use a similar scaling vertically, but instead of using a box, vertical hashes on a horizontal line show the perceived speed of movement vertically. A thick vertical purple line along this horizontal line shows where along the scale the user's hand is calculated to be.
Figure 12. The computer interface for the experiment allowed us to visualize how well the image processing algorithms were working.
A simple haptic feedback device was constructed to provide feedback to the user while the robotic arm grips an object. A force sensor mounted in the finger of the manipulator sends signals back to the control computer, where they are translated into different levels of intensity for a vibrating motor. A small vibrating motor is placed in a small rubber ball to help enhance the vibrations of the motor (Figure 13). This ball is mounted on a metal rod which has hook and loop straps on the end without the ball. The operator attaches this device to his or her arm such that the ball sits in the user's palm and the straps secure the vibrating ball device to the user's wrist and forearm. The wires from the vibrating motor run along the metal rod to help keep them out of the way of the user. For the experimental application, these wires run from the arm to a unit resting on the test platform. A fielded version of this system would be battery operated and wireless to prevent the need for this tether.
Figure 13. Prototype haptic feedback device.
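A minimal sketch of the force-to-vibration mapping used by the prototype palm device is shown below; the sensor range, the number of intensity levels, and the function interface are assumptions, since the paper only states that grip force is translated into motor intensity.

```python
# Illustrative sketch: translate a fingertip force reading into one of a few
# vibration intensity levels for the palm-mounted motor. All ranges are assumed.

FORCE_MIN_N = 0.0     # no contact
FORCE_MAX_N = 20.0    # assumed sensor saturation
NUM_LEVELS = 4        # off, light, medium, strong


def vibration_level(force_newtons: float) -> int:
    """Return a discrete motor intensity level (0 = off) for a grip-force reading."""
    f = max(FORCE_MIN_N, min(FORCE_MAX_N, force_newtons))
    if f == 0.0:
        return 0
    # bucket the nonzero range into NUM_LEVELS - 1 increasing intensities
    fraction = f / FORCE_MAX_N
    return 1 + min(NUM_LEVELS - 2, int(fraction * (NUM_LEVELS - 1)))


# Light touches map to a gentle buzz; readings at or above saturation hit the top level.
for f in (0.0, 3.0, 12.0, 25.0):
    print(f, "->", vibration_level(f))
```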
Force values are read from a sensor mounted on the fingertip of the robotic arm's gripper. The intensity of the vibrating motors corresponds to how tight the grip is on an object.
This experiment clearly demonstrated the potential of using a single camera for control of a robotic arm. In our experiment, the operator was able to keep his attention focused on the mission while intuitively controlling the manipulator and gripper as if they were his own arm and hand. Aided by the haptic feedback through the prototype palm device, the operator was able to select a plastic bottle from the table, pick it up, and set it down on the other end of the table. He repeated this exercise several times. For this to be a viable solution, we will need to incorporate additional sensors in the fingers (perhaps through a glove) and be able to operate in a more cluttered environment, which creates a more challenging processing requirement for the image analysis. This system is also limit