




Measuring engagement elicited by eye contact in Human-Robot Interaction

Kompatsiari K., Ciardo F., De Tommaso D., Wykowska A.*

2019 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Macau, China, November 4-8, 2019

*The project has received funding from the European Research Council (ERC) under the European Union's Horizon 2020 research and innovation programme (grant awarded to A.W., titled "InStance: Intentional Stance for Social Attunement", grant agreement No: 715058). All authors are with the Istituto Italiano di Tecnologia, Genova, 16145. K.K. is also affiliated with Ludwig-Maximilians-Universität (DE). A.W. is an adjunct professor in Engineering Psychology at Luleå University of Technology. Corresponding authors: A.W., phone: +39 020 8172 242, email: agnieszka.wykowska@iit.it; K.K., email: kyveli.kompatsiari@iit.it.

Abstract: The present study aims at investigating how eye contact established by a humanoid robot affects engagement in human-robot interaction (HRI). To this end, we combined explicit subjective evaluations with implicit measures, i.e. reaction times and eye tracking. More specifically, we employed a gaze cueing paradigm in an HRI protocol involving the iCub robot. Critically, before moving its gaze, iCub either established eye contact with the user or not. We investigated participants' patterns of fixations on the robot's face, joint attention, and subjective ratings of engagement as a function of eye contact or no eye contact. We found that eye contact affected implicit measures of engagement, i.e. longer fixation times on the robot's face during eye contact. Moreover, we showed that joint attention was elicited only when the robot established eye contact, whereas no joint attention occurred when it did not. On the contrary, explicit measures of engagement with the robot did not vary across conditions. Our results highlight the value of combining explicit with implicit measures in an HRI protocol in order to unveil underlying human cognitive mechanisms, which might be at stake during the interactions. These mechanisms could be crucial for establishing an effective and engaging HRI, and provide guidelines to the robotics community with respect to better robot design.

I. INTRODUCTION

A. Measuring engagement in HRI

Engagement with a robot partner affects the initiation, maintenance, and end of the interaction; thus, it is a crucial factor in successful and natural human-robot interaction (HRI) [1]. Therefore, it is imperative to address the issue of engagement in HRI research. As stated in [2, p. 1]: "Engagement is a category of user experience characterized by attributes of challenge, positive affect, endurability, aesthetic and sensory appeal, attention, feedback, variety/novelty, interactivity, and perceived user control". Studies that have examined engagement in HRI used both explicit (e.g., [3]-[5]) and implicit measures [6]-[13]. Explicit measures and questionnaires, while providing valuable hints regarding the phenomenon of interest, suffer from several limitations. First, they rely on explicit reports, meaning that participants need to be able to consciously assess their inner states. Furthermore, explicit measures depend on introspective abilities and on the interpretation of the questions, and can be prone to various biases, such as the social desirability effect [14]. Finally, explicit responses are not sufficiently informative with respect to the specific cognitive mechanisms involved, which are implicit and automatic, and thus not necessarily accessible to conscious awareness. In natural interactions, people are often not aware that their brains employ certain mechanisms and processes. However, thanks to the careful design of experimental paradigms inspired by research in cognitive science that target specific cognitive mechanisms, we can collect objective implicit metrics and draw conclusions about what cognitive processes are at stake [15]-[17].
Typically, psychologists use performance measures (e.g., reaction times and error rates) to study mechanisms of perception, cognition, and behavior, and also the social aspects thereof: for example, joint attention (e.g., [15]-[22]) or visuospatial perspective taking [23]-[24]. As such, these measures have informed researchers about the respective cognitive processes with high reliability, and without the necessity of participants being aware of the processes under investigation. In addition to performance measures, researchers have also widely used other implicit measures, behavioral (e.g., eye tracking or motion capture) or neurophysiological/neuroimaging: for example, electroencephalography (EEG), galvanic skin response (GSR), or functional magnetic resonance imaging (fMRI) [25]. Those measures provide a valuable source of information regarding neural and physiological correlates of behavior.

B. Joint attention as a measure of engagement in HRI

One implicit measure of engagement in social interactions is joint attention (JA). JA occurs when two agents direct their focus of attention to the same object or event in the environment. This fundamental mechanism is a basis for many other complex processes involved in social interactions [26]-[30], like referential communication. In fact, an impaired ability to engage in JA has been reported in individuals diagnosed with autism spectrum disorder [31]. In human-computer interaction (HCI) and HRI research, JA has been postulated to be a marker of engagement [6], [32]. For instance, Anzalone et al. used JA among other dynamic metrics (synchrony, imitation) to evaluate engagement in HRI [6]. Peters et al. [32] defined the level of engagement between a user and a virtual agent by measuring JA, i.e. how much the user looked at objects looked at or pointed at by the virtual agent. Moreover, Kasari et al. [33] showed that JA-mediated interventions increased engagement of toddlers during interaction with caregivers.

Researchers in cognitive psychology have operationalized JA in the form of the gaze cueing paradigm [18]-[19]. This is an attentional task in which participants are presented with a face on the computer screen. The face initially has its eyes either closed or directed straight ahead. Subsequently, the direction of the gaze is shifted to one of the sides of the screen, and a target appears either at the gazed-at or at a different location. Participants' task is either to determine the target's identity or simply to respond to its presence. When participants "engage" in JA with the "gazer", they attend to where the gazer shifts his/her eyes. Therefore, detection/discrimination of a target at the gazed-at location is faster and more accurate than at other locations; this effect is known as the gaze cueing effect (GCE), and it is considered a behavioural index of JA. Recent studies showed that the GCE can be elicited in naturalistic and ecologically valid paradigms and that it is reflected, apart from performance measures, also in EEG [34]-[36], fMRI [37]-[39], and eye tracking [40]-[41] measures.
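To make the logic of the GCE concrete, the sketch below computes it from trial-level reaction times as the mean RT difference between invalidly and validly cued trials, with a positive value indicating orienting to the gazed-at location. This is a minimal illustration under assumed data structures (the file name and column names are placeholders), not the authors' analysis code.

```python
import pandas as pd

# Hypothetical trial-level data: one row per trial with columns
# 'participant', 'validity' ('valid' = target at the gazed-at location,
# 'invalid' = target at the other location), 'correct' (0/1), 'rt' (ms).
trials = pd.read_csv("gaze_cueing_trials.csv")

# Keep correct responses only; error trials are excluded from RT analyses.
correct = trials[trials["correct"] == 1]

# Mean RT per participant and cue-target validity.
mean_rt = correct.groupby(["participant", "validity"])["rt"].mean().unstack()

# Gaze cueing effect: invalid minus valid RT, per participant.
# A positive GCE indicates attentional orienting to the gazed-at location.
mean_rt["GCE"] = mean_rt["invalid"] - mean_rt["valid"]
print(mean_rt["GCE"].describe())
```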
Here, we would like to additionally focus on eye tracking as an implicit measure of engagement [6]-[7], [12], as eye movements are particularly informative with respect to attentional processes [42]. In the context of social interaction, eye movements are not only informative with respect to the individual's attentional focus, but they also signal to others where attention is oriented. As such, they are one of the most important social signals with which we convey our inner mental states [28]. Despite our sensitivity to gaze shifts, the contribution of other cues to our attentional orienting should not be downplayed, e.g., head orientation and body posture [43]-[44].

C. Aim of study and related work

In this study, we aimed at examining whether eye contact established by the iCub robot [45]-[46] would influence engagement in HRI, measured by two implicit objective markers: JA (by means of the GCE) and patterns of fixations on the face of the robot during eye contact. Eye contact is one of the most important social signals communicating the intention to engage in an interaction. Indeed, eye contact between humans has been shown to affect various cognitive processes such as attention or memory, and also physiological states, for example, arousal [47]-[49]. In the context of HRI, research examining the effect of eye contact has mainly focused on subjective evaluations of the robot [50]-[55], and on how it relates to engagement [11]. In the present study, we address for the first time the impact of eye contact on two different implicit measures of engagement: the GCE and patterns of fixations on the robot's face. Such measures should allow for a more in-depth analysis of the cognitive mechanisms that are affected by eye contact in HRI. Kompatsiari et al. [17] showed that eye contact established by a robot influences JA, in the sense that a larger GCE was observed in the eye contact condition, as compared to the no eye contact condition. However, it remains to be examined and understood what specifically causes this effect. Is it because eye contact has a "freezing" effect on attentional focus, thereby causing longer disengagement times from the robot's face and a longer time to reallocate attentional focus to a different location? Or perhaps there are some other attention mechanisms at stake? In the current study, we address this question by employing an eye tracking methodology and investigating the patterns of fixations on the robot's face in the context of eye contact and no eye contact. Answering the question of precisely which cognitive mechanisms are affected by eye contact is not only of theoretical interest, but it also has implications for robot design. If eye contact attracts attention to the face of the robot to the point that it creates delays in disengagement, it might be a positive factor for social interaction and engagement, but it might impair performance in other tasks where a reallocation of attentional focus is critical.

II. METHODS

A. Participants

In total, twenty-four healthy adults (mean age = 25.25 ± 4.01, 9 female, 2 left-handed) took part in the experiment. All had normal or corrected-to-normal vision, and they received an honorarium of 15 euros for taking part in the experiment. They were all naive with respect to the purpose of this study, and they were debriefed at the end of the experimental session. The experiment was conducted at the Istituto Italiano di Tecnologia (Genoa, Italy). Written consent was obtained from each participant before the experimental session.
The study was approved by the local ethical committee (Comitato Etico Regione Liguria).

B. Stimuli and Apparatus

The experiment was performed in an isolated and noise-attenuated room. Participants were seated opposite iCub, on the other side of a desk, with their eyes aligned with iCub's eyes. The target stimuli were the letters V or T (3°32′ high, 4°5′ wide), and they were presented on two screens (27 inches) laterally positioned on the desk (75 cm apart, centre-to-centre). The screens were tilted back (by approximately 12° from the vertical position) and were rotated to the right (right screen) or left (left screen) by 76°. iCub's gaze was directed to five different Cartesian coordinates: resting, towards a point between the desk and the participant's upper body; eye contact, towards the participant's eyes; no eye contact, towards the desk; left, towards the left screen; and right, towards the right screen (see [17], [52] for a similar procedure). We used iCub's gaze controller [56] for controlling the robot's gaze, specifically the eyes and the neck. The controller uses inverse kinematics to find the eye and neck poses for looking at desired Cartesian coordinates in the robot's frame. In addition, it produces joint movements that follow a minimum-jerk velocity profile. The trajectory time for the movement of the eyes and the neck was set to 200 ms and 400 ms, respectively. The vergence of the eyes was set to 3.5 degrees and kept constant. The participant's eyes were detected by the robot's stereo cameras using a face detector algorithm. When the eyes were not detected by the algorithm, the robot was programmed to look straight ahead. Since participants were seated face-to-face with iCub and their eyes were aligned with iCub's eyes, this procedure ensured the establishment of eye contact even in the rare case of the algorithm's failure. The Cartesian coordinates of the target positions were defined according to predefined values of pitch, roll, and yaw of the neck's joints. These angles were selected so as to ensure balanced joint displacements between conditions, i.e. a displacement of 12° in the pitch between resting-eye contact and resting-no eye contact, and a displacement of 27° in the yaw, 12° in the pitch, and 7° in the roll between eye contact-left or right and no eye contact-left or right. Table 1 shows the desired and measured angles of the neck.

Table 1. Robot's gaze positions. EC represents eye contact; no EC represents no eye contact. Angles are in degrees.

Desired positions
Position    roll    pitch     yaw
Resting      0.0    -12.0     0.0
EC           0.0      0.0     0.0
No EC        0.0    -24.0     0.0
Left        -7.0    -12.0    27.0
Right        7.0    -12.0   -27.0

Measured positions
Position        roll            pitch             yaw
Resting    0.09 ± 0.05    -12.75 ± 0.02    -0.02 ± 0.08
EC         0.14 ± 0.08     -0.28 ± 0.59    -0.19 ± 0.86
No EC     -0.04 ± 0.03    -24.01 ± 0.05   -0.002 ± 0.01
Left      -7.18 ± 0.08    -12.13 ± 0.08    27.56 ± 0.12
Right      6.93 ± 0.03    -12.05 ± 0.05   -27.65 ± 0.11
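As an illustration of the gaze configuration described above (trajectory times, fixed vergence, Cartesian fixation points), below is a minimal sketch of commanding such gaze behavior from Python. It assumes the YARP Python bindings and a running iKinGazeCtrl-style gaze controller; the port names and fixation-point coordinates are placeholders, not the values used in the study.

```python
import yarp

yarp.Network.init()

# Connect to the gaze controller (e.g., iKinGazeCtrl) as a client device.
options = yarp.Property()
options.put("device", "gazecontrollerclient")
options.put("remote", "/iKinGazeCtrl")
options.put("local", "/engagement-demo/gaze")
driver = yarp.PolyDriver(options)
gaze = driver.viewIGazeControl()

# Trajectory times as reported in the text: 200 ms (eyes), 400 ms (neck).
gaze.setEyesTrajTime(0.2)
gaze.setNeckTrajTime(0.4)

# Keep vergence constant at 3.5 degrees, as in the experiment.
gaze.blockEyes(3.5)

# Look at a Cartesian point in the robot's root frame (placeholder values;
# in the iCub root frame x points backward, so a point in front of the
# robot has negative x; units are meters).
fixation = yarp.Vector(3)
fixation.set(0, -0.5)
fixation.set(1, 0.0)
fixation.set(2, 0.3)
gaze.lookAtFixationPointSync(fixation)
gaze.waitMotionDone()

driver.close()
yarp.Network.fini()
```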
C. Procedure

A full experimental session lasted about 40 minutes. Participants were instructed to fixate on the robot's face while performing the task. The sequence of events was the following: each trial started with the robot having its eyes closed at the resting position. After 2 s, the robot opened its eyes for 500 ms. During this time, the robot extracted information related to the position of the face and the eyes of the participant without making any movement. Then, it looked either down, to the predefined position, in the no eye contact condition, or directly at the eyes of the participant in the eye contact condition. After the movement was completed, iCub kept its gaze at that position for 2 s. This means that the eye contact/no eye contact duration was 2 s. Subsequently, the robot's head and eyes shifted to either the left or the right screen. Head direction was not predictive with respect to target location (i.e. cue-target validity = 50%). 1000 ms after the onset of the robot's gaze shift, a letter appeared on one of the lateral screens. After 200 ms, the screens turned blank until the participant's response. The trial expired if participants did not reply within 1500 ms. The experiment consisted of 16 blocks of 16 trials each. Each block was assigned to either the eye contact or the no eye contact condition. The order of the blocks was counterbalanced across participants, starting either with a no eye contact block or with an eye contact block. Cue-target validity was randomized across blocks (i.e. cue-target validity = 50% in each block). At the end of each block, participants were asked to rate their engagement level with the robot on a 10-point Likert scale (1 = strongly not engaged; 10 = strongly engaged).

D. Eye Tracker Recordings

Eye movements were recorded using a wearable eye tracker (Tobii Pro Glasses 2) at 100 Hz. The head unit of the Tobii Pro Glasses comprises two eye cameras per eye, allowing for binocular recording of pupil positions. The eye tracking technology is based on pupil centre corneal reflection (PCCR) and dark pupil tracking. A full-HD scene camera (1920 x 1080 pixels at 25 fps) is embedded in the head unit, with a field of view of 90°, 16:9.

E. Analysis

1) Exclusion Criteria
Three participants were excluded from the analysis due to eye movement recording issues, i.e. two recordings could not be opened with the Tobii Pro Lab software, and in one recording iCub's face was not fully inside the participant's field of view. One further participant was excluded from the analysis, as s/he failed to follow task instructions (i.e. the percentage of fixations on iCub's face was at chance level). The analysis was run on a final sample size of N = 20.

2) Eye Tracker
Firstly, we defined our Area of Interest (AOI) as iCub's face. The AOI was defined independently for the data collected in the two experimental conditions, since the image of iCub's face differs between them (eye contact: looking straight; no eye contact: looking down). Participants' raw gaze data were mapped inside or outside the desired AOI using the default mapping algorithm of Tobii Pro Lab. Fixations were extracted using the default parameters of the fixation filter in Tobii Pro Lab for the majority of the parameters (Tobii I-VT fixation filter [57]). Specifically, gap fill-in interpolation was not applied, noise was removed by a moving median filter of 3 samples, the window length of the velocity calculator was set to 20 ms, the velocity threshold was set to 30°/s, and adjacent fixations were not merged. However, we lowered the default minimum fixation duration from 60 ms to 30 ms in order to also extract very short fixations. For each trial, we extracted the number of fixations within the AOI and their duration in ms for the gaze condition phase (i.e., the time between the resting and the lateral movement, equal to 2000 ms). If the trial belonged to the eye contact condition, the data were mapped to the AOI of iCub looking straight; in the same way, if the trial belonged to the no eye contact condition, the data were mapped to the AOI of iCub looking down. Paired sample t-tests were performed to test the statistical difference between the eye contact and no eye contact conditions with regard to the percentage of fixations and the fixation durations inside our AOI, i.e. iCub's face.
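For illustration, the sketch below mirrors this pipeline on a toy scale: a velocity-threshold (I-VT-style) classification of gaze samples into fixations using the parameters reported above (30°/s threshold, 30 ms minimum duration), followed by a paired sample t-test across conditions. It is a schematic re-implementation under assumed data structures, not the Tobii Pro Lab filter itself, and the per-participant duration arrays are placeholders.

```python
import numpy as np
from scipy.stats import ttest_rel

def extract_fixations(t, velocity, v_thresh=30.0, min_dur=0.030):
    """I-VT-style classification: samples below the velocity threshold
    (deg/s) count as fixation samples; contiguous runs shorter than the
    minimum duration (s) are discarded. Returns (start, end) times."""
    is_fix = velocity < v_thresh
    fixations, start = [], None
    for i, f in enumerate(is_fix):
        if f and start is None:
            start = i
        elif not f and start is not None:
            if t[i - 1] - t[start] >= min_dur:
                fixations.append((t[start], t[i - 1]))
            start = None
    if start is not None and t[-1] - t[start] >= min_dur:
        fixations.append((t[start], t[-1]))
    return fixations

# Toy example: 2 s of gaze samples at 100 Hz with synthetic velocities.
rng = np.random.default_rng(0)
t = np.arange(0.0, 2.0, 0.01)
velocity = rng.gamma(shape=2.0, scale=8.0, size=t.size)  # mostly < 30 deg/s
print(extract_fixations(t, velocity))

# Paired comparison of per-participant mean fixation durations (ms)
# between conditions (placeholder arrays, one value per participant).
dur_eye_contact = np.array([620.0, 580.0, 710.0, 650.0, 600.0])
dur_no_eye_contact = np.array([540.0, 520.0, 660.0, 610.0, 555.0])
print(ttest_rel(dur_eye_contact, dur_no_eye_contact))
```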
3) Behavioral Data
The errors were 3.2% ± 2.1