<?xml version="1.0" encoding="UTF-8"?><xml><records><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">John C Murray</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Developing Preferential Attention to a Speaker: A Robot Learning to Recognise its Carer</style></title><secondary-title><style face="normal" font="default" size="100%">Proc. 2009 IEEE Symposium on Artificial Life (ALIFE 2009)</style></secondary-title></titles><dates><year><style  face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style  face="normal" font="default" size="100%">03/2009</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://ieeexplore.ieee.org/document/4937697/</style></url></web-urls></urls><publisher><style face="normal" font="default" size="100%">IEEE Press</style></publisher><pub-location><style face="normal" font="default" size="100%">Nashville, TN</style></pub-location><pages><style face="normal" font="default" size="100%">77–84</style></pages><isbn><style face="normal" font="default" size="100%">978-1-4244-2763-5</style></isbn><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">In this paper we present a socially interactive multi-modal robotic head, ERWIN - Emotional Robot With Intelligent Networks - capable of emotion expression and interaction via speech and vision. The model presented shows how a robot can learn to attend to the voice of a specific speaker, providing a relevant, emotionally expressive response based on previous interactions. 
We show three aspects of the system: first, the learning phase, allowing the robot to learn faces and voices from interaction; second, recognition of the learnt faces and voices; and third, the emotion expression aspect of the system. We show this from the perspective of an adult and child interacting and playing a small game, much like an infant and caregiver situation. We also discuss the importance of speaker recognition in terms of Human-Robot Interaction and emotion, showing how the interaction process between a participant and ERWIN can lead the robot to preferentially attend to that person.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>5</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">Lori Malatesta</style></author><author><style face="normal" font="default" size="100%">John C Murray</style></author><author><style face="normal" font="default" size="100%">Amaryllis Raouzaiou</style></author><author><style face="normal" font="default" size="100%">Antoine Hiolle</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author><author><style face="normal" font="default" size="100%">Kostas Karpouzis</style></author></authors><secondary-authors><author><style face="normal" font="default" size="100%">Mario I. 
Chacon-M.</style></author></secondary-authors></contributors><titles><title><style face="normal" font="default" size="100%">Emotion Modelling and Facial Affect Recognition in Human-Computer and Human-Robot Interaction</style></title><secondary-title><style face="normal" font="default" size="100%">Affective Computing, Emotion Modelling, Synthesis and Recognition</style></secondary-title></titles><dates><year><style  face="normal" font="default" size="100%">2009</style></year></dates><urls><web-urls><url><style face="normal" font="default" size="100%">http://www.intechopen.com/books/state_of_the_art_in_face_recognition/emotion_modelling_and_facial_affect_recognition_in_human-computer_and_human-robot_interaction</style></url></web-urls></urls><publisher><style face="normal" font="default" size="100%">InTechOpen Publishers</style></publisher><isbn><style face="normal" font="default" size="100%">978-3-902613-42-4</style></isbn><language><style face="normal" font="default" size="100%">eng</style></language><section><style face="normal" font="default" size="100%">12</style></section></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">O'Bryne, Claire</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author><author><style face="normal" font="default" size="100%">John C Murray</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">The Importance of the Body in Affect-Modulated Action Selection: A Case Study Comparing Proximal Versus Distal Perception in a Prey-Predator Scenario</style></title><secondary-title><style face="normal" font="default" size="100%">Proc. 3rd Intl. 
Conference on Affective Computing and Intelligent Interaction (ACII 2009)</style></secondary-title></titles><dates><year><style  face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style  face="normal" font="default" size="100%">09/2009</style></date></pub-dates></dates><publisher><style face="normal" font="default" size="100%">IEEE Press</style></publisher><pub-location><style face="normal" font="default" size="100%">Amsterdam, The Netherlands</style></pub-location><pages><style face="normal" font="default" size="100%">1–6</style></pages><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">In the context of the animat approach, we investigate the effect of an emotion-like hormonal mechanism - a modulator of perception and second-order controller over an underlying motivation-based action selection architecture - on brain-body-environment interactions within a prey-predator scenario. We are particularly interested in the effects that affective modulation of different perceptual capabilities has on the dynamics of interactions between predator and prey, as part of a broader study of the adaptive value of emotional states such as &quot;fear&quot; and &quot;aggression&quot; in the context of action selection. In this paper we present experiments where we modulated the architecture of a prey robot using two different types of sensory capabilities, proximal and distal, effectively creating combinations of different prey &quot;brains&quot; and &quot;bodies&quot;.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">John C Murray</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author><author><style face="normal" font="default" size="100%">Kim A. 
Bard</style></author><author><style face="normal" font="default" size="100%">Ross, Marina Davila</style></author><author><style face="normal" font="default" size="100%">Thorsteinsson, Kate</style></author></authors><secondary-authors><author><style face="normal" font="default" size="100%">Kim, Jong-Hwan</style></author><author><style face="normal" font="default" size="100%">Ge, Shuzhi Sam</style></author><author><style face="normal" font="default" size="100%">Vadakkepat, Prahlad</style></author><author><style face="normal" font="default" size="100%">Jesse, Norbert</style></author><author><style face="normal" font="default" size="100%">Al Manum, Abdullah</style></author><author><style face="normal" font="default" size="100%">Puthusserypady K, Sadasivan</style></author><author><style face="normal" font="default" size="100%">Rückert, Ulrich</style></author><author><style face="normal" font="default" size="100%">Sitte, Joaquin</style></author><author><style face="normal" font="default" size="100%">Witkowski, Ulf</style></author><author><style face="normal" font="default" size="100%">Nakatsu, Ryohei</style></author><author><style face="normal" font="default" size="100%">Braunl, Thomas</style></author><author><style face="normal" font="default" size="100%">Baltes, Jacky</style></author><author><style face="normal" font="default" size="100%">Anderson, John</style></author><author><style face="normal" font="default" size="100%">Wong, Ching-Chang</style></author><author><style face="normal" font="default" size="100%">Verner, Igor</style></author><author><style face="normal" font="default" size="100%">Ahlgren, David</style></author></secondary-authors></contributors><titles><title><style face="normal" font="default" size="100%">The Influence of Social Interaction on the Perception of Emotional Expression: A Case Study with a Robot Head</style></title><secondary-title><style face="normal" font="default" size="100%">Advances in Robotics: Proc. 
FIRA RoboWorld Congress 2009</style></secondary-title><tertiary-title><style face="normal" font="default" size="100%">Lecture Notes in Computer Science</style></tertiary-title></titles><dates><year><style  face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style  face="normal" font="default" size="100%">08/2009</style></date></pub-dates></dates><urls><web-urls><url><style face="normal" font="default" size="100%">https://link.springer.com/chapter/10.1007%2F978-3-642-03983-6_10</style></url></web-urls></urls><publisher><style face="normal" font="default" size="100%">Springer Berlin Heidelberg</style></publisher><pub-location><style face="normal" font="default" size="100%">Incheon, Korea</style></pub-location><volume><style face="normal" font="default" size="100%">5744</style></volume><pages><style face="normal" font="default" size="100%">63–72</style></pages><isbn><style face="normal" font="default" size="100%">978-3-642-03983-6</style></isbn><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">In this paper we focus primarily on the influence that socio-emotional interaction has on the perception of emotional expression by a robot. We also investigate and discuss the importance of emotion expression in socially interactive situations involving human-robot interaction (HRI), and show the value of utilising emotion expression when dealing with interactive robots that are to learn and develop in socially situated environments. We discuss early expressional development and the function of emotion in human communication, and how this can improve HRI communication. 
Finally, we provide experimental results showing how emotion-rich interaction via emotion expression can affect the HRI process by providing additional information.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">John C Murray</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author><author><style face="normal" font="default" size="100%">Antoine Hiolle</style></author></authors></contributors><titles><title><style face="normal" font="default" size="100%">Towards a Model of Emotion Expression in an Interactive Robot Head</style></title><secondary-title><style face="normal" font="default" size="100%">Proc. 18th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009)</style></secondary-title></titles><dates><year><style  face="normal" font="default" size="100%">2009</style></year><pub-dates><date><style  face="normal" font="default" size="100%">09/2009</style></date></pub-dates></dates><publisher><style face="normal" font="default" size="100%">IEEE Press</style></publisher><pub-location><style face="normal" font="default" size="100%">Toyama, Japan</style></pub-location><pages><style face="normal" font="default" size="100%">627–632</style></pages><isbn><style face="normal" font="default" size="100%">978-1-4244-5081-7</style></isbn><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">In this paper we present a robotic head designed for interaction with humans, endowed with mechanisms that make the robot respond to social interaction with emotional expressions, allowing the emotional expression of the robot to be directly influenced by the social interaction process. 
We look into how emotionally expressive visual feedback from the robot can enrich the interaction process and provide the participant with additional information regarding the interaction, allowing the user to better understand the intentions of the robot. We discuss some of the interactions that are possible with ERWIN and how this can affect the response of the system. We show experimental scenarios where the interaction process influences the emotional expressions, and how the participants interpret this. We draw our conclusions from the feedback gathered in these experiments, showing that emotional expression can indeed influence the social interaction between a robot and a human.</style></abstract></record><record><source-app name="Biblio" version="7.x">Drupal-Biblio</source-app><ref-type>47</ref-type><contributors><authors><author><style face="normal" font="default" size="100%">John C Murray</style></author><author><style face="normal" font="default" size="100%">Lola Cañamero</style></author></authors><secondary-authors><author><style face="normal" font="default" size="100%">Robert Lowe</style></author><author><style face="normal" font="default" size="100%">Morse, A</style></author><author><style face="normal" font="default" size="100%">Ziemke, T</style></author></secondary-authors></contributors><titles><title><style face="normal" font="default" size="100%">Towards a Hormone-Modulated Model for Emotion Expression in a Socially Interactive Robot Head</style></title><secondary-title><style face="normal" font="default" size="100%">Workshop &quot;The role of Emotion in Adaptive Behavior and Cognitive Robotics&quot; held in conjunction with 10th International Conference on Simulation of Adaptive Behavior (SAB 2008)</style></secondary-title></titles><dates><year><style  face="normal" font="default" size="100%">2008</style></year><pub-dates><date><style  face="normal" font="default" size="100%">07/2008</style></date></pub-dates></dates><urls><web-urls><url><style 
face="normal" font="default" size="100%">http://image.ece.ntua.gr/projects/feelix/system/files/Murray_SAB_final-1.pdf</style></url></web-urls></urls><pub-location><style face="normal" font="default" size="100%">Osaka, Japan</style></pub-location><language><style face="normal" font="default" size="100%">eng</style></language><abstract><style face="normal" font="default" size="100%">In this paper we present a robot head ERWIN capable of human-robot interaction, endowed with interactive mechanisms for allowing the emotional state and expression of the robot to be directly influenced by the social interaction process. Allowing the interaction process to influence the expression of the robot head can in turn influence the way the user interacts with the robot, in addition to allowing the user to better understand the intentions of the robot during this process. We discuss some of the interactions that are possible with ERWIN and how this can affect the response of the system. We show an example scenario where the interaction process makes the robot go through several different emotions.</style></abstract></record></records></xml>