%0 Journal Article
%J Frontiers in Psychology
%D 2022
%T Toward Affective Interactions: E-Motions and Embodied Artificial Cognitive Systems
%A Scarinzi, A.
%A Cañamero, L.
%I Frontiers in Psychology
%V 13
%P 1–2
%8 04/2022
%G eng
%U https://doi.org/10.3389/fpsyg.2022.768416
%N article 768416
%9 Opinion article
%R 10.3389/fpsyg.2022.768416

%0 Conference Paper
%B TSAR 2021: RO-MAN 2021 Workshop on Robot Behavior Adaptation to Human Social Norms
%D 2021
%T Towards an Affective Model of Norm Emergence and Adaptation
%A Stavros Anagnou
%A Lola Cañamero
%8 08/2021
%G eng
%U https://tsar2021.ai.vub.ac.be/uploads/papers/TSAR_2021_paper_10.pdf

%0 Journal Article
%J Journal of Human-Robot Interaction
%D 2016
%T Towards Long-Term Social Child-Robot Interaction: Using Multi-Activity Switching to Engage Young Users
%A Coninx, Alexandre
%A Paul E. Baxter
%A Oleari, Elettra
%A Bellini, Sara
%A Bierman, Bert
%A Henkemans, Olivier Blanson
%A Lola Cañamero
%A Cosi, Piero
%A Valentin Enescu
%A Espinoza, Raquel Ros
%A Antoine Hiolle
%A Remi Humbert
%A Kiefer, Bernd
%A Kruijff-Korbayová, Ivana
%A Looije, Rosmarijn
%A Mosconi, Marco
%A Mark A. Neerincx
%A Giulio Paci
%A Patsis, Georgios
%A Pozzi, Clara
%A Sacchitelli, Francesca
%A Hichem Sahli
%A Alberto Sanna
%A Sommavilla, Giacomo
%A Tesser, Fabio
%A Yiannis Demiris
%A Tony Belpaeme
%X Social robots have the potential to provide support in a number of practical domains, such as learning and behaviour change. This potential is particularly relevant for children, who have proven receptive to interactions with social robots. To reach learning and therapeutic goals, a number of issues need to be investigated, notably the design of an effective child-robot interaction (cHRI) that ensures the child remains engaged in the relationship and that educational goals are met. Typically, current cHRI research experiments focus on a single type of interaction activity (e.g. a game). However, these can suffer from a lack of adaptation to the child, or from the increasingly repetitive nature of the activity and interaction. In this paper, we motivate and propose a practicable solution to this issue: an adaptive robot able to switch between multiple activities within single interactions. We describe a system that embodies this idea, and present a case study in which diabetic children collaboratively learn with the robot about various aspects of managing their condition. We demonstrate the ability of our system to induce a varied interaction and show the potential of this approach both as an educational tool and as a research method for long-term cHRI.
%V 5
%P 32–67
%G eng
%U https://dl.acm.org/doi/abs/10.5898/JHRI.5.1.Coninx
%N 1
%R 10.5898/JHRI.5.1.Coninx

%0 Conference Paper
%B Proc. 19th Annual IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2010)
%D 2010
%T Towards an Affect Space for Robots to Display Emotional Body Language
%A Aryel Beck
%A Lola Cañamero
%A Kim A. Bard
%X In order for robots to be socially accepted and generate empathy, it is necessary that they display rich emotions. For robots such as Nao, body language is the best available medium, given their inability to convey facial expressions. Displaying emotional body language that can be interpreted whilst interacting with the robot should significantly improve its sociability. This research investigates the creation of an Affect Space for the generation of emotional body language to be displayed by robots. To create an Affect Space for body language, one has to establish the contribution of the different joint positions to the emotional expression. The experiment reported in this paper investigated the effect of varying a robot's head position on the interpretation, Valence, Arousal and Stance of emotional key poses. It was found that participants performed better than chance level in interpreting the key poses. This finding confirms that body language is an appropriate medium for robots to express emotions. Moreover, the results of this study support the conclusion that Head Position is an important body posture variable. Head Position up increased correct identification for some emotion displays (pride, happiness, and excitement), whereas Head Position down increased correct identification for other displays (anger, sadness). Fear, however, was identified well regardless of Head Position. Head up was always evaluated as more highly Aroused than Head straight or down. Evaluations of Valence (degree of negativity to positivity) and Stance (degree to which the robot ranged from aversive to approaching), however, depended on both Head Position and the emotion displayed. The effects of varying this single body posture variable were complex.
%I IEEE
%C Viareggio, Italy
%P 464–469
%@ 978-1-4244-7991-7
%G eng
%R 10.1109/ROMAN.2010.5598649

%0 Conference Paper
%B Proc. 18th IEEE International Symposium on Robot and Human Interactive Communication (IEEE RO-MAN 2009)
%D 2009
%T Towards a Model of Emotion Expression in an Interactive Robot Head
%A John C. Murray
%A Lola Cañamero
%A Antoine Hiolle
%X In this paper we present a robotic head designed for interaction with humans, endowed with mechanisms that make the robot respond to social interaction with emotional expressions, allowing the emotional expression of the robot to be directly influenced by the social interaction process. We look into how emotionally expressive visual feedback from the robot can enrich the interaction process and provide the participant with additional information regarding the interaction, allowing the user to better understand the intentions of the robot. We discuss some of the interactions that are possible with ERWIN and how these can affect the response of the system. We show experimental scenarios where the interaction process influences the emotional expressions and how the participants interpret this. We draw our conclusions from the feedback gathered in these experiments, showing that emotional expression can indeed influence the social interaction between a robot and a human.
%I IEEE Press
%C Toyama, Japan
%P 627–632
%8 09/2009
%@ 978-1-4244-5081-7
%G eng
%R 10.1109/ROMAN.2009.5326131

%0 Conference Paper
%B Workshop "The role of Emotion in Adaptive Behavior and Cognitive Robotics", held in conjunction with the 10th International Conference on Simulation of Adaptive Behavior (SAB 2008)
%D 2008
%T Towards a Hormone-Modulated Model for Emotion Expression in a Socially Interactive Robot Head
%A John C. Murray
%A Lola Cañamero
%E Robert Lowe
%E Morse, A.
%E Ziemke, T.
%X In this paper we present a robot head, ERWIN, capable of human-robot interaction, endowed with interactive mechanisms that allow the emotional state and expression of the robot to be directly influenced by the social interaction process. Allowing the interaction process to influence the expression of the robot head can in turn influence the way the user interacts with the robot, in addition to allowing the user to better understand the intentions of the robot during this process. We discuss some of the interactions that are possible with ERWIN and how these can affect the response of the system. We show an example scenario where the interaction process makes the robot go through several different emotions.
%C Osaka, Japan
%8 07/2008
%G eng
%U http://image.ece.ntua.gr/projects/feelix/system/files/Murray_SAB_final-1.pdf