Article published in:
Social Cues in Robot Interaction, Trust and Acceptance
Edited by Alessandra Rossi, Kheng Lee Koay, Silvia Moros, Patrick Holthaus and Marcus Scheunemann
[Interaction Studies 20:3] 2019
pp. 487–508
References
App, B., McIntosh, D. N., Reed, C. L., and Hertenstein, M. J.
(2011) Nonverbal channel use in communication of emotion: How may depend on why. Emotion, 11(3):603–617.
App, B., Reed, C. L., and McIntosh, D. N.
(2012) Relative contributions of face and body configurations: Perceiving emotional state and motion intention. Cognition and Emotion, 26(4):690–698.
Bartneck, C., Reichenbach, J., and Van Breemen, A.
(2004) In your face, robot! The influence of a character's embodiment on how users perceive its emotional expressions. In Design and Emotion.
Beck, A., Cañamero, L., Hiolle, A., Damiano, L., Cosi, P., Tesser, F., and Sommavilla, G.
(2013) Interpretation of emotional body language displayed by a humanoid robot: A case study with children. International Journal of Social Robotics, 5(3):325–334.
Beck, A., Hiolle, A., Mazel, A., and Cañamero, L.
(2010) Interpretation of emotional body language displayed by robots. In Proceedings of the 3rd International Workshop on Affective Interaction in Natural Environments, AFFINE ’10, pages 37–42, New York, NY, USA. ACM.
Biele, C. and Grabowska, A.
(2006) Sex differences in perception of emotion intensity in dynamic and static facial expressions. Experimental Brain Research, 171(1):1–6.
Breazeal, C.
(2001) Emotive qualities in robot speech. In Proceedings 2001 IEEE/RSJ International Conference on Intelligent Robots and Systems, volume 3, pages 1388–1394.
Breazeal, C. and Scassellati, B.
(1999) How to build robots that make friends and influence people. In Proceedings 1999 IEEE/RSJ International Conference on Intelligent Robots and Systems. Human and Environment Friendly Robots with High Intelligence and Emotional Quotients (Cat. No.99CH36289), volume 2, pages 858–863.
Burattini, E. and Rossi, S.
(2010) Periodic activations of behaviours and emotional adaptation in behaviour-based robotics. Connection Science, 22(3):197–213.
Calder, A. J., Keane, J., Manly, T., Sprengelmeyer, R., Scott, S., Nimmo-Smith, I., and Young, A. W.
(2003) Facial expression recognition across the adult life span. Neuropsychologia, 41(2):195–202.
Calvo, M. G. and Nummenmaa, L.
(2016) Perceptual and affective mechanisms in facial expression recognition: An integrative review. Cognition and Emotion, 30(6):1081–1106.
Conti, D., Cirasa, C., Di Nuovo, S., et al.
(2019) Robot, tell me a tale!: A social robot as tool for teachers in kindergarten. Interaction Studies, 20(2):1–16.
Ekman, P.
(1992) An argument for basic emotions. Cognition & Emotion, 6(3–4):169–200.
Häring, M., Bee, N., and André, E.
(2011) Creation and evaluation of emotion expression with body movement, sound and eye color for humanoid robots. In RO-MAN, pages 204–209.
Jack, R. E., Garrod, O. G., and Schyns, P. G.
(2014) Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Current Biology, 24(2):187–192.
Kleinsmith, A. and Bianchi-Berthouze, N.
(2013) Affective body expression perception and recognition: A survey. IEEE Transactions on Affective Computing, 4(1):15–33.
Leite, I.
(2015) Long-term interactions with empathic social robots. AI Matters, 1(3):13–15.
Li, X., MacDonald, B., and Watson, C. I.
(2009) Expressive facial speech synthesis on a robotic platform. In Proceedings of the 2009 IEEE/RSJ International Conference on Intelligent Robots and Systems, IROS’09, pages 5009–5014. IEEE Press.
Lim, A., Ogata, T., and Okuno, H. G.
(2012) The desire model: Cross-modal emotion analysis and expression for robots.
Marmpena, M., Lim, A., and Dahl, T. S.
(2017) How does the robot feel? Annotation of emotional expressions generated by a humanoid robot with affective quantifiers. In Proceedings of the 2017 Workshop on Behavior Adaptation, Interaction and Learning for Assistive Robotics (BAILAR – IEEE RO-MAN 2017).
(2018) How does the robot feel? Perception of valence and arousal in emotional body language. Paladyn, Journal of Behavioral Robotics, 9(1):168–182.
McColl, D. and Nejat, G.
(2014) Recognizing emotional body language displayed by a humanlike social robot. International Journal of Social Robotics, 6(2):261–280.
Moltchanova, E. and Bartneck, C.
Mutlu, B., Yamaoka, F., Kanda, T., Ishiguro, H., and Hagita, N.
(2009) Nonverbal leakage in robots: Communication of intentions through seemingly unintentional behavior. In 4th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pages 69–76.
Nijdam, N. A.
(2009) Mapping emotion to color, pages 2–9.
Ortony, A., Clore, G. L., and Collins, A.
(1990) The cognitive structure of emotions. Cambridge University Press.
Pereira, A., Leite, I., Mascarenhas, S., Martinho, C., and Paiva, A.
(2011) Using empathy to improve human-robot relationships. In Human-Robot Personal Relationships, pages 130–138, Berlin, Heidelberg. Springer Berlin Heidelberg.
Rosenthal-von der Pütten, A. M., Krämer, N. C., and Herrmann, J.
(2018) The effects of humanlike and robot-specific affective nonverbal behavior on perception, emotion, and behavior. International Journal of Social Robotics, 10(5):569–582.
Rossi, S., Ferland, F., and Tapus, A.
(2017) User profiling and behavioral adaptation for HRI: A survey. Pattern Recognition Letters, 99(Supplement C):3–12.
Rossi, S., Staffa, M., and Tamburro, A.
(2018) Socially assistive robot for providing recommendations: Comparing a humanoid robot with a mobile application. International Journal of Social Robotics, 10(2):265–278.
Russell, J. A.
(1980) A circumplex model of affect. Journal of Personality and Social Psychology, 39(6):1161–1178.
Salem, M., Eyssel, F., Rohlfing, K., Kopp, S., and Joublin, F.
(2011) Effects of gesture on the perception of psychological anthropomorphism: A case study with a humanoid robot. In Social Robotics, pages 31–41, Berlin, Heidelberg. Springer Berlin Heidelberg.
Scherer, K. R., Schorr, A., and Johnstone, T.
(2001) Appraisal processes in emotion: Theory, methods, research. Oxford University Press.
Schlosberg, H.
(1954) Three dimensions of emotion. Psychological Review, 61(2):81.
Song, S. and Yamada, S.
(2017a) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, HRI ’17, pages 2–11, New York, NY, USA. ACM.
(2017b) Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. In Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction, pages 2–11. ACM.
Tonks, J., Williams, W. H., Frampton, I., Yates, P., and Slater, A.
(2007) Assessing emotion recognition in 9–15-year-olds: Preliminary analysis of abilities in reading emotion from faces, voices and eyes. Brain Injury, 21(6):623–629.
Tsiourti, C., Weiss, A., Wac, K., and Vincze, M.
(2017) Designing emotionally expressive robots: A comparative study on the perception of communication modalities. In Proceedings of the 5th International Conference on Human Agent Interaction, HAI ’17, pages 213–222, New York, NY, USA. ACM.
Valdez, P. and Mehrabian, A.
(1994) Effects of color on emotions. Journal of Experimental Psychology: General, 123(4):394.
Wilhelm, O., Hildebrandt, A., Manske, K., Schacht, A., and Sommer, W.
(2014) Test battery for measuring the perception and recognition of facial expressions of emotion. Frontiers in Psychology, 5:404.
Wundt, W. M.
(1907) Outlines of psychology. W. Engelmann.
Xu, J., Broekens, J., Hindriks, K., and Neerincx, M. A.
(2014) Robot mood is contagious: Effects of robot body language in the imitation game. In Proceedings of the 2014 International Conference on Autonomous Agents and Multi-agent Systems, AAMAS ’14, pages 973–980, Richland, SC. International Foundation for Autonomous Agents and Multiagent Systems.
Cited by 12 other publications

Bi, Wei, Yongzhen Xie, Zheng Dong & Hongshen Li
2022. Enterprise Strategic Management From the Perspective of Business Ecosystem Construction Based on Multimodal Emotion Recognition. Frontiers in Psychology 13
Fiorini, Laura, Grazia D'Onofrio, Alessandra Sorrentino, Federica Gabriella Cornacchia Loizzo, Sergio Russo, Filomena Ciccone, Francesco Giuliani, Daniele Sancarlo & Filippo Cavallo
2024. The Role of Coherent Robot Behavior and Embodiment in Emotion Perception and Recognition During Human-Robot Interaction: Experimental Study. JMIR Human Factors 11  pp. e45494 ff.
Giang, Christian, Loredana Addimando, Luca Botturi, Lucio Negrini, Alessandro Giusti & Alberto Piatti
2023. Have You Ever Seen a Robot? An Analysis of Children’s Drawings Between Technology and Science Fiction. Journal for STEM Education Research 6:2  pp. 232 ff.
Lambiase, Paolo Domenico, Alessandra Rossi & Silvia Rossi
2023. A Two-Tier GAN Architecture for Conditioned Expressions Synthesis on Categorical Emotions. International Journal of Social Robotics
Liu, Dong, Zhiyong Wang, Lifeng Wang & Longxi Chen
2021. Multi-Modal Fusion Emotion Recognition Method of Speech Expression Based on Deep Learning. Frontiers in Neurorobotics 15
Rossi, Alessandra, Marcus M. Scheunemann, Gianluca L’Arco & Silvia Rossi
2021. Evaluation of a Humanoid Robot’s Emotional Gestures for Transparent Interaction. In Social Robotics [Lecture Notes in Computer Science, 13086],  pp. 397 ff.
Rossi, Silvia, Teresa Cimmino, Marco Matarese & Mario Raiano
2019. 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN),  pp. 1 ff.
Rossi, Silvia, Elena Dell’Aquila & Benedetta Bucci
2019. Evaluating the Emotional Valence of Affective Sounds for Child-Robot Interaction. In Social Robotics [Lecture Notes in Computer Science, 11876],  pp. 505 ff.
Rossi, Silvia, Marwa Larafa & Martina Ruocco
2020. Emotional and Behavioural Distraction by a Social Robot for Children Anxiety Reduction During Vaccination. International Journal of Social Robotics 12:3  pp. 765 ff.
Spezialetti, Matteo, Giuseppe Placidi & Silvia Rossi
2020. Emotion Recognition for Human-Robot Interaction: Recent Advances and Future Perspectives. Frontiers in Robotics and AI 7
Vigni, Francesco, Alessandra Rossi, Linda Miccio & Silvia Rossi
2022. On the Emotional Transparency of a Non-humanoid Social Robot. In Social Robotics [Lecture Notes in Computer Science, 13817],  pp. 290 ff.
Xue, Junting, Yanqun Huang, Xu Li, Jutao Li, Peng Zhang & Zhiyu Kang
2022. Emotional Influence of Pupillary Changes of Robots with Different Human-Likeness Levels on Human. International Journal of Social Robotics 14:7  pp. 1687 ff.

This list is based on CrossRef data as of 1 May 2024. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers. Any errors therein should be reported to them.