Alcaraz Carrión, Daniel & Javier Valenzuela
2021.
Distant time, distant gesture: speech and gesture correlate to express temporal distance.
Semiotica 2021:241
► pp. 159 ff.
Alibali, Martha W. & Autumn B. Hostetter
2010.
Mimicry and simulation in gesture comprehension.
Behavioral and Brain Sciences 33:6
► pp. 433 ff.
Arbona, Eléonore, Kilian G. Seeber & Marianne Gullberg
2023.
Semantically related gestures facilitate language comprehension during simultaneous interpreting.
Bilingualism: Language and Cognition 26:2
► pp. 425 ff.
Beattie, Geoffrey, Kate Webster & Jamie Ross
2010.
The Fixation and Processing of the Iconic Gestures That Accompany Talk.
Journal of Language and Social Psychology 29:2
► pp. 194 ff.
Becvar, Amaya, James Hollan & Edwin Hutchins
2008.
Representational Gestures as Cognitive Artifacts for Developing Theories in a Scientific Laboratory. In
Resources, Co-Evolution and Artifacts [
Computer Supported Cooperative Work],
► pp. 117 ff.
Chafai, Nicolas Ech, Catherine Pelachaud, Danielle Pelé & Gaspard Breton
2006.
Gesture Expressivity Modulations in an ECA Application. In
Intelligent Virtual Agents [
Lecture Notes in Computer Science, 4133],
► pp. 181 ff.
Chafai, Nicolas Ech, Catherine Pelachaud & Danielle Pelé
2007.
A case study of gesture expressivity breaks.
Language Resources and Evaluation 41:3-4
► pp. 341 ff.
De Filippo, Carol Lee & Charissa R. Lansing
2006.
Eye Fixations of Deaf and Hearing Observers in Simultaneous Communication Perception.
Ear & Hearing 27:4
► pp. 331 ff.
Debreslioska, Sandra, Joost van de Weijer & Marianne Gullberg
2019.
Addressees Are Sensitive to the Presence of Gesture When Tracking a Single Referent in Discourse.
Frontiers in Psychology 10
Dewhurst, Richard, Marcus Nyström, Halszka Jarodzka, Tom Foulsham, Roger Johansson & Kenneth Holmqvist
2012.
It depends on how you look at it: Scanpath comparison in multiple dimensions with MultiMatch, a vector-based approach.
Behavior Research Methods 44:4
► pp. 1079 ff.
Drijvers, Linda, Ole Jensen & Eelke Spaak
2021.
Rapid invisible frequency tagging reveals nonlinear integration of auditory and visual information.
Human Brain Mapping 42:4
► pp. 1138 ff.
Drijvers, Linda, Julija Vaitonytė & Asli Özyürek
2019.
Degree of Language Experience Modulates Visual Attention to Visible Speech and Iconic Gestures During Clear and Degraded Speech Comprehension.
Cognitive Science 43:10
Ech Chafai, Nicolas, Magalie Ochs, Christopher Peters, Maurizio Mancini, Elisabetta Bevacqua & Catherine Pelachaud
2007.
Proceedings of the 19th Conference on l'Interaction Homme-Machine,
► pp. 207 ff.
Eggenberger, Noëmi, Basil C. Preisig, Rahel Schumacher, Simone Hopfner, Tim Vanbellingen, Thomas Nyffeler, Klemens Gutbrod, Jean-Marie Annoni, Stephan Bohlhalter, Dario Cazzoli, René M. Müri & Antoni Rodriguez-Fornells
2016.
Comprehension of Co-Speech Gestures in Aphasic Patients: An Eye Movement Study.
PLOS ONE 11:1
► pp. e0146583 ff.
Fedorova, O.V. & I.Y. Zherdev
2019.
Follow the hands of the interlocutor! (on strategies for the distribution of visual attention).
Experimental Psychology (Russia) 12:1
► pp. 98 ff.
Fussell, Susan R., Leslie D. Setlock, Jie Yang, Jiazhi Ou, Elizabeth Mauer & Adam D. I. Kramer
2004.
Gestures Over Video Streams to Support Remote Collaboration on Physical Tasks.
Human–Computer Interaction 19:3
► pp. 273 ff.
Gullberg, Marianne
2003.
Eye Movements and Gestures in Human Face-to-face Interaction. In
The Mind's Eye,
► pp. 685 ff.
Gullberg, Marianne & Kenneth Holmqvist
2002.
Visual Attention towards Gestures in Face-to-Face Interaction vs. on Screen. In
Gesture and Sign Language in Human-Computer Interaction [
Lecture Notes in Computer Science, 2298],
► pp. 206 ff.
Gullberg, Marianne & Sotaro Kita
2009.
Attention to Speech-Accompanying Gestures: Eye Movements and Information Uptake.
Journal of Nonverbal Behavior 33:4
► pp. 251 ff.
Gurney, Daniel J., Louise R. Ellis & Emily Vardon-Hynard
2016.
The saliency of gestural misinformation in the perception of a violent crime.
Psychology, Crime & Law 22:7
► pp. 651 ff.
Gurney, Daniel J., Karen J. Pine & Richard Wiseman
2013.
The Gestural Misinformation Effect: Skewing Eyewitness Testimony Through Gesture.
The American Journal of Psychology 126:3
► pp. 301 ff.
Hessels, Roy S.
2020.
How does gaze to faces support face-to-face interaction? A review and perspective.
Psychonomic Bulletin & Review 27:5
► pp. 856 ff.
Hewig, Johannes, Ralf H. Trippe, Holger Hecht, Thomas Straube & Wolfgang H. R. Miltner
2008.
Gender Differences for Specific Body Regions When Looking at Men and Women.
Journal of Nonverbal Behavior 32:2
► pp. 67 ff.
Heyd-Metzuyanim, Einat, Eeva S. H. Haataja, Markku S. Hannula & Enrique Garcia Moreno-Esteva
2023.
What can eye-tracking, combined with discourse analysis, teach us about the ineffectiveness of a group of students solving a geometric problem?
Instructional Science 51:3
► pp. 363 ff.
Holler, Judith
2022.
Visual bodily signals as core devices for coordinating minds in interaction.
Philosophical Transactions of the Royal Society B: Biological Sciences 377:1859
Holler, Judith & Katie Wilkin
2011.
An experimental investigation of how addressee feedback affects co-speech gestures accompanying speakers’ responses.
Journal of Pragmatics 43:14
► pp. 3522 ff.
Jokinen, Kristiina, Hirohisa Furukawa, Masafumi Nishida & Seiichi Yamamoto
2013.
Gaze and turn-taking behavior in casual conversational interactions.
ACM Transactions on Interactive Intelligent Systems 3:2
► pp. 1 ff.
Kamiya, Nobuhiro
2018.
The effect of learner age on the interpretation of the nonverbal behaviors of teachers and other students in identifying questions in the L2 classroom.
Language Teaching Research 22:1
► pp. 47 ff.
Kamiya, Nobuhiro
2019.
What Factors Affect Learners’ Ability to Interpret Nonverbal Behaviors in EFL Classrooms?
Journal of Nonverbal Behavior 43:3
► pp. 283 ff.
Kandana Arachchige, Kendra Gimhani, Wivine Blekic, Isabelle Simoes Loureiro & Laurent Lefebvre
2021.
Covert Attention to Gestures Is Sufficient for Information Uptake.
Frontiers in Psychology 12
Kovářová, Dominika
2015.
Kinezika ve výuce cizím jazykům: přehledová studie [Kinesics in foreign language teaching: A review study].
Pedagogická orientace 25:3
► pp. 413 ff.
Kuhlen, Anna K., Alexia Galati & Susan E. Brennan
2012.
Gesturing integrates top-down and bottom-up information: Joint effects of speakers' expectations and addressees' feedback.
Language and Cognition 4:1
► pp. 17 ff.
Liu, Tingting & Vahid Aryadoust
2024.
Does modality matter? A meta-analysis of the effect of video input in L2 listening assessment.
System 120
► pp. 103191 ff.
Mastrantuono, Eliana, Michele Burigo, Isabel R. Rodríguez-Ortiz & David Saldaña
2019.
The Role of Multiple Articulatory Channels of Sign-Supported Speech Revealed by Visual Processing.
Journal of Speech, Language, and Hearing Research 62:6
► pp. 1625 ff.
Mastrantuono, Eliana, David Saldaña & Isabel R. Rodríguez-Ortiz
2017.
An Eye Tracking Study on the Perception and Comprehension of Unimodal and Bimodal Linguistic Inputs by Deaf Adolescents.
Frontiers in Psychology 8
McDonough, Kim, Dustin Crowther, Paula Kielstra & Pavel Trofimovich
2015.
Exploring the potential relationship between eye gaze and English L2 speakers’ responses to recasts.
Second Language Research 31:4
► pp. 563 ff.
McDonough, Kim, Pavel Trofimovich, Libing Lu & Dato Abashidze
2019.
The occurrence and perception of listener visual cues during nonunderstanding episodes.
Studies in Second Language Acquisition 41:5
► pp. 1151 ff.
Niu, Jin, Chih-Fu Wu, Xiao Dou & Kai-Chieh Lin
2022.
Designing Gestures of Robots in Specific Fields for Different Perceived Personality Traits.
Frontiers in Psychology 13
Olszanowski, Michal & Monika Wróbel
2024.
Why We Mimic Emotions Even When No One is Watching: Limited Visual Contact and Emotional Mimicry.
Emotion Review 16:1
► pp. 16 ff.
Preisig, Basil C., Noëmi Eggenberger, Dario Cazzoli, Thomas Nyffeler, Klemens Gutbrod, Jean-Marie Annoni, Jurka R. Meichtry, Tobias Nef & René M. Müri
2018.
Multimodal Communication in Aphasia: Perception and Production of Co-speech Gestures During Face-to-Face Conversation.
Frontiers in Human Neuroscience 12
Preisig, Basil C., Noëmi Eggenberger, Giuseppe Zito, Tim Vanbellingen, Rahel Schumacher, Simone Hopfner, Thomas Nyffeler, Klemens Gutbrod, Jean-Marie Annoni, Stephan Bohlhalter & René M. Müri
2015.
Perception of co-speech gestures in aphasic patients: A visual exploration study during the observation of dyadic conversations.
Cortex 64
► pp. 157 ff.
Rasmussen, Gitte & Elisabeth Dalby Kristiansen
2022.
The sociality of minimizing involvement in self-service shops in Denmark: Customers’ multi-modal practices of being, getting, and staying out of the way.
Discourse & Communication 16:2
► pp. 200 ff.
Rowbotham, Samantha, Donna M. Lloyd, Judith Holler & Alison Wearden
2015.
Externalizing the Private Experience of Pain: A Role for Co-Speech Gestures in Pain Communication?
Health Communication 30:1
► pp. 70 ff.
Salem, Maha, Katharina Rohlfing, Stefan Kopp & Frank Joublin
2011.
2011 RO-MAN,
► pp. 247 ff.
Salminen-Saari, Jessica F. A., Enrique Garcia Moreno-Esteva, Eeva Haataja, Miika Toivanen, Markku S. Hannula & Anu Laine
2021.
Phases of collaborative mathematical problem solving and joint attention: a case study utilizing mobile gaze tracking.
ZDM – Mathematics Education 53:4
► pp. 771 ff.
Saunders, Emily & David Quinto-Pozos
2023.
Comprehension benefits of visual-gestural iconicity and spatial referencing.
Second Language Research 39:2
► pp. 363 ff.
Schreiter, Tim, Lucas Morillo-Mendez, Ravi T. Chadalavada, Andrey Rudenko, Erik Billing, Martin Magnusson, Kai O. Arras & Achim J. Lilienthal
2023.
2023 32nd IEEE International Conference on Robot and Human Interactive Communication (RO-MAN),
► pp. 293 ff.
Tellier, Marion, Gale Stam & Alain Ghio
Thompson, Robin L., Karen Emmorey & Robert Kluender
2009.
Learning to look: The acquisition of eye gaze agreement during the production of ASL verbs.
Bilingualism: Language and Cognition 12:4
► pp. 393 ff.
Treffner, Paul, Mira Peter & Mark Kleidon
2008.
Gestures and Phases: The Dynamics of Speech-Hand Communication.
Ecological Psychology 20:1
► pp. 32 ff.
Tsui, Katherine M. & Holly A. Yanco
2013.
Design Challenges and Guidelines for Social Interaction Using Mobile Telepresence Robots.
Reviews of Human Factors and Ergonomics 9:1
► pp. 227 ff.
Valtakari, Niilo V., Ignace T. C. Hooge, Charlotte Viktorsson, Pär Nyström, Terje Falck-Ytter & Roy S. Hessels
2021.
Eye tracking in human interaction: Possibilities and limitations.
Behavior Research Methods 53:4
► pp. 1592 ff.
Vanbellingen, Tim, Rahel Schumacher, Noëmi Eggenberger, Simone Hopfner, Dario Cazzoli, Basil C. Preisig, Manuel Bertschi, Thomas Nyffeler, Klemens Gutbrod, Claudio L. Bassetti, Stephan Bohlhalter & René M. Müri
2015.
Different visual exploration of tool-related gestures in left hemisphere brain damaged patients is associated with poor gestural imitation.
Neuropsychologia 71
► pp. 158 ff.
Varnosfadrani, Azizollah Dabaghi & Mahbube Tavakol
Wessler, Janet & Jochim Hansen
2017.
Temporal Closeness Promotes Imitation of Meaningful Gestures in Face-to-Face Communication.
Journal of Nonverbal Behavior 41:4
► pp. 415 ff.
Wisiecka, Katarzyna, Yuumi Konishi, Krzysztof Krejtz, Mahshid Zolfaghari, Birgit Kopainsky, Izabela Krejtz, Hideki Koike & Morten Fjeld
2023.
Supporting Complex Decision-Making: Evidence from an Eye Tracking Study on In-Person and Remote Collaboration.
ACM Transactions on Computer-Human Interaction 30:5
► pp. 1 ff.
Yücel, Zeynep, Francesco Zanlungo & Masahiro Shiomi
2017.
Walk the Talk: Gestures in Mobile Interaction. In
Social Robotics [
Lecture Notes in Computer Science, 10652],
► pp. 220 ff.
Zülch, Gert & Sascha Stowasser
2003.
Eye Tracking for Evaluating Industrial Human-Computer Interfaces. In
The Mind's Eye,
► pp. 531 ff.
Özer, Demet & Tilbe Göksun
2020.
Gesture Use and Processing: A Review on Individual Differences in Cognitive Resources.
Frontiers in Psychology 11
This list is based on CrossRef data as of 12 April 2024. Please note that it may not be complete. Sources presented here have been supplied by the respective publishers.
Any errors therein should be reported to them.