Selective visual attention during public speaking in an immersive context

Citations: 0 | Mendeley readers: 32

Abstract

It has recently become feasible to study selective visual attention to social cues in increasingly ecologically valid ways. In this secondary analysis, we examined gaze behavior in response to the actions of others in a social context. Participants (N = 84) were asked to give a 5-minute speech to a five-member audience that had been filmed in 360° video, displayed in a virtual reality headset containing a built-in eye tracker. Audience members were coached to make movements that would indicate interest or lack of interest (e.g., nodding vs. looking away). The goal of this paper was to analyze whether these actions influenced the speaker’s gaze. We found that participants showed reliable evidence of gaze towards audience member actions in general, and towards audience member actions involving their phone specifically (compared with other actions like looking away or leaning back). However, there were no differences in gaze towards actions reflecting interest (like nodding) compared with actions reflecting lack of interest (like looking away). Participants were more likely to look away from audience member actions as well, but there were no specific actions that elicited looking away more or less. Taken together, these findings suggest that the actions of audience members are broadly influential in motivating gaze behaviors in a realistic, contextually embedded (public speaking) setting. Further research is needed to examine the ways in which these findings can be elucidated in more controlled laboratory environments as well as in the real world.


Citation (APA)

Rubin, M., Guo, S., Muller, K., Zhang, R., Telch, M. J., & Hayhoe, M. M. (2022). Selective visual attention during public speaking in an immersive context. Attention, Perception, & Psychophysics, 84(2), 396–407. https://doi.org/10.3758/s13414-021-02430-x

Readers' Seniority

PhD / Post grad / Masters / Doc: 4 (50%)
Professor / Associate Prof.: 3 (38%)
Lecturer / Post doc: 1 (13%)

Readers' Discipline

Psychology: 4 (44%)
Computer Science: 2 (22%)
Engineering: 2 (22%)
Sports and Recreations: 1 (11%)
