In bot we trust? Personality traits and reciprocity in human-bot trust games

  • Upadhyaya, N.
  • Galizzi, M. M.
Abstract

People are increasingly interacting with forms of artificial intelligence (AI). It is crucial to understand whether accepted evidence for human-human reciprocity holds true for human-bot interactions. In a pre-registered online experiment (N = 539) we first replicate recent studies, finding that the identity of a player's counterpart in a one-shot binary Trust Game has a significant effect on the rate of reciprocity, with bot counterparts receiving lower returned amounts than human counterparts. We then explore whether individual differences in a player's personality traits, in particular Agreeableness, Extraversion, Honesty-Humility, and Openness, moderate the effect of the identity of the player's counterpart on the rate of reciprocity. In line with the literature on human-human interactions, participants exhibiting higher levels of Honesty-Humility, and to a lesser extent Agreeableness, are found to reciprocate more, regardless of the identity of their counterpart. No personality trait, however, moderates the effect of interacting with a bot. Finally, we consider whether general attitudes to AI affect reciprocity, but find no significant relationship.

Citation (APA)
Upadhyaya, N., & Galizzi, M. M. (2023). In bot we trust? Personality traits and reciprocity in human-bot trust games. Frontiers in Behavioral Economics, 2. https://doi.org/10.3389/frbhe.2023.1164259
