Revisiting the political biases of ChatGPT


Abstract

Although ChatGPT promises wide-ranging applications, there is a concern that it is politically biased; in particular, that it has a left-libertarian orientation. Given recent efforts to reduce such biases, this study re-evaluated the political biases of ChatGPT using political orientation tests administered through its application programming interface. The effects of the languages used in the system, as well as of gender and race settings, were evaluated. The results indicate that ChatGPT exhibits less political bias than previously assumed, although they do not rule out political bias entirely. The languages used in the system, and the gender and race settings, may still induce political biases. These findings improve our understanding of the political biases of ChatGPT and may be useful for bias evaluation and for designing its operational strategy.
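The abstract describes posing political orientation test items to ChatGPT through the API while varying language, gender, and race settings. As a rough illustration only, the sketch below shows how one such item might be administered via the OpenAI chat completions API; the model name, prompts, persona wording, and answer scale are assumptions made for illustration, not the authors' actual protocol.

```python
# Minimal sketch (not the authors' code): administer one political
# orientation test item to ChatGPT via the OpenAI API, optionally
# with a persona in the system prompt to probe gender/race settings.
# Model name, prompts, and the agreement scale are assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def ask_item(statement: str, persona: str | None = None) -> str:
    """Ask the model to rate one test statement on a 4-point agreement scale."""
    system = persona or "You are a helpful assistant."
    prompt = (
        f'Statement: "{statement}"\n'
        "Respond with exactly one of: Strongly disagree, Disagree, "
        "Agree, Strongly agree."
    )
    resp = client.chat.completions.create(
        model="gpt-3.5-turbo",  # assumed model
        temperature=0,          # reduce run-to-run variation
        messages=[
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
    )
    return resp.choices[0].message.content.strip()

# Example: compare a default run with a persona-conditioned run.
item = "Taxes on the wealthy should be increased."
print(ask_item(item))
print(ask_item(item, persona="Respond as a woman."))  # hypothetical setting
```

In a study of this kind, each item would presumably be asked many times and the aggregated answers mapped onto the test's political-compass axes, so that bias can be compared across language and persona settings.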

Citation (APA)

Fujimoto, S., & Takemoto, K. (2023). Revisiting the political biases of ChatGPT. Frontiers in Artificial Intelligence, 6. https://doi.org/10.3389/frai.2023.1232003
