For a Greater Good: Bias Analysis in Writing Assessment

Citations: 11
Mendeley readers: 57

This article is free to access.

Abstract

Threats to construct validity should be reduced to a minimum. To that end, potential sources of bias, namely raters, items, and tests, as well as gender, age, race, language background, culture, and socioeconomic status, need to be identified and removed. This study investigates raters' experience, raters' language background, and the choice of essay prompt as potential sources of bias. Eight raters, four native English speakers and four Persian L1 speakers of English as a Foreign Language (EFL), scored 40 essays on one general and one field-specific topic. The raters assessed these essays using the Test of English as a Foreign Language (TOEFL) holistic scale and the International English Language Testing System (IELTS) analytic band scores. Multifaceted Rasch Measurement (MFRM) was run to detect bias. Although no statistically significant biases were found, several interesting results emerged, illustrating the influence of construct-irrelevant factors such as raters' experience, L1, and educational background. Further research is warranted to investigate these factors as potential sources of rater bias.




Citation (APA)

Ahmadi Shirazi, M. (2019). For a Greater Good: Bias Analysis in Writing Assessment. SAGE Open, 9(1). https://doi.org/10.1177/2158244018822377

Readers' Seniority

- PhD / Post grad / Masters / Doc: 19 (63%)
- Lecturer / Post doc: 5 (17%)
- Professor / Associate Prof.: 3 (10%)
- Researcher: 3 (10%)

Readers' Discipline

- Linguistics: 10 (42%)
- Social Sciences: 8 (33%)
- Arts and Humanities: 3 (13%)
- Psychology: 3 (13%)

Article Metrics

News Mentions: 2
