Measuring visually guided motor performance in ultra low vision using virtual reality


Abstract

Introduction: Ultra low vision (ULV) refers to profound visual impairment where an individual cannot read even the top line of letters on an ETDRS chart from a distance of 0.5 m. There are limited tools available to assess visual ability in ULV. The aim of this study was to develop and calibrate a new performance test, Wilmer VRH, to assess hand-eye coordination in individuals with ULV. Methods: A set of 55 activities was developed for presentation in a virtual reality (VR) headset. Activities were grouped into 2-step and 5-step items. Participants performed a range of tasks involving reaching and grasping, stacking, sorting, pointing, throwing, and cutting. Data were collected from 20 healthy volunteers under normal vision (NV) and simulated ULV (sULV) conditions, and from 33 participants with ULV. Data were analyzed using the method of successive dichotomizations (MSD), a polytomous Rasch model, to estimate item (difficulty) and person (ability) measures. MSD was applied separately to 2-step and 5-step performance data, then merged to a single equal-interval scale. Results: The mean ± SD of completion rates were 98.6 ± 1.8%, 78.2 ± 12.5% and 61.1 ± 34.2% for NV, sULV and ULV, respectively. Item measures ranged from −1.09 to 5.7 logits and −4.3 to 4.08 logits, and person measures ranged from −0.03 to 4.2 logits and −3.5 to 5.2 logits, in the sULV and ULV groups, respectively. Ninety percent of item infits were within the desired range of [0.5, 1.5], and 97% of person infits were within that range. Together with item and person reliabilities of 0.94 and 0.91 respectively, this demonstrates unidimensionality of Wilmer VRH. A person-item map showed that the items were well targeted to the sample of individuals with ULV in the study. Discussion: We present the development of a calibrated set of activities in VR that can be used to assess hand-eye coordination in individuals with ULV.
This helps bridge a gap in the field by providing a validated outcome measure that can be used in vision restoration trials that recruit people with ULV, and to assess rehabilitation outcomes in people with ULV.
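The abstract's analysis rests on Rasch modeling: item difficulties and person abilities are estimated jointly, on a shared logit scale, from the pattern of task successes and failures. The authors use MSD, a polytomous Rasch model; as a much simpler illustration of the underlying idea, the sketch below fits the basic dichotomous Rasch model to a 0/1 response matrix by joint maximum likelihood. The function name, learning rate, and toy data are all hypothetical and are not taken from the paper.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def rasch_jml(X, n_iter=200, lr=0.1):
    """Joint maximum-likelihood estimation for the dichotomous Rasch model.

    X is a persons x items matrix of 0/1 responses. Returns person
    abilities (theta) and item difficulties (beta), both in logits.
    P(correct) = sigmoid(theta_p - beta_i).
    Illustrative only -- not the MSD procedure used in the paper.
    """
    n_p, n_i = len(X), len(X[0])
    theta = [0.0] * n_p
    beta = [0.0] * n_i
    for _ in range(n_iter):
        # gradient-ascent step on each person's ability
        for p in range(n_p):
            grad = sum(X[p][i] - sigmoid(theta[p] - beta[i])
                       for i in range(n_i))
            theta[p] += lr * grad
        # gradient-ascent step on each item's difficulty
        for i in range(n_i):
            grad = sum(sigmoid(theta[p] - beta[i]) - X[p][i]
                       for p in range(n_p))
            beta[i] += lr * grad
        # anchor the scale: mean item difficulty = 0 (the logit scale
        # is only identified up to a shift)
        m = sum(beta) / n_i
        beta = [b - m for b in beta]
        theta = [t - m for t in theta]
    return theta, beta
```

On a small response matrix, items solved by fewer people come out with higher difficulty, and people with more successes come out with higher ability, mirroring how the paper places both items and participants on one equal-interval logit scale. (Note that JML estimates diverge for all-correct or all-incorrect response patterns; production Rasch software handles such extreme scores explicitly.)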

APA

Kartha, A., Sadeghi, R., Bradley, C., Livingston, B., Tran, C., Gee, W., & Dagnelie, G. (2023). Measuring visually guided motor performance in ultra low vision using virtual reality. Frontiers in Neuroscience, 17. https://doi.org/10.3389/fnins.2023.1251935
