Neurophysiology of Visual-Motor Learning during a Simulated Marksmanship Task in Immersive Virtual Reality

Abstract

Immersive virtual reality (VR) systems offer flexible control of an interactive environment, along with precise position and orientation tracking of realistic movements. Immersive VR can also be used in conjunction with neurophysiological monitoring techniques, such as electroencephalography (EEG), to record neural activity as users perform complex tasks. As such, the fusion of VR, kinematic tracking, and EEG offers a powerful testbed for naturalistic neuroscience research. In this study, we combine these elements to investigate the cognitive and neural mechanisms that underlie motor skill learning during a multi-day simulated marksmanship training regimen conducted with 20 participants. On each of 3 days, participants performed 8 blocks of 60 trials in which a simulated clay pigeon was launched from behind a trap house. Participants attempted to shoot the moving target with a firearm game controller, receiving immediate positional feedback and running scores after each shot. Over the 3 days of practice, shot accuracy and precision improved significantly, and reaction times decreased significantly. Furthermore, results demonstrate that more negative EEG amplitudes over the visual cortices correlate with better shooting performance, as measured by accuracy, reaction times, and response times, indicating that early visual system plasticity underlies behavioral learning in this task. These findings point toward a naturalistic neuroscience approach that can be used to identify neural markers of marksmanship performance.

DOI
10.1109/VR.2018.8446068
Year
2018