Predictive steering: Integration of artificial motor signals in self-motion estimation

The brain's computations for active and passive self-motion estimation can be unified by a single model that optimally combines vestibular and visual signals with sensory predictions based on efference copies. It is unknown whether this theoretical framework also applies to the integration of artificial motor signals, like those that occur when driving a car, or whether self-motion estimation in this situation relies solely on feedback control. Here, we examined whether training humans to control a self-motion platform leads to the construction of an accurate internal model of the mapping between the steering movement and the vestibular reafference. Participants (n=15) sat on a linear motion platform and actively controlled the platform's velocity using a steering wheel to translate their body to a memorized visual target (Motion condition). We compared their steering behavior to that of participants (n=15) who remained stationary and instead aligned a non-visible line with the target (Stationary condition). To probe learning, the gain between the steering wheel angle and the platform or line velocity changed abruptly twice during the experiment. These gain changes were virtually undetectable in the displacement error in the Motion condition, whereas clear deviations were observed in the Stationary condition, showing that participants in the Motion condition made within-trial adjustments to their steering behavior. We conclude that vestibular feedback supports not only the online control of steering, but also rapid adaptation to gain changes that updates the brain's internal model of the mapping between the steering movement and the vestibular reafference.
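To make the internal-model account concrete, consider a minimal formalization (our notation, not the authors'; the symbols $g$, $\theta$, $v$, $\hat{v}$, $\hat{g}$, and $\eta$ are illustrative assumptions). The platform velocity and its predicted vestibular reafference can be written as

$$v(t) = g\,\theta(t), \qquad \hat{v}(t) = \hat{g}\,\theta(t),$$

where $\theta(t)$ is the steering wheel angle, $g$ is the experimentally imposed gain, and $\hat{g}$ is the brain's estimate of that gain used to predict the reafference. Under this sketch, the rapid adaptation observed after an abrupt gain change could follow an error-driven update such as

$$\hat{g} \leftarrow \hat{g} + \eta\,\bigl(v(t) - \hat{v}(t)\bigr)\,\theta(t),$$

with learning rate $\eta$, so that the mismatch between sensed and predicted self-motion drives $\hat{g}$ toward the new $g$ within a trial. This is a sketch under the stated assumptions, not the authors' model.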
