The brain makes use of noisy sensory inputs to produce eye, head, or arm movements. In most instances, it combines this sensory information with predictions about future events. Here, we propose that Kalman filtering can account for the dynamics of both visually guided and predictive motor behaviors within one simple unifying mechanism. Our model relies on two Kalman filters: (1) one processing visual information about retinal input; and (2) one maintaining a dynamic internal memory of target motion. The outputs of both Kalman filters are then combined in a statistically optimal manner, i.e., weighted with respect to their reliability. The model was tested on data from several smooth pursuit experiments and reproduced all major characteristics of visually guided and predictive smooth pursuit. This contrasts with the common belief that anticipatory pursuit, pursuit maintenance during target blanking, and zero-lag pursuit of sinusoidally moving targets all result from different control systems. To our knowledge, this is the first model that integrates all aspects of pursuit dynamics within one coherent and simple framework, without switching between different parallel mechanisms. Our model suggests that the brain circuitry generating a pursuit command might be simpler than previously believed, implementing only the functional equivalents of two Kalman filters whose outputs are optimally combined. It provides a general framework for how the brain can combine continuous sensory information with a dynamic internal memory and transform it into motor commands.
Written by Jean-Jacques Orban de Xivry, a scientist in the motor control field.
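The core computation described in the abstract, combining two estimates "weighted with respect to their reliability", can be illustrated in a few lines. The sketch below is not the authors' published model; it is a minimal, hypothetical one-dimensional version in which a scalar Kalman filter tracks a velocity estimate, and two such estimates (e.g., a retinal one and an internal-memory one) are fused by inverse-variance weighting, the standard statistically optimal combination rule for independent Gaussian estimates. All class and function names are illustrative.

```python
class ScalarKalman:
    """Minimal 1-D Kalman filter tracking a single velocity estimate.

    q: process noise variance (how fast the true state may drift)
    r: measurement noise variance (how unreliable each observation is)
    """
    def __init__(self, q, r, x0=0.0, p0=1.0):
        self.q = q
        self.r = r
        self.x = x0   # current state estimate
        self.p = p0   # variance of that estimate

    def step(self, z):
        # Predict: the state is assumed constant, uncertainty grows by q.
        p_pred = self.p + self.q
        # Update: blend prediction and measurement z via the Kalman gain.
        k = p_pred / (p_pred + self.r)
        self.x = self.x + k * (z - self.x)
        self.p = (1.0 - k) * p_pred
        return self.x, self.p


def fuse(x1, p1, x2, p2):
    """Combine two independent estimates, weighting each by its
    reliability (inverse variance). The fused variance is always
    smaller than either input variance."""
    w1, w2 = 1.0 / p1, 1.0 / p2
    x = (w1 * x1 + w2 * x2) / (w1 + w2)
    p = 1.0 / (w1 + w2)
    return x, p


if __name__ == "__main__":
    # A reliable estimate (variance 1.0) dominates an unreliable one
    # (variance 3.0): the fused value lies closer to the reliable input.
    x, p = fuse(10.0, 1.0, 12.0, 3.0)
    print(x, p)  # 10.5 0.75
```

In this toy setup, the "visual" filter would be driven by noisy retinal measurements, while a second filter maintaining the internal memory could be driven by the fused output itself; the fusion step then arbitrates between the two depending on which is currently more reliable (e.g., during target blanking, the retinal variance grows and the memory estimate dominates).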