
The Keller group had previously shown that the coupling of sensory and motor experience is critical for normal sensorimotor processing. The researchers have now identified a particular subset of cortical neurons that computes mismatches between what we expect to see based on movement and what we actually see. These neurons are likely essential for helping us distinguish between self-generated and externally generated sensations.
When we move our head to the right, the visual scene shifts to the left. Yet we do not perceive the world as moving: our brain has learned to differentiate between sensory information caused by our own actions (we moved our head) and sensory information caused by the external world (a car drove by). How the brain tells these self-generated and externally generated sensations apart, however, has remained unclear.
The neurobiologists in the Keller group had previously shown that the primary visual cortex uses information about movement to predict the resulting visual feedback, and compares this prediction with the visual feedback that actually occurs in order to detect any mismatches. In studies published in 2017, they described the circuit that integrates visual and motor inputs in the visual cortex.
Rebecca Jordan, a postdoc in the Keller group, investigated this mechanism further. In particular, she wanted to understand in detail how individual neurons compute mismatch responses, and whether this computation is performed by a specific subset of neurons.
To investigate this, Jordan set up a study in which mice run through a virtual reality tunnel in which running is coupled to visual feedback. Brief pauses in the visual feedback were used to evoke a mismatch between the actual visual feedback and what the mouse expected to see while running. At the same time, Jordan made intracellular recordings of the electrical activity of individual neurons in the mouse visual cortex, a powerful technique for assessing neuronal computations. In a second phase of the recordings, she also decoupled visual feedback from running to identify the independent effects of each source of information (visual or motor) on neural activity.
Jordan identified an abundance of neurons in the superficial layers of the visual cortex performing a particular computation. “These neurons were particularly responsive to differences between running and visual feedback,” Jordan explains. “This suggests that they are specialized in detecting mismatches between what we expect to see based on movement, and what we actually see.” Jordan observed two kinds of these comparator neurons: those that signal when the visual feedback is more than expected, and those that signal when it is less than expected. Neurons deeper in the visual cortex, however, did not appear to perform this comparative computation, suggesting that different cortical layers perform distinct computations.
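The two comparator signals described above can be pictured as rectified differences between predicted and actual visual flow. The sketch below is purely conceptual and is an assumption for illustration (the function name, the flow variables, and the rectified-difference formulation are not taken from the study):

```python
def mismatch_signals(predicted_flow: float, actual_flow: float) -> tuple[float, float]:
    """Conceptual sketch of the two comparator signals (illustrative only).

    predicted_flow: visual flow the brain expects, given the animal's running speed.
    actual_flow:    visual flow actually observed.
    Each returned signal is rectified, so it is nonzero for only one sign
    of prediction error, mirroring the two kinds of comparator neurons.
    """
    positive_mismatch = max(actual_flow - predicted_flow, 0.0)  # feedback more than expected
    negative_mismatch = max(predicted_flow - actual_flow, 0.0)  # feedback less than expected
    return positive_mismatch, negative_mismatch

# A brief pause in visual feedback while the mouse is running:
print(mismatch_signals(predicted_flow=1.0, actual_flow=0.0))  # -> (0.0, 1.0)
```

In this toy picture, pausing the visual feedback while the animal runs drives only the "less than expected" signal, which is the kind of mismatch the virtual reality experiment was designed to evoke.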
“These findings advance our understanding of how the cortex processes sensory and motor information,” concludes Georg Keller. “A deep comprehension of how the brain uses this information to distinguish self-generated from externally generated sensations will help us understand, for example, what has gone wrong in certain psychiatric disorders such as schizophrenia, a disease in which this process appears to be disrupted.”
Original publication:
Rebecca Jordan, Georg B. Keller. Opposing influence of top-down and bottom-up input on excitatory layer 2/3 neurons in mouse primary visual cortex. Neuron (2020). Advance online publication.
About the first author
Rebecca Jordan was born in Devon, UK, and did her PhD in the lab of Andreas Schaefer at the Francis Crick Institute in London. She joined the FMI in December 2018 as a postdoc. Her previous work had demonstrated that activity related to movements can cause problems for sensory coding in the olfactory bulb, and she wanted to study how the cortex might instead normalize for movements. In the Keller group, she found the perfect place for this kind of research! In her spare time, Rebecca likes to paint surreal acrylic paintings.