David Schneider

Scholar: 2019

Awarded Institution
Assistant Professor
New York University
Center For Neural Science


Research Interests

Integrative Circuits for Using Predictions to Guide Behavior

We constantly make predictions, ranging from whether it will rain today to who will be the next president. But some of the most important predictions we make are much less obvious, such as what our voices will sound like when we speak or what the next note will sound like when we strike a key on the piano. The ability to anticipate the sounds of our actions is vital for learning and maintaining complex behaviors such as speech and musicianship. Our research mission is to understand how the brain makes predictions about self-generated sounds and how it uses these predictions to guide our behaviors.

To understand how the brain makes predictions, we study a simple model organism, the mouse. Despite the colloquial phrase “as quiet as a mouse,” mice actually make a lot of different sounds (and their ears are huge!). We study how the mouse brain learns the sounds of its own actions by studying both natural sound-generating behaviors (such as vocalizing) and artificial lab-based behaviors that produce experimentally controlled sounds (acoustic augmented reality). Our research focuses on how two parts of the brain – the motor cortex, a part of the brain that is active during movement, and the auditory cortex, an area involved in the conscious perception of sound – work together to learn and anticipate the sound a movement will make. We combine tools for monitoring and manipulating the activity of neurons at cellular, synaptic, and molecular levels as mice engage in sound-generating behaviors. Our goal is to fully understand how memories about self-generated sounds are formed, stored, and recalled in the brain to help mice make accurate predictions about the future.

We also care deeply about how predictions can be used to help mice and other animals learn and maintain complex behaviors. For example, once a mouse has learned the sound a movement should make, it can use that prediction to detect errors, much like a pianist who instantly recognizes a mistake upon striking the wrong key. We are working to understand how the mouse brain routes error signals, which are computed in hearing centers such as the auditory cortex, to motor centers of the brain, where they can be used to correct and update future behaviors. Overall, we believe that this research affords us an opportunity to understand how behaviors reminiscent of those we hold dear for social communication – like speech and musicianship – are learned and updated by circuits in the brain.