One major open question in neuroscience is "How is sensory experience encoded by neurons in the brain?" Despite decades of research on this topic, the field is only now beginning to yield answers that apply to real-world sensory experience.
Berkeley Neuroscience labs are addressing this question across multiple scales, animal models, and technical approaches. For example, the Gallant Lab uses fMRI studies to precisely map responses to natural visual scenes (movie clips) and language (podcasts) in the human brain, while the Scott Lab uses genetic and optical tools to identify the neural circuits that link taste sensation and behavior in fruit flies.
In a study published this month in Neuron, the Feldman Lab has taken our understanding of sensory encoding further by looking at how neurons in the brain’s cerebral cortex encode the shape and texture of objects. They found that shape and texture were encoded by the same neurons on different time scales: shape was encoded almost instantaneously, as a brief burst of neuron firing, while texture was encoded by changes in average neuron firing rate over time.
This finding reveals how the brain multiplexes different types of sensory information in the same neuronal firing patterns, providing an important clue to the nature of the neural code and neural computation.
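To make the idea of multiplexing concrete, here is a minimal toy simulation (not the study's actual analysis; all parameter values are illustrative assumptions). A single simulated spike train carries "shape" as a brief burst right after stimulus onset and "texture" as a sustained mean firing rate, and the two signals are read out from the same spikes on different time scales:

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_spikes(shape_present, texture_rate_hz, duration_ms=500, dt_ms=1):
    """Toy spike train: a brief high-rate burst at stimulus onset carries
    'shape' information; the sustained mean rate carries 'texture'.
    (Illustrative model only; rates and windows are assumed values.)"""
    t = np.arange(0, duration_ms, dt_ms)
    rate = np.full(t.shape, texture_rate_hz, dtype=float)  # sustained rate (Hz)
    if shape_present:
        rate[:20] += 200.0  # transient burst confined to the first 20 ms
    p_spike = rate * dt_ms / 1000.0       # per-bin spike probability
    return rng.random(t.shape) < p_spike  # boolean spike train

# Decode both features from the SAME spike train on different time scales
spikes = simulate_spikes(shape_present=True, texture_rate_hz=30.0)
burst_count = int(spikes[:20].sum())             # fast window -> "shape"
mean_rate_hz = spikes[20:].mean() * 1000.0       # slow average -> "texture"

print(burst_count, round(mean_rate_hz, 1))
```

The fast readout (spike count in a short onset window) recovers the shape signal, while the slow readout (firing rate averaged over the rest of the trial) recovers the texture signal, sketching how one neuron's spikes can be demultiplexed by time scale.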
First author and PhD Program alum (2010-2017) Brian Isett helps us understand the importance of this research discovery.