The Coen Lab
Cell and Developmental Biology department, University College London
In the natural world, brains are faced with a mixture of sensory cues from multiple modalities. For example, when listening to someone speak, we combine the sounds they produce with their lip movements, which is one reason masks make conversations more difficult! How and where are these auditory and visual streams of information combined in the brain? Our lab uses the mouse model system to answer this question.
We train mice to perform complex behaviours in custom-designed audiovisual chambers. We combine these behaviours with the latest electrophysiology tools and optogenetic manipulations to dissect the neural circuits that underlie audiovisual integration. We aim to determine how and where these sensory modalities are combined in the brain, both to localize external objects in space and to localize oneself when navigating a multisensory environment.
Recent news
“Apollo Implant” version of record at eLife
The version of record for our eLife paper is now live! A reusable, flexible, and lightweight chronic implant for Neuropixels probes.
Superior Colliculus symposium at SFN 2024
Flóra and Pip co-chair a symposium at SFN 2024 in Chicago: "Is there anything the superior colliculus doesn't do?" on October 8th at 2pm
“UnitMatch” paper published in Nature Methods
UnitMatch, a software package for tracking neurons across time, is now published in Nature Methods!
“Apollo Implant” reviewed preprint is now live at eLife
Our reviewed preprint is now live at eLife! A reusable, flexible, and lightweight chronic implant for Neuropixels probes.
Pip Awarded 2024 UCL Early Career Neuroscience Prize
Pip receives the Early Career Neuroscience Prize for his 2023 paper (co-authored with Tim Sit): Mouse frontal cortex mediates additive multisensory decisions
Multisensory symposium at FENS 2024
Pip and Seung-Hee Lee are chairing the symposium "Multisensory decisions across species and modalities" on June 26 at FENS 2024