The Coen Lab

Cell and Developmental Biology department, University College London

In the natural world, brains are faced with a mixture of sensory cues from multiple modalities. For example, when listening to someone speak, we combine the sounds they produce with their lip movements, which is one reason that masks make conversations more difficult! How and where are these auditory and visual streams of information combined in the brain? Our lab uses the mouse model system to answer this question.

We train mice to perform complex behaviours in custom-designed audiovisual chambers. We combine these behaviours with the latest electrophysiology tools and optogenetic manipulations to dissect the neural circuits that underlie audiovisual integration. We aim to determine how and where these sensory modalities are combined in the brain, both to localize external objects in space and to localize oneself when navigating a multisensory environment.

Join us!

Behaviour

Custom-designed acoustically transparent domes for immersive audiovisual environments

Electrophys

Chronic Neuropixels 2.0 implants to record hundreds of neurons over weeks

Optogenetics

Perturbing brain regions during behaviour using transcranial and cannula-based approaches

Recent news