UCL Birkbeck MRC DTP

Investigating Audiovisual Integration in Clinical Mouse Models

Dr Philip Coen (UCL), Gonçalo Lopes (NeuroGEARS) and Professor Caswell Barry (UCL)

This project will establish how the neural computations underlying the combination of sensory signals differ between control animals and disease models, using chronic electrophysiology to track neurons across days and environments. To interpret the world, brains combine information across sensory modalities. One prominent example is audiovisual integration: combining sounds and images. This is particularly important in social communication, where vocalisations are paired with lip movements and facial expressions. This may explain why atypical audiovisual integration is associated with clinical conditions such as autism spectrum disorder (ASD) and schizophrenia: increased communication between sensory areas could make it more difficult to extract social signals from a noisy environment. However, differences in audiovisual processing between control animals and disease models have never been evaluated. This is in part because audiovisual studies in rodents typically use head-fixed preparations to optimise stimulus presentation, but disease models are less tolerant of head-fixation.

This PhD project will overcome that barrier by developing a unique audiovisual arena to present controlled stimuli to freely moving animals (Aim 1), and using chronic electrophysiology to determine how audiovisual processing changes across environments (Aim 2) and how it differs in ASD disease models (Aim 3).

Aim 1: A closed-loop open arena for audiovisual experiments.

The student will work with the industry partner, NeuroGEARS, to develop a custom arena that presents spatially distributed auditory and visual stimuli to mice, triggered by the mouse's position in closed loop (Fig. 1a-b). This enables precise control of stimulus presentation (e.g. specified spatial location and distance), comparable to our pre-existing audiovisual dome for head-fixed mice (Fig. 1a). Therefore, the same auditory and visual stimuli can be presented in both environments.
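To illustrate the closed-loop principle described above, the sketch below shows position-triggered stimulus selection in a minimal form. This is purely illustrative and not the actual NeuroGEARS implementation; the arena geometry, panel layout, trigger-zone radius, and all function names are hypothetical.

```python
# Illustrative sketch of closed-loop, position-triggered stimulus logic.
# All geometry and names are hypothetical, not the real arena design.
import math

# Hypothetical arena: four speaker/display panels at the cardinal points
# of a circular arena (radius 0.5 m), centred on the origin.
PANELS = {
    "north": (0.0, 0.5),
    "east": (0.5, 0.0),
    "south": (0.0, -0.5),
    "west": (-0.5, 0.0),
}

def nearest_panel(target_xy):
    """Pick the panel closest to the desired stimulus location."""
    return min(PANELS, key=lambda name: math.dist(PANELS[name], target_xy))

def should_trigger(mouse_xy, zone_centre, zone_radius=0.1):
    """Trigger when the tracked mouse position enters a trigger zone."""
    return math.dist(mouse_xy, zone_centre) <= zone_radius

# One closed-loop step: the tracked mouse has entered a zone centred on
# the origin, so present a stimulus from the panel nearest the desired
# stimulus location at (0.0, 0.4), i.e. the "north" panel.
if should_trigger((0.05, 0.0), (0.0, 0.0)):
    panel = nearest_panel((0.0, 0.4))
```

In practice this loop would run on every frame of a live tracking stream (e.g. in Bonsai, the visual programming environment developed by the industry partner), but the core logic is the same: map tracked position to a trigger decision and a stimulus location.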

Aim 2: Spatial audiovisual signals across environments. 

The student will be trained in the use of the "Apollo Implant" (Fig. 1c-d), a technique developed by the host laboratory to chronically implant Neuropixels 2.0 probes and record hundreds of neurons across months. By recording the same audiovisual neurons in both the head-fixed dome and the open arena, the student will test the hypothesis that spatial audiovisual processing is consistent across head-fixed and freely moving conditions.

Aim 3: Audiovisual integration in an ASD disease model. 

The student will record from audiovisual brain regions in Shank3+/- mice, a disease model of ASD with established deficits in social behaviour (Fig. 1e-g). Implantations will be targeted to the same brain regions recorded in Aim 2, and the same stimuli will be presented, to test the hypothesis that audiovisual signals are stronger and more prevalent in disease models of ASD.

In summary, this project represents the first investigation of multisensory processing in rodent disease models. This is a foundational step towards understanding why these clinical conditions are linked to atypical audiovisual processing in humans, and is therefore closely aligned with the MRC remit. Moreover, the project is made feasible through the unique skills of the industry partner, NeuroGEARS, and the student will therefore develop a varied skillset applicable to career paths in both academia and industry.

References

Bimbard, Takács, Catarino, Fabre, Gupta, Lenzi, Melin … Coen. eLife 2025

An adaptable, reusable, and light implant for chronic Neuropixels probes

Van Beest*, Bimbard*, Fabre, Takács, Coen … Harris & Carandini. Nature Methods 2024

Tracking neurons across days with high-density probes

Coen*, Sit*, Wells, Carandini & Harris. Neuron 2023

Mouse frontal cortex mediates additive multisensory decisions

Feldman, Dunham, Cassidy, … Woynaroski. Neuroscience & Biobehavioral Reviews 2018

Audiovisual multisensory integration in individuals with autism spectrum disorder: A systematic review and meta-analysis

Uchino & Waga. Brain and Development 2013

SHANK3 as an autism spectrum disorder-associated gene
