

Poster

Maximally discriminative stimuli for functional cell type identification

Max F. Burg · Thomas Zenkel · Michaela Vystrčilová · Jonathan Oesterle · Larissa Höfling · Konstantin F. Willeke · Jan Lause · Sarah Müller · Paul Fahey · Zhiwei Ding · Kelli Restivo · Shashwat Sridhar · Tim Gollisch · Philipp Berens · Andreas Tolias · Thomas Euler · Matthias Bethge · Alexander S Ecker

Halle B
Tue 7 May 7:30 a.m. PDT — 9:30 a.m. PDT

Abstract:

Identifying cell types and understanding their functional properties is crucial for unraveling the mechanisms underlying perception and cognition. For example, in the retina, functional types can be identified by a carefully selected and manually curated battery of stimuli. However, this requires expert domain knowledge and biases the procedure towards previously known cell types. In the visual cortex, it is still unknown what functional types exist and how to identify them. Thus, for unbiased identification of the functional cell types in retina and visual cortex, new approaches are needed. Here we propose an optimization-based clustering approach using deep predictive models to obtain functional clusters of neurons using Maximally Discriminative Stimuli (MDS). Our approach alternates between stimulus optimization and cluster reassignment, akin to an expectation-maximization algorithm. The algorithm recovers functional clusters in mouse retina, marmoset retina, and macaque visual area V4, demonstrating that our approach can successfully find discriminative stimuli across species, stages of the visual system, and recording techniques. Presenting maximally discriminative stimuli during data acquisition allows for on-the-fly assignment to functional cell types and paves the way for experiments that were previously limited by experimental time. Crucially, MDS are interpretable: they visualize the distinctive stimulus patterns that most unambiguously identify a specific type of neuron. We will make our code available online upon publication.
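The EM-like alternation described in the abstract lends itself to a compact sketch. The following is a minimal, illustrative PyTorch version, assuming a differentiable predictive model model(stimuli) that maps K candidate stimuli to a (K, n_neurons) response matrix; the softmax objective and argmax reassignment here are plausible instantiations of the alternating loop, not the authors' exact formulation.

import torch

def mds_clustering(model, n_neurons, n_clusters, stim_shape,
                   n_iters=50, opt_steps=100, lr=0.01):
    """Alternate between optimizing one stimulus per cluster to be
    maximally discriminative (M-step analogue) and reassigning neurons
    to the cluster whose stimulus drives them most (E-step analogue)."""
    # Random initial cluster assignment, one label per neuron.
    assign = torch.randint(n_clusters, (n_neurons,))
    # One learnable stimulus per cluster.
    stimuli = torch.randn(n_clusters, *stim_shape, requires_grad=True)

    for _ in range(n_iters):
        # Stimulus optimization: make each cluster's stimulus drive
        # its own neurons strongly relative to the other stimuli.
        optimizer = torch.optim.Adam([stimuli], lr=lr)
        for _ in range(opt_steps):
            optimizer.zero_grad()
            # responses[k, n]: response of neuron n to stimulus k.
            responses = model(stimuli)
            # Log-probability (softmax across stimuli) that each neuron
            # is identified by the stimulus of its assigned cluster.
            logp = torch.log_softmax(responses, dim=0)
            loss = -logp[assign, torch.arange(n_neurons)].mean()
            loss.backward()
            optimizer.step()
        # Reassignment: each neuron joins the cluster whose stimulus
        # elicits its strongest response.
        with torch.no_grad():
            assign = model(stimuli).argmax(dim=0)

    return stimuli.detach(), assign

Normalizing with a softmax across stimuli, rather than maximizing raw responses, rewards responses that are strong relative to the other clusters' stimuli; this is what would make the optimized stimuli discriminative rather than merely excitatory.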
