Speakers

 

Florian Engert

Nature over Nurture: Functional neuronal circuits emerge in the absence of developmental activity

During development, the complex neuronal circuitry of the brain arises from limited information contained in the genome. After the genetic code instructs the birth of neurons, the emergence of brain regions, and the formation of axon tracts, it is believed that neuronal activity plays a critical role in shaping circuits for behavior. Current AI technologies are modeled after the same principle: connections in an initial weight matrix are pruned and strengthened by activity-dependent signals until the network can generalize from a set of inputs to the appropriate outputs. Here, we challenge these learning-dominated assumptions by quantifying the contribution of neuronal activity to the development of visually guided swimming behavior in larval zebrafish. Intriguingly, dark-rearing experiments revealed that visual experience has no effect on the emergence of the optomotor response (OMR). We then raised animals under conditions in which neuronal activity was pharmacologically silenced from organogenesis onward using the sodium-channel blocker tricaine. Strikingly, after washout of the anesthetic, animals performed swim bouts and responded to visual stimuli with 75% accuracy in the OMR paradigm. After shorter periods of silenced activity, OMR performance stayed above 90% accuracy, calling into question the importance and impact of classical critical periods for visual development. Detailed quantification of the emergence of functional circuit properties by brain-wide imaging experiments confirmed that neuronal circuits came ‘online’ fully tuned, without the need for activity-dependent plasticity. Thus, contrary to what you learned on your mother's knee, complex sensory-guided behaviors can be wired up innately by activity-independent developmental mechanisms.

Emily Mackevicius

Circuit priors for learning ethologically relevant bird behavior

Behaviors emerge via a combination of experience and innate predispositions. I will present a framework for understanding how pre-existing neural circuit priors form a substrate on which the brain can learn new information. I will apply this framework to neural data I collected from the brains of birds performing ethologically relevant learning and memory behaviors: zebra finches learning to sing a new song, and black-capped chickadees remembering where they have hidden food caches. In both cases, in very different domains, neural circuits appear to generate an abstract state-space representation of the task, upon which new memories can be learned. This suggests mechanisms through which the brain may reason about the world in a Bayesian way, combining structured pre-existing information with new observations and experiences.


Josie Clowney

Hacking brain development to test models of sensory coding

Animals can discriminate myriad sensory stimuli but can also generalize from learned experience. You can probably distinguish the favorite teas of your colleagues while still recognizing that all tea pales in comparison to coffee. Tradeoffs between detection, discrimination, and generalization are inherent at every layer of sensory processing. During development, specific quantitative parameters are wired into perceptual circuits and set the playing field on which plasticity mechanisms play out. A primary goal of systems neuroscience is to understand how the material properties of a circuit define the logical operations, or computations, that it performs, and how those computations serve survival. A cardinal method in biology, and the mechanism of evolution itself, is to change a unit or variable within a system and ask how this affects organismal function. In this presentation, I will discuss how we are making use of our knowledge of developmental wiring mechanisms to modify hard-wired circuit parameters in the Drosophila melanogaster mushroom body. We assess odor representations and learning in these circuit-modified animals to test the relationship between form and function. By altering the number of expansion layer neurons (Kenyon cells) and their dendritic complexity, we find that input number, but not cell number, tunes odor selectivity. Simple odor discrimination performance is maintained when Kenyon cell number is reduced and augmented by Kenyon cell expansion.
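As a rough illustration of why input number, rather than cell number, might tune odor selectivity, the toy expansion-layer model below varies both quantities and reports the fraction of odors that drives each Kenyon cell above threshold. This is not the speaker's model; the projection-neuron count, odor statistics, and threshold rule are arbitrary assumptions chosen only to make the point: the response rate shifts with the number of inputs per cell but is essentially unchanged by the number of cells.

# Minimal toy model of a mushroom-body-like expansion layer (illustrative only;
# not the speaker's actual model). Assumed: odors are random projection-neuron
# (PN) activity patterns, and each Kenyon cell (KC) sums k randomly chosen PN
# inputs and responds above a fixed threshold relative to its mean drive.
import numpy as np

rng = np.random.default_rng(0)
N_PN, N_ODORS = 50, 200                              # PN channels and test odors (arbitrary)
odors = rng.exponential(1.0, size=(N_ODORS, N_PN))   # PN activity per odor

def mean_kc_response_rate(n_kc, k, theta=1.5):
    """Fraction of odors driving each KC above threshold, averaged over KCs."""
    responses = np.zeros((n_kc, N_ODORS), dtype=bool)
    for i in range(n_kc):
        claws = rng.choice(N_PN, size=k, replace=False)  # k dendritic inputs
        drive = odors[:, claws].sum(axis=1)              # summed PN input per odor
        responses[i] = drive > theta * k                 # threshold scales with input count
    return responses.mean()

for n_kc in (500, 2000):                  # vary cell number
    for k in (3, 7, 15):                  # vary inputs per cell
        rate = mean_kc_response_rate(n_kc, k)
        print(f"KCs={n_kc:5d}  inputs/KC={k:2d}  mean response rate={rate:.3f}")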

Javier Valdes Aleman

Comparative connectomics reveals mechanisms for synaptic specificity

Our nervous system is made of billions of neurons that process sensory information and control behavior. It is delicately organized into circuits with specifically tuned cell-to-cell connections that are essential for proper function. During development, neurons project to remote locations in search of their synaptic partners. Surrounded by numerous cells along their trajectory, these developing neurons are challenged to connect only with a specific set of cells. But how do neural circuits achieve such precise connectivity?

 

I study the assembly of a circuit for mechanosensation in the Drosophila larva. I genetically manipulated the growth patterns of mechanosensory neurons to change their position in the central nervous system. Using electron microscopy, I reconstructed their morphology and mapped their synapses. I found that their partner neurons manage to connect to the mechanosensory cells at their new location by extending ectopic branches. Despite this compensation, the connectivity balance of the circuit is disturbed, resulting in deficient mechanosensory behavior. Similarly, abolishing synaptic communication in these neurons during development results in a connectivity imbalance with long-lasting behavioral defects. These results suggest that partner-derived cues are sufficient for recognition between specific partners and the establishment of connections. However, without orderly positioning of axon terminals by positional cues, and without synaptic activity during embryonic development, the number and strength of functional connections are altered, with significant consequences for behavior. Thus, multiple mechanisms, including global positional cues, partner-derived cues, and synaptic activity, contribute to proper circuit assembly in the developing Drosophila nerve cord.

Allyson Sgro

The emergence of collective computations in non-neural multicellular systems

TBA

Stan Kerstjens

Constructive Connectomics

During brain development, billions of axons navigate over long distances to form functional circuits. The absence of an explicit blueprint with an external reference frame and global coordinate space raises the question of how such self-construction can be coordinated. The question is sharpened by the limited information capacity of the genome, which discounts strategies that rely on the explicit encoding of billions of axon trajectories. We propose and validate a mechanism of development that can provide an efficient solution to the global wiring task. The key principle is that constraints inherent to mitosis implicitly induce a global hierarchical patterning of nested brain regions, each marked by a unique gene expression profile: if mitotic daughters have, on average, gene expression similar to their parent, and do not stray too far from one another, then the progenies of successive progenitors occupy nested and contiguous brain regions, and have an average expression that approximates their common progenitor, effectively embedding the global lineage tree in both physical and expression space. Axons could exploit this organization for navigation by using the epigenetic landscape, which encodes the global lineage tree of expression states in every neuron's genome, as a roadmap to generate guidepost profiles for each leg of their journey to their targets. We confirm the operation of this mechanism through simulation, and we have analyzed gene expression data of developing and adult mouse brains (sourced from the Allen Institute for Brain Science) and of larval zebrafish, finding them consistent with our simulations: gene expression indeed partitions the brain into a global spatial hierarchy of nested regions that is stable over pre- and postnatal time. Simulations of axon growth on these experimental data demonstrate that axons can indeed navigate robustly over long distances to specific targets and generate a qualitatively plausible connectome.
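The proposed mechanism can be caricatured in a short simulation. The sketch below is a highly simplified stand-in for the speaker's work, not its implementation: it assumes a binary lineage tree in which each daughter inherits its parent's expression profile plus Gaussian drift, leaf cells laid out contiguously in lineage order along one axis, and an "axon" that homes in on its target by matching successively finer guidepost profiles taken from the target's ancestors. Tree depth, gene count, and noise level are arbitrary.

# Toy simulation of lineage-based guidance (illustrative sketch only).
import numpy as np

rng = np.random.default_rng(1)
DEPTH, GENES, NOISE = 8, 100, 0.3            # 2**8 = 256 leaf cells

# Build expression profiles level by level: each daughter inherits its parent's
# profile plus Gaussian drift, embedding the lineage tree in expression space.
levels = [rng.normal(size=(1, GENES))]
for d in range(DEPTH):
    parents = levels[-1]
    levels.append(np.repeat(parents, 2, axis=0)
                  + rng.normal(0.0, NOISE, size=(2 ** (d + 1), GENES)))
leaves = levels[-1]                           # leaf cells, contiguous in lineage order

def navigate(target):
    """Home in on leaf `target` by matching successively finer guidepost profiles."""
    lo, hi = 0, leaves.shape[0]               # current region: all leaves at first
    for d in range(1, DEPTH + 1):
        guidepost = levels[d][target >> (DEPTH - d)]   # target's ancestor at depth d
        mid = (lo + hi) // 2
        left = leaves[lo:mid].mean(axis=0)
        right = leaves[mid:hi].mean(axis=0)
        # Enter the half of the region whose average expression matches the guidepost.
        if np.linalg.norm(left - guidepost) <= np.linalg.norm(right - guidepost):
            hi = mid
        else:
            lo = mid
    return lo                                  # single remaining leaf

hits = sum(navigate(t) == t for t in rng.integers(0, leaves.shape[0], size=20))
print(f"{hits}/20 targets reached by following guidepost expression profiles")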


Naomi Shvedov

Superdiffusive neuron migration in the postnatal brain

Songbirds generate new neurons throughout life, and these neurons disperse widely across the brain. How newborn neurons migrate through adult brain tissue is unclear. We performed two-photon time-lapse microscopy in GFP-expressing transgenic zebra finches to visualize migratory dynamics of cell populations in vivo. We tracked hundreds of migrating cells in the dorsal forebrain of awake animals for up to 12 hours. Migratory neurons dispersed in all directions and made frequent course changes, a pattern observed across males and females, juveniles and adults. Cells did not appear to follow blood vessels exclusively, and histological analysis revealed that contact with radial glia was infrequent. Furthermore, cells appeared to move independently and were largely uncorrelated in their speed, position, and direction. Importantly, we found that migration dynamics across ages and brain regions were well fit by a superdiffusive model. These behaviors may reflect a specialized form of migration that enables neurons to flexibly navigate through the dense, functional circuitry of the postnatal brain.
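For readers unfamiliar with the term, superdiffusive motion is typically diagnosed by fitting the mean squared displacement MSD(tau) ~ tau**alpha and finding alpha > 1 (alpha = 1 is ordinary diffusion, alpha = 2 is ballistic motion). The sketch below performs such a fit on synthetic persistent random walks standing in for cell tracks; it is illustrative only and uses none of the speaker's data or analysis code.

# Fit the MSD exponent alpha on synthetic 2D cell tracks (illustrative sketch).
import numpy as np

rng = np.random.default_rng(2)

def synthetic_tracks(n_cells=100, n_steps=144, persistence=0.9, speed=1.0):
    """Persistent (directionally correlated) random walks, which look superdiffusive."""
    angles = np.cumsum(rng.normal(0, 1 - persistence, (n_cells, n_steps)), axis=1) \
             + rng.uniform(0, 2 * np.pi, (n_cells, 1))
    steps = speed * np.stack([np.cos(angles), np.sin(angles)], axis=-1)
    return np.cumsum(steps, axis=1)                   # positions: (cells, time, xy)

def msd_exponent(tracks, max_lag=30):
    """Log-log slope of mean squared displacement versus time lag."""
    lags = np.arange(1, max_lag + 1)
    msd = np.array([np.mean(np.sum((tracks[:, lag:] - tracks[:, :-lag]) ** 2, axis=-1))
                    for lag in lags])
    alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)
    return alpha

alpha = msd_exponent(synthetic_tracks())
print(f"fitted MSD exponent alpha = {alpha:.2f}  (>1 indicates superdiffusion)")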




Tony Zador

Encoding innate ability through a genomic bottleneck

Animals are born with extensive innate behavioral capabilities, which arise from neural circuits encoded in the genome. However, the information capacity of the genome is orders of magnitude smaller than that needed to specify the connectivity of an arbitrary brain circuit, indicating that the rules encoding circuit formation must fit through a “genomic bottleneck” as they pass from one generation to the next. Here we formulate the problem of innate behavioral capacity in the context of artificial neural networks in terms of lossy compression of the weight matrix. We find that several standard network architectures can be compressed by several orders of magnitude, yielding pre-training performance that can approach that of the fully trained network. Interestingly, for complex but not for simple test problems, the genomic bottleneck algorithm also captures essential features of the circuit, leading to enhanced transfer learning to novel tasks and datasets. Our results suggest that compressing a neural circuit through the genomic bottleneck serves as a regularizer, enabling evolution to select simple circuits that can be readily adapted to important real-world tasks. The genomic bottleneck also suggests how innate priors can complement conventional approaches to learning in designing algorithms for artificial intelligence.
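One way to picture "compressing a neural circuit through a genomic bottleneck" is as a small generative description that regenerates an approximate weight matrix at birth. The sketch below is a generic low-rank stand-in, not the speaker's algorithm: a stand-in "trained" weight matrix is compressed into a small "genome" (truncated SVD factors) and then expanded back, and the compression ratio and reconstruction error are reported. Layer sizes and rank are arbitrary choices.

# Illustrative low-rank "genomic bottleneck" on a stand-in weight matrix.
import numpy as np

rng = np.random.default_rng(3)
n_out, n_in, rank = 256, 784, 8

# Stand-in for a trained weight matrix (built with structure so it compresses well).
W = rng.normal(size=(n_out, rank * 2)) @ rng.normal(size=(rank * 2, n_in)) / np.sqrt(n_in)

# "Genome": truncated SVD factors, far fewer numbers than W itself.
U, s, Vt = np.linalg.svd(W, full_matrices=False)
genome = (U[:, :rank] * s[:rank], Vt[:rank])          # (n_out x r) and (r x n_in)

# "Development": the genome regenerates an approximate, innate weight matrix.
W_innate = genome[0] @ genome[1]

params_full = W.size
params_genome = genome[0].size + genome[1].size
rel_error = np.linalg.norm(W - W_innate) / np.linalg.norm(W)
print(f"compression: {params_full / params_genome:.1f}x, relative error {rel_error:.2f}")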