

Cortical representation of unrestrained whole-body motion in 3D


Awarded: NOK 7.0 mill.

More than a century of work has shown that purposeful movements of the body result from neural activity spanning several regions of the brain, including posterior parietal cortex (PPC) and frontal motor areas, but the field still lacks an understanding of how these areas formulate natural behavior in 3D. To address this, the RCN project recorded neural ensemble activity in freely moving rodents while tracking their head and back in 3D.

One series of experiments used mice. To collect the largest datasets possible, we used miniature head-mounted microscopes that report neural activity as flashes of light: the cortical neurons were genetically modified to fluoresce when they were most active, a technique known as calcium imaging. With calcium imaging we were able to record hundreds (200-600) of cells at a time in naturally behaving animals, and found that nearly half the neurons in PPC encoded features of 3D posture, such as pitch, azimuth and roll of the head.

Parallel recordings in rats used silicon probes for electrophysiology. This approach has a much higher temporal resolution than calcium imaging (it records single spikes), but captures fewer neurons at a time (10-60 cells). In the end, we recorded more than 1500 cells from nearly a dozen rats, which showed that the majority of cells in PPC and the frontal motor area with which it connects, M2, encode 3D posture. We also found that representations of the head and back were organized topographically, and that spiking activity in PPC tended to precede M2, suggesting a network-level organization across areas. These experiments were reported in Science in November 2018, accompanied by a "Perspective" piece.

For both the rat and mouse datasets, we applied machine learning approaches that used the 3D tracking data to label specific behaviors, such as rearing, running, turning and grooming. These analyses are continuing in the post-project period, but so far we find that the tuning of the neurons falls into roughly 20 behavioral categories (such as those listed above). We are also finding that the behavioral tuning of the cells is the same regardless of which task the animals perform, including a free foraging task and a memory-guided navigation task. This indicates that PPC and M2 are driven mainly by the physical features of the animal's behavior, regardless of whether that behavior is spontaneous or goal-directed. Determining whether cortex re-tunes across tasks was another major objective of the project.

On the whole, this work has shown that PPC and M2 primarily encode 3D posture (as opposed to movement), that this form of neural tuning is present across species, that it can be measured with different recording methodologies, and that it does not appear to change across behavioral tasks. Recordings in the final weeks of 2018 also produced the first evidence that postural tuning may be a general feature of cortical coding, with preliminary datasets showing 3D postural tuning in primary motor cortex (M1), primary somatosensory cortex (S1) and, in darkness, primary visual cortex (V1). Finally, we have open-sourced the software for the 3D tracking platform generated during the project, as well as datasets from rats and mice.
By studying how the brains of these animals represent natural patterns of behavior in 3D, we hope to uncover general neural coding principles of goal-directed motor behavior that could be used, for example, to improve the efficiency of brain-controlled prosthetic devices in patients suffering from paralysis. The data from this work could also be applied to the optimization of biologically inspired robotics platforms, a field intent on producing machines that assist humans in hazardous environments and rescue situations.
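
To make the central analysis concrete, the following is a minimal sketch of how tuning of a single cell to a postural variable such as head pitch can be estimated from spike times and 3D tracking: spike counts per pitch bin are divided by the time spent in that bin (occupancy). The function name, the 120 Hz tracking rate and the synthetic data are illustrative assumptions, not the project's actual pipeline.

import numpy as np

def pitch_tuning_curve(spike_times, track_times, head_pitch,
                       n_bins=36, pitch_range=(-90.0, 90.0)):
    """Return pitch-bin centers and occupancy-normalized firing rate (Hz)."""
    bins = np.linspace(*pitch_range, n_bins + 1)
    dt = np.median(np.diff(track_times))               # tracking sample interval (s)

    # Time spent in each pitch bin (occupancy, in seconds)
    occupancy, _ = np.histogram(head_pitch, bins=bins)
    occupancy = occupancy * dt

    # Head pitch at each spike time (nearest tracking sample)
    idx = np.searchsorted(track_times, spike_times).clip(0, len(track_times) - 1)
    spike_counts, _ = np.histogram(head_pitch[idx], bins=bins)

    # Firing rate per bin; unvisited bins are left as NaN rather than zero
    with np.errstate(divide="ignore", invalid="ignore"):
        rate = np.where(occupancy > 0, spike_counts / occupancy, np.nan)
    return 0.5 * (bins[:-1] + bins[1:]), rate

# Example with synthetic data: a cell tuned to upward head pitch (~ +40 degrees)
rng = np.random.default_rng(0)
track_times = np.arange(0, 600, 1 / 120)                    # 10 min tracked at 120 Hz
head_pitch = 60 * np.sin(2 * np.pi * track_times / 30)      # slow pitch sweeps
true_rate = 2 + 8 * np.exp(-((head_pitch - 40) ** 2) / 200) # Hz, Gaussian tuning
spike_times = track_times[rng.random(track_times.size) < true_rate / 120]
centers, rate = pitch_tuning_curve(spike_times, track_times, head_pitch)
print(np.round(rate, 1))

In this sketch the estimated rate peaks in the bins around +40 degrees of pitch; the same occupancy-normalization logic applies to azimuth or roll of the head, or to features of the back.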

The interdisciplinary nature of the project led to a major cross-fertilization of knowledge in topics including computer science, machine learning, statistical analysis, neurophysiology, neural imaging, animal behavior, and behavioral analysis. I would say the biggest outcome in terms of personnel is this transfer of knowledge: the biologists (the PhD students in my lab) received a substantial education in quantitative methods, while the computer and data scientists (Benjamin Dunn, and our collaborator Graham Taylor) learned a great deal of biology. The most immediate impact of the work is that it sets an example for what can be learned by applying thorough, quantitative analyses of animal behavior. Studying how the brain encodes behavior in 3D is an almost untouched area in neuroscience, and the Science paper will help light the way for future work investigating how neural activity in the brain relates to the body and the external world.

A fundamental question in neuroscience is how volitional actions emerge from neural circuits in the brain. Answers to this and other 'big' questions will begin to unfurl in the 21st century thanks to improved data acquisition techniques that permit the sampling of hundreds to thousands of cells simultaneously. A looming problem, however, is that datasets will soon grow so large that a new generation of analytical tools will also be required. The central aim of this project, to understand how parietal cortical ensemble activity maps onto the freely behaving body, will make use of recent advances in markerless 3D motion tracking and large-scale in vivo calcium imaging in freely behaving mice. In order to analyze datasets of this scale, we will develop novel machine learning algorithms to correlate neuronal activity patterns with key movement features such as changes in joint angles. The major goals of the project include the synchronization of neural and behavioral recordings from mice foraging in an open arena. Once sufficient data are collected, the next goal will be to implement 'deep belief' machine learning algorithms trained to recognize patterns between the neural and behavioral datasets. To this end, we have the key advantage of collaborating with a field-leading expert in deep belief algorithms who focuses on modeling biological movement. Finally, we will determine whether our new tools can accurately predict an animal's self-chosen trajectory when it is presented with a navigational goal. By imaging neural activity during a variety of behaviors, we aim to make a quantum leap forward in understanding the cortical representation of the wide spectrum of whole-body movements that comprise an animal's waking behavior. The computational tools developed in the process could be applied to study any number of behaviors in wild-type or disease-model mice, and could even be applied in developing clinical neural prostheses.
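
As a simple illustration of the kind of neural-to-behavior mapping proposed here, the sketch below fits a cross-validated linear (ridge) decoder that predicts a movement feature, here a joint angle, from population activity. It is a baseline stand-in using synthetic data and assumed array shapes, not the deep belief approach the project will develop.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.model_selection import KFold
from scipy.stats import pearsonr

rng = np.random.default_rng(1)
n_frames, n_cells = 5000, 300                     # imaging frames x recorded cells

# Synthetic stand-ins: a slowly varying joint angle and cells weakly tuned to it
joint_angle = np.cumsum(rng.normal(0, 1, n_frames))          # random-walk "angle"
weights = rng.normal(0, 0.05, n_cells)                       # per-cell tuning weights
activity = joint_angle[:, None] * weights + rng.normal(0, 1, (n_frames, n_cells))

# 5-fold cross-validated prediction of the joint angle from population activity
predicted = np.zeros(n_frames)
for train, test in KFold(n_splits=5).split(activity):
    model = Ridge(alpha=10.0).fit(activity[train], joint_angle[train])
    predicted[test] = model.predict(activity[test])

r, _ = pearsonr(joint_angle, predicted)
print(f"Cross-validated correlation between decoded and true joint angle: r = {r:.2f}")

A linear baseline of this sort gives a reference point against which more expressive models, such as the deep belief networks mentioned above, can be judged.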

Publications from Cristin

No publications found

Funding scheme:

FRIMEDBIO-Fri prosj.st. med.,helse,biol