Understanding the molecular mechanisms underlying the formation of selective intracortical circuitry is one of the central questions in neuroscience. "Barrel nets" are recently identified intracortical axonal trajectories of layer 2/3 neurons within layer 4 of the primary somatosensory (barrel) cortex. Axons of layer 2/3 neurons are preferentially distributed in the septal regions of layer 4 of the barrel cortex, where they form whisker-related patterns. Because cadherins have been viewed as candidate molecules mediating the formation of selective neuronal circuits, we examined their role in the formation of barrel nets. We disrupted cadherin function by expressing a dominant-negative cadherin (dn-cadherin) using in utero electroporation and found that barrel nets were severely disrupted. Confocal microscopy revealed that dn-cadherin expression reduced axon density in the septal regions of layer 4 of the barrel cortex. We also found that cadherins were important for the formation, rather than the maintenance, of barrel nets. Our results uncover an important role for cadherins in the formation of local intracortical circuitry in the neocortex.
The use of head fixation has become routine in systems neuroscience. However, whether behavior changes with head fixation, and whether animals can learn aspects of a task while freely moving and transfer this knowledge to the head-fixed condition, have not been examined in much detail. Here, we used a novel floating platform, the “Air-Track”, which simulates free movement in a real-world environment, to address the effect of head fixation, and we developed methods to accelerate training of head-fixed mice on behavioral tasks. We trained mice in a two-choice discrimination task in a Y maze. One group was trained while head fixed and compared to a separate group that was pre-trained while freely moving and then trained on the same task while head fixed. Pre-training significantly reduced the time needed to relearn the discrimination task while head fixed. Freely moving and head-fixed mice displayed similar behavioral patterns; however, head fixation significantly slowed movement speed. The speed of movement in head-fixed mice depended on the weight of the platform. We conclude that home-cage pre-training improves the learning performance of head-fixed mice and that, while head fixation limits some aspects of movement, the patterns of behavior observed in head-fixed and freely moving mice are similar.
Neural activity across the dorsal neocortex of rodents is dominated by orofacial and limb movements, irrespective of whether the movements are task-relevant or task-irrelevant. To examine the extent to which movements and a primitive cognitive signal, i.e., reward expectancy, modulate the activity of multiple cortical areas in primates, we conducted wide-field one-photon calcium imaging of the frontoparietal and auditory cortices in common marmosets while they performed a classical conditioning task with two auditory cues associated with different reward probabilities. Licking, eye movements, and hand movements strongly modulated neuronal activity after cue presentation in the motor and somatosensory cortices, in accordance with the somatotopy. By contrast, the posterior parietal cortex and primary auditory cortex were little influenced by licking. Licking increased activity in the caudal part of the dorsal premotor cortex but decreased activity in the central and lateral parts of the rostral dorsal premotor cortex (PMdr). Reward expectancy that was separable from both spontaneous and goal-directed movements was represented mainly in the medial part of PMdr. Our results suggest that the influence of movement on primate cortical activity varies across areas and movement types, and that the premotor cortex processes motor and cognitive information in different ways within its subdivisions.
A central function of the brain is to plan, predict, and imagine the effect of movement in a dynamically changing environment. Here we show that, in mice head-fixed in a plus maze floating on air and trained to pick lanes based on visual stimuli, the asymmetric movement and position of whiskers on the two sides of the face signal whether the animal is moving, turning, expecting reward, or licking. We show that (1) whisking asymmetry is coordinated with behavioral state, and behavioral state can be decoded and predicted from asymmetry; (2) even in the absence of tactile input, whisker positioning and asymmetry still relate to behavioral state; and (3) movement of the nose correlates with asymmetry, indicating that the facial expression of the mouse is itself correlated with behavioral state. These results indicate that the movement of whiskers, a behavior that is neither instructed nor necessary in the task, can inform an observer about what a mouse is doing in the maze. Thus, the position of these mobile tactile sensors reflects the behavioral and movement-preparation state of the mouse. SIGNIFICANCE STATEMENT Behavior is a sequence of movements, where each movement can be related to, or can trigger, a set of other actions. Here we show that, in mice, the movement of whiskers (tactile sensors used to extract information about the texture and location of objects) is coordinated with and predicts the behavioral state of mice: that is, what mice are doing, where they are in space, and where they are in the sequence of behaviors.
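As an illustration of the decoding claim in the abstract above, the following is a minimal sketch of how behavioral state could be decoded from a whisker asymmetry index. The synthetic data, the two-state labels, the asymmetry definition, and the choice of logistic regression are all illustrative assumptions; the paper's actual features, states, and classifier are not specified here.

```python
# Minimal sketch: decoding behavioral state from whisker asymmetry.
# All data here are synthetic and the labels hypothetical; this is a
# demonstration of the analysis idea, not the paper's pipeline.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials = 400
# Mean whisker protraction angle (deg) on each side of the face, per trial.
left_angle = rng.normal(90, 5, n_trials)
right_angle = rng.normal(90, 5, n_trials)

# Hypothetical labels: 0 = moving forward, 1 = turning.
state = rng.integers(0, 2, n_trials)
# Inject a state-dependent asymmetry so the example decodes above chance.
left_angle += 6 * state

# Asymmetry index: signed left-right difference in whisker angle.
asymmetry = (left_angle - right_angle).reshape(-1, 1)

clf = LogisticRegression()
acc = cross_val_score(clf, asymmetry, state, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")
```

A single signed difference is the simplest asymmetry feature; richer variants (per-whisker angles, whisking amplitude and setpoint) would slot into the same cross-validated decoding scheme.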
Computer vision approaches have made significant inroads into offline tracking of behavior and estimation of animal poses. In particular, because of their versatility, deep-learning approaches have been gaining attention for markerless behavioral tracking. Here, we developed an approach using DeepLabCut for real-time estimation of movement. We trained a deep neural network (DNN) offline with high-speed video of a mouse whisking, then transferred the trained network to track the same mouse whisking in real time. With this approach, we tracked the tips of three whiskers in an arc and converted their positions into a TTL output within behavioral time scales, i.e., 10.5 ms. This makes it possible to trigger output based on the movement of individual whiskers, or on the distance between adjacent whiskers. Flexible closed-loop systems like the one deployed here can complement optogenetic approaches and can be used to directly manipulate the relationship between movement and neural activity.
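A closed loop of this kind could be approximated today with the separate DeepLabCut-Live package (dlclive); a minimal sketch follows. The model path, the camera callback, the pixel threshold, and the send_ttl() placeholder for DAQ/GPIO hardware are all assumptions, and the paper's own real-time implementation may differ.

```python
# Minimal closed-loop sketch using DeepLabCut-Live (dlclive), assuming an
# exported DLC model and a camera that yields frames as numpy arrays.
# send_ttl() and get_frame() are hypothetical placeholders for hardware.
import numpy as np
from dlclive import DLCLive

MODEL_PATH = "exported_whisker_model"  # assumed path to an exported DLC model
TRIGGER_DISTANCE = 20.0                # assumed threshold, in pixels

def send_ttl():
    """Placeholder: pulse a digital output line on a DAQ/GPIO device."""
    pass

def get_frame():
    """Placeholder: grab the next frame from a high-speed camera."""
    return np.zeros((256, 256), dtype=np.uint8)

dlc = DLCLive(MODEL_PATH)
dlc.init_inference(get_frame())  # first call loads weights and warms up

while True:
    frame = get_frame()
    pose = dlc.get_pose(frame)   # one (x, y, likelihood) row per whisker tip
    tips = pose[:, :2]           # tracked tips of three whiskers in an arc
    # Trigger on the distance between two adjacent whisker tips.
    if np.linalg.norm(tips[0] - tips[1]) < TRIGGER_DISTANCE:
        send_ttl()
```

In practice, the per-frame latency of the inference call, plus the camera and output hardware, determines whether the loop stays within the behavioral time scale the paper reports.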
Navigation through complex environments requires motor planning, motor preparation, and coordination between multiple sensory-motor modalities. For example, the stepping motion when we walk is coordinated with motion of the torso, arms, head, and eyes. In rodents, movement through the environment is often coordinated with whisking. Here we trained head-fixed mice navigating a floating Airtrack plus maze to overcome their directional preference and use cues indicating the direction of movement expected in each trial. Once cued, mice had to move backward out of a lane, turn in the correct direction, and enter a new lane. In this simple paradigm, as mice begin to move backward, they position their whiskers asymmetrically: whiskers on one side of the face protract, while those on the other side retract. This asymmetry reflects the turn direction. Additionally, on each trial, mice move their eyes conjugately in the direction of the upcoming turn. Not only do they move their eyes, but saccadic eye movement is coordinated with the asymmetric positioning of the whiskers. Our analysis shows that the asymmetric positioning of the whiskers predicts the direction of the upcoming turn at an earlier stage than eye movement does. We conclude that, when mice move or plan to move in complex real-world environments, their motor plan and behavioral state can be read out in the movement of both their whiskers and eyes. Significance statement Natural behavior occurs in multiple sensory and motor dimensions. When we move through our environment, we coordinate the movement of our body, head, eyes, and limbs. Here we show that when mice navigate a maze, they move their whiskers and eyes: they position their whiskers asymmetrically and make saccadic eye movements. The positions of the eyes and whiskers predict the direction in which mice will turn. This work suggests that when mice move through their environment, they coordinate the visuomotor and somatosensory-motor systems.
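To make the timing comparison in this abstract concrete, below is an illustrative sketch of a time-resolved decoding analysis that estimates when whisker asymmetry versus eye position first becomes predictive of turn direction. The synthetic signals, their onset times, the bin size, and the 70% accuracy criterion are all assumptions for demonstration; they are not the paper's data or analysis parameters.

```python
# Illustrative sketch: time-resolved decoding of upcoming turn direction from
# whisker asymmetry versus eye position, on synthetic data with assumed onsets.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
n_trials, n_bins = 300, 40            # e.g., 40 time bins per trial
turn = rng.integers(0, 2, n_trials)   # 0 = left turn, 1 = right turn

def signal_with_onset(onset_bin, gain):
    """Noise everywhere; turn information appears only after onset_bin."""
    x = rng.normal(0, 1, (n_trials, n_bins))
    x[:, onset_bin:] += gain * (2 * turn[:, None] - 1)
    return x

whisker_asym = signal_with_onset(onset_bin=10, gain=1.0)  # assumed earlier onset
eye_position = signal_with_onset(onset_bin=20, gain=1.0)  # assumed later onset

def decode_per_bin(x):
    """Cross-validated decoding accuracy for each time bin separately."""
    return np.array([
        cross_val_score(LogisticRegression(), x[:, [b]], turn, cv=5).mean()
        for b in range(n_bins)
    ])

acc_whisker = decode_per_bin(whisker_asym)
acc_eye = decode_per_bin(eye_position)
# First bin where accuracy clears the assumed 70% criterion:
print("whisker prediction onset (bin):", np.argmax(acc_whisker > 0.7))
print("eye prediction onset (bin):", np.argmax(acc_eye > 0.7))
```

Comparing the first bins at which each signal clears the criterion gives a simple estimate of which signal predicts the turn earlier, which is the form of comparison the abstract describes.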