Neuronal Mechanisms of Attention
The phrase “visual attention” covers a broad range of phenomena describing the process by which our brains select the most relevant information from the vast amount of visual input bombarding our senses at any given time. When we study visual attention in the laboratory, we mainly study covert visual spatial attention, the allocation of attention to a region of visual space away from the center of gaze. Covert visual spatial attention is probably an important emergent property in social animals, including humans, that care about evaluating facial expressions but are wary of making direct eye contact. We design visual attention tasks that involve shifts in the locus of covert spatial attention on different trials. Subjects are cued to attend to particular spatial locations and are rewarded for detecting subtle changes in visual stimulus features at attended locations. To assess attentional behavior, the cue is occasionally invalid, i.e., the stimulus change occurs at the uncued location. On invalidly cued trials, subjects either miss the stimulus change altogether or react more slowly when detecting it at the uncued location. This type of cueing paradigm is called Posner cueing (Posner et al., 1980).
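For concreteness, the behavioral validity effect in a Posner cueing paradigm can be summarized by comparing hit rates and reaction times on validly versus invalidly cued trials. The sketch below is purely illustrative and uses hypothetical trial records; it is not analysis code from the studies cited here.

```python
# Illustrative sketch: quantifying a Posner cue-validity effect from trial data.
# All variable names and example values are hypothetical, for exposition only.
import statistics

# Each trial: whether the cue was valid, whether the change was detected,
# and the reaction time in milliseconds (None if the change was missed).
trials = [
    {"cue_valid": True,  "detected": True,  "rt_ms": 310},
    {"cue_valid": True,  "detected": True,  "rt_ms": 295},
    {"cue_valid": False, "detected": True,  "rt_ms": 380},
    {"cue_valid": False, "detected": False, "rt_ms": None},
    # ... many more trials per condition in a real experiment
]

def summarize(trials, cue_valid):
    """Return hit rate and mean reaction time for one cue-validity condition."""
    subset = [t for t in trials if t["cue_valid"] == cue_valid]
    hit_rate = sum(t["detected"] for t in subset) / len(subset)
    rts = [t["rt_ms"] for t in subset if t["detected"]]
    return hit_rate, statistics.mean(rts)

valid_hit, valid_rt = summarize(trials, cue_valid=True)
invalid_hit, invalid_rt = summarize(trials, cue_valid=False)

# The validity effect: slower responses and more misses on invalidly cued trials.
print(f"Validity effect on RT: {invalid_rt - valid_rt:.0f} ms")
print(f"Hit rates: valid {valid_hit:.2f}, invalid {invalid_hit:.2f}")
```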
Covert spatial attention tasks have been used to study the impact of attention on neuronal activity throughout the visual system. Visual attention directed toward a stimulus overlapping the receptive fields of recorded neurons tends to increase the firing rate of those neurons compared to conditions in which the same stimulus is in the receptive field but subjects attend elsewhere. This attentional modulation of firing rate is characterized by an attention index, a normalized measure of the change in neuronal firing rate across attention conditions (e.g., attending toward versus away from the stimulus in the receptive field). Interestingly, attentional modulation of firing rate scales along the visual hierarchy, with more robust effects observed in higher-order visual areas such as MT, V4, and FEF, and relatively modest effects observed in the visual thalamus (LGN) and primary visual cortex (V1). In all of these structures, however, there is large variability in the attentional effects observed across individual neurons. This variability suggests that there may be rules governing whether, and how much, attention alters the activity of distinct sub-populations of neurons. Furthermore, attentional modulation of firing rate may not be the best readout of attention, because the attention index is measured over long time scales (1-2 seconds). We have shown that attentional modulation of neuronal activity is dynamic (Mock et al., 2018) and that attention can alter the efficacy of communication between neurons in the early visual pathways (Briggs et al., 2013; Hembrook-Short et al., 2019). We are currently exploring the biophysical, cellular, and circuit mechanisms by which attention alters neuronal activity.
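Although the exact formula is not given above, the attention index is commonly computed as the normalized difference between firing rates in the attend-toward and attend-away conditions. The sketch below assumes that conventional form; the function name and example rates are ours, for illustration only.

```python
# Minimal sketch of an attention index, assuming the common normalized-difference
# convention: AI = (R_attend_in - R_attend_out) / (R_attend_in + R_attend_out).
# Firing rates are in spikes/s; the values below are hypothetical.

def attention_index(rate_attend_in: float, rate_attend_out: float) -> float:
    """Normalized measure of attentional rate modulation, ranging from -1 to 1.
    Positive values indicate higher firing when attention is directed toward
    the stimulus in the neuron's receptive field."""
    return (rate_attend_in - rate_attend_out) / (rate_attend_in + rate_attend_out)

# Example: a neuron firing at 24 spikes/s with attention directed toward its
# receptive field and 20 spikes/s with attention directed away.
print(attention_index(24.0, 20.0))  # ~0.09, a modest attentional enhancement
```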
Our ongoing research is motivated by another recent study from our lab that highlights the diversity of attention effects observed within a single visual cortical area (Hembrook-Short et al., 2017). Specifically, we discovered that attentional modulation of V1 neurons is not uniform within the attended spatial region. Instead, attention selectively increases the firing rate of neurons that encode the stimulus features needed to complete the task successfully. These findings suggest that a general attention “spotlight” or purely spatial gain model of attention is too limited. Instead, models of attention must incorporate both spatial and feature attention components to capture the variability in attentional modulation of neuronal activity, even among neurons responsive to the same attended spatial location. Ongoing work will test this hypothesis explicitly by recording from the same neuronal populations while subjects perform multiple attention tasks involving detection of different visual stimulus features. We predict that attention will selectively affect the responses of neurons tuned for task-relevant features. Importantly, by studying attention effects across different attention tasks and across different brain regions, we can further establish the generalizability of the rules governing attentional modulation.
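As a purely illustrative aside, the kind of combined model this hypothesis calls for can be sketched as a response gain that depends on both the spatial overlap between the attended location and a neuron's receptive field and the match between the attended feature and the neuron's tuning. The toy model below is our own assumption for exposition, not a model tested in the studies cited above.

```python
# Toy sketch of a combined spatial + feature gain model of attention.
# This is an illustrative assumption, not the specific model tested in the
# studies cited above; the gain values and tuning measures are hypothetical.

def attended_response(baseline_rate: float,
                      spatial_overlap: float,   # 0-1: attended location vs. receptive field
                      feature_match: float,     # 0-1: attended feature vs. neuron's tuning
                      spatial_gain: float = 0.2,
                      feature_gain: float = 0.3) -> float:
    """Firing rate under attention as a multiplicative combination of spatial
    and feature gains. A purely spatial 'spotlight' model corresponds to
    feature_gain = 0, which cannot produce feature-selective modulation among
    neurons sharing the same attended location."""
    gain = 1.0 + spatial_gain * spatial_overlap \
               + feature_gain * spatial_overlap * feature_match
    return baseline_rate * gain

# Two neurons with receptive fields at the attended location: one tuned to the
# task-relevant feature (feature_match=1), one tuned to an irrelevant feature.
print(attended_response(20.0, spatial_overlap=1.0, feature_match=1.0))  # 30.0 spikes/s
print(attended_response(20.0, spatial_overlap=1.0, feature_match=0.0))  # 24.0 spikes/s
```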