
Mechanisms Which Underlie Face-vocalization Integration in VLPFC

The perception and integration of congruent communication stimuli are necessary for the appropriate evaluation and comprehension of an audio-visual message. Our studies have shown that there are several types of multisensory interactions: linear and non-linear, enhanced and inhibitory (Sugihara et al., 2006). A number of factors affect sensory integration, including temporal coincidence and stimulus congruency, which are thought to underlie the successful merging of two intermodal stimuli into a coherent perceptual representation; this merging is especially important in speech perception. We have begun to explore the role of the prefrontal cortex in encoding congruent face-vocalization stimuli in order to understand the essential components of face-vocalization integration. To this end, we have examined changes in neural activity when face-vocalization pairs are mismatched and presented either during fixation or in an audio-visual non-match-to-sample task. Our data indicate that non-human primates can detect these mismatches, and that single cells in VLPFC change their firing in response to incongruent and to temporally offset face-vocalization stimuli compared with congruent audiovisual stimuli. Continued analysis and recordings are aimed at further defining the role of VLPFC in the integration of audio-visual face and vocalization information for the purpose of communication.

Examples of Multisensory Interactions in PFC Neurons

Types of A/V Interactions in PFC Neurons. The raster/histogram responses from one cell (top) and a second cell (bottom) are shown, with a bar graph of each cell's response at the left. The top row shows an example of a single cell that exhibited multisensory enhancement, where the response to the combined face + vocalization (AV) condition is increased above the response to either the face or the vocalization alone (A or V). When this enhancement is greater than the linear sum of the two unimodal responses being combined, it is said to be superadditive. In the bottom row, a different cell had a suppressed response in the combined (AV) condition, in which the face and vocalization stimuli were presented simultaneously.
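The enhancement and suppression categories described above can be sketched as a simple comparison of mean evoked firing rates. This is a toy illustration, not the lab's analysis code: the function name, the example rates, and the thresholds are all hypothetical, and a real analysis would subtract baseline activity and apply statistical tests rather than raw comparisons.

```python
def classify_av_interaction(rate_a: float, rate_v: float, rate_av: float) -> str:
    """Label a neuron's AV response relative to its unimodal responses.

    rate_a, rate_v, rate_av: mean evoked firing rates (spikes/s) for the
    auditory-alone, visual-alone, and combined audio-visual conditions.
    """
    best_unimodal = max(rate_a, rate_v)
    if rate_av > rate_a + rate_v:
        # AV response exceeds the linear sum of the unimodal responses
        return "superadditive enhancement"
    if rate_av > best_unimodal:
        # above the best unimodal response, but at or below the linear sum
        return "enhancement"
    if rate_av < best_unimodal:
        # combined presentation reduces the response
        return "suppression"
    return "no interaction"

# Hypothetical example rates (spikes/s):
print(classify_av_interaction(10.0, 8.0, 25.0))  # superadditive enhancement
print(classify_av_interaction(10.0, 8.0, 12.0))  # enhancement
print(classify_av_interaction(10.0, 8.0, 5.0))   # suppression
```

Under this sketch, the top cell in the figure would fall into one of the two enhancement categories and the bottom cell into the suppression category.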

Our work is aimed at determining whether multimodal prefrontal neurons detect:

Changes in semantic meaning

If prefrontal neurons are sensitive to the semantic congruence of a vocalization and its corresponding facial gesture, we expect a difference in the neuronal response to semantically congruent versus semantically incongruent AV pairs.

Changes in identity

By mismatching a vocalization made by caller A with the facial gesture of caller B issuing the same call type, we can test the sensitivity of prefrontal neurons to caller identity and to the subtle acoustic changes that accompany a change in caller. We expect that a neuron sensitive to such changes will be modulated by the acoustic differences between callers.

Changes in auditory or visual features

If VLPFC neurons are sensitive to acoustic features, then alterations of these features in the vocalization of an AV pair will cause a significant change in neuronal response compared to the congruent audio-visual stimulus. We expect that this will occur in predominantly auditory/multisensory neurons. Similar alterations of the visual stimulus in an AV pair should evoke changes when compared to the response to the congruent AV pair for neurons that are predominantly visual.

Temporal offset

A number of brain regions are sensitive to the temporal coincidence of cross-modal events, including the superior colliculus and frontal-lobe speech regions in the human brain. We expect that prefrontal neurons will exhibit a significant change in response to temporally offset stimuli.

Recording Sites of Multisensory Neurons - Single Subject

Recording sites and distribution of the multisensory cells in one subject. a. An enlarged view of the prefrontal cortex indicating the location of the recording cylinder (dashed line). The electrode penetrations were confined to the region indicated by the grid, and the locations of significantly responsive cells are plotted on it: unimodal auditory neurons (filled blue circles), unimodal visual neurons (filled yellow diamonds), and multisensory neurons (open red squares). b. An expanded view of the distribution of the unimodal and multisensory cells in the recording grid, indicating that multisensory and unimodal neurons were found in overlapping locations in the ventrolateral prefrontal cortex. ps, principal sulcus; as, arcuate sulcus; AP, anteroposterior; ML, mediolateral.
