Download English-language ISI article No. 37576
Persian translation of the article title

Investigation of facial recognition memory and happy and sad facial expression perception: an fMRI study

English title
Investigation of facial recognition memory and happy and sad facial expression perception: an fMRI study
Article code: 37576
Publication year: 1998
English article length: 12 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Psychiatry Research: Neuroimaging, Volume 83, Issue 3, 28 September 1998, Pages 127–138

Persian translation of keywords
Face; Functional; Neuroimaging; Positive; Negative; Emotion
English keywords
Face; Functional; Neuroimaging; Positive; Negative; Emotion
Article preview

English abstract

We investigated facial recognition memory (for previously unfamiliar faces) and facial expression perception with functional magnetic resonance imaging (fMRI). Eight healthy, right-handed volunteers participated. For the facial recognition task, subjects made a decision as to the familiarity of each of 50 faces (25 previously viewed; 25 novel). We detected signal increase in the right middle temporal gyrus and left prefrontal cortex during presentation of familiar faces, and in several brain regions, including bilateral posterior cingulate gyri, bilateral insulae and right middle occipital cortex, during presentation of unfamiliar faces. Standard facial expressions of emotion were used as stimuli in two further tasks of facial expression perception. In the first task, subjects were presented with alternating happy and neutral faces; in the second task, subjects were presented with alternating sad and neutral faces. During presentation of happy facial expressions, we detected a signal increase predominantly in the left anterior cingulate gyrus, bilateral posterior cingulate gyri, medial frontal cortex and right supramarginal gyrus, brain regions previously implicated in visuospatial and emotion processing tasks. No brain regions showed increased signal intensity during presentation of sad facial expressions. These results provide evidence for a distinction between the neural correlates of facial recognition memory and perception of facial expression but, whilst highlighting the role of limbic structures in perception of happy facial expressions, do not allow the mapping of a distinct neural substrate for perception of sad facial expressions.

English introduction

Intact processing of faces is critical in social interaction, in view of the enormous amount of information contained in a face (Sergent, 1988). Such processing can be subdivided into various dissociable categories: for example, the recognition of facial expression and the recognition of facial familiarity. Whilst the role of the right hemisphere in facial perception has been highlighted by neuropsychological and functional imaging studies (Etcoff, 1984a and Etcoff, 1984b; Sergent, 1988; Sergent et al., 1992; Puce et al., 1995 and Puce et al., 1996; Kanwisher et al., 1997), there is evidence for the dissociation of facial recognition memory and facial expression perception in terms of the neural substrate underlying these two tasks (George et al., 1993; Sergent et al., 1994). The exact nature of the neural substrate underlying each task remains, however, unclear.

Facial familiarity perception and unfamiliar face matching have been linked to the right hemisphere (Young et al., 1993), and facial recognition memory for previously unfamiliar faces has been associated with left hippocampus activity (Kapur et al., 1995). Facial working memory has been linked with the left hemisphere (McIntosh et al., 1996), bilateral occipital (extrastriate) cortex (Courtney et al., 1996 and Courtney et al., 1997), and right prefrontal cortex (Haxby et al., 1996), the latter confirming previous studies linking right prefrontal cortex with episodic memory retrieval (Tulving et al., 1994a and Tulving et al., 1994b; Shallice et al., 1994; Moscovitch et al., 1995; Fletcher et al., 1995; Buckner et al., 1996).

There has been much recent interest in the role of the amygdala in perception of fearful facial expressions (Adolphs et al., 1994 and Adolphs et al., 1995; Young et al., 1995; Morris et al., 1996; Breiter et al., 1996; Whalen et al., 1998). Studies of perception of facial expression per se, however, have yielded conflicting results. Lesion studies have implicated both the left hemisphere (Young et al., 1993) and the right hemisphere (Adolphs et al., 1996) in the task, with the latter study demonstrating the role of the right hemisphere in perception of negative emotions, in particular sadness and fear. Functional imaging studies have implicated the right hemisphere (Gur et al., 1994; George et al., 1996), bilateral cingulate cortex (Sergent et al., 1994), and right anterior cingulate and bilateral inferior frontal cortex (George et al., 1993) in recognition of positive and negative facial expressions. Furthermore, bilateral limbic and paralimbic structures have been implicated in induction of sad emotion, with widespread decreases of cortical blood flow during induction of happy emotion (George et al., 1995). More recent studies have demonstrated activation in the amygdala in response to unpleasant emotional stimuli (Lane et al., 1997), and during induction of both happy and sad emotions (Schneider et al., 1997).

The nature of the neural substrate underlying happy and sad facial expression perception, and the distinction between this and the neural substrate for facial recognition memory, thus remains unclear. In the current study, we used functional magnetic resonance imaging (fMRI) to investigate brain function during the two tasks of facial recognition memory (for previously unfamiliar faces) and facial expression perception, and to investigate more closely the neural correlates of perception of happy and sad facial expression.
On the basis of the literature reviewed above, it was hypothesised that:

1. facial recognition memory and facial expression perception would activate different brain regions;
2. facial recognition memory would activate the left hippocampus, in addition to the right prefrontal cortex and bilateral occipital cortex;
3. perception of sad facial expressions would specifically activate bilateral limbic structures, the right hemisphere more than the left.

Finally, the existing literature does not permit a clear prediction for the neural substrate underlying perception of happy facial expressions. In light of earlier research (George et al., 1995), we hypothesised that this pattern of activation would be distinct from that for sad facial expression perception.

English results

3.1. Performance

Subjects were able to distinguish familiar from unfamiliar faces with a mean accuracy of 61% (S.D. = 10.03%); one subject performed at chance level (50%), and the other seven performed better than chance. Subjects were able to distinguish happy from neutral faces with a mean accuracy of 74% (S.D. = 20.0%), and sad from neutral faces with a mean accuracy of 91% (S.D. = 14.6%).

3.2. Generic brain activation maps

3.2.1. Facial recognition memory task

The main regional foci of generic activation in this task included the right middle temporal gyrus [Brodmann area (BA) 21], bilateral posterior cingulate gyri (BA 30/31), right supramarginal gyrus (BA 40), right middle occipital cortex, left postcentral gyrus (BA 3), right premotor cortex (BA 6), bilateral insulae, left medial prefrontal cortex (BA 9) and left dorsolateral prefrontal cortex (BA 45). Phase analysis revealed that the signal increases in the right middle temporal gyrus, left postcentral gyrus, left medial prefrontal cortex and left dorsolateral prefrontal cortex occurred during presentation of familiar faces, whereas the signal increases in bilateral posterior cingulate gyri, right supramarginal gyrus, right middle occipital cortex, bilateral insulae and right premotor cortex occurred during presentation of unfamiliar faces (Fig. 1A and Table 1).

Fig. 1. Brain activations during (a) the facial recognition memory task, (b) the happy facial expression perception task and (c) the sad facial expression perception task.
(a) Top: Generic brain activations in eight right-handed normal subjects during the facial recognition memory task. The grey-scale template was calculated by voxel-by-voxel averaging of the individual EPI images of all subjects, following transformation into Talairach space. Three transverse sections are shown at 1.5 mm below (left), 4 mm above (middle) and 15 mm above (right) the AC–PC line. The right side of the brain is shown on the left of each section, the left side on the right. Voxels have a probability of false activation ≤0.004. Activated voxels with a signal maximum during presentation of familiar faces are coloured red, and are demonstrated in the right middle temporal gyrus (Talairach co-ordinates: x=58, y=−31, z=4; BA 21), left dorsolateral prefrontal cortex (x=−32, y=31, z=15; BA 45) and left medial prefrontal cortex (x=−6, y=50, z=15; BA 9). Activated voxels with a signal maximum during presentation of unfamiliar faces are coloured blue, and are demonstrated in the right middle occipital cortex (x=12, y=−78, z=−1.5; BA 18) and bilateral posterior cingulate gyri (x=−6, y=−58, z=15; BA 31; and x=14, y=−50, z=15; BA 30).
(b) Middle: Generic brain activations in eight right-handed subjects during the happy facial expression perception task. The grey-scale template is as in Fig. 1A. Four transverse sections are shown at 4 mm (left), 9.5 mm, 15 mm and 20.5 mm (right) above the AC–PC line. The right side of the brain is shown on the left of each section, the left side on the right. Voxels have a probability of false activation ≤0.004. Activated voxels with a signal maximum during presentation of faces with a happy expression are coloured red, and are demonstrated in the left anterior cingulate gyrus (x=−3, y=44, z=4; BA 24), bilateral medial frontal cortex (x=0, y=39, z=9.5; BA 32), right putamen (x=23, y=−14, z=9.5), left supramarginal gyrus (x=−43, y=−14, z=15; BA 40) and bilateral posterior cingulate gyri (x=14, y=−56, z=20.5; BA 30; and x=−14, y=−56, z=15; BA 31). Activated voxels with a signal maximum during presentation of faces with a neutral expression are coloured blue, and are demonstrated in the left caudate nucleus (x=−20, y=−11, z=20.5).
(c) Bottom: Generic brain activations in seven right-handed subjects during the sad facial expression perception task. The grey-scale template is as in Fig. 1A. Three transverse sections are shown at 4 mm (left), 15 mm (middle) and 20.5 mm (right) above the AC–PC line. The right side of the brain is shown on the left of each section, the left side on the right. Voxels have a probability of false activation ≤0.004. Activated voxels with a signal maximum during presentation of faces with a neutral expression are coloured blue, and are demonstrated in the left middle occipital cortex (x=−20, y=−89, z=−7 and x=−12, y=−75, z=−7; BA 18), right dorsolateral prefrontal cortex (x=46, y=31, z=15; BA 45) and left supramarginal gyrus (x=−40, y=−17, z=20.5 and x=−49, y=−33, z=20.5; BA 40).

Table 1. Familiar vs. unfamiliar faces: generically activated brain regions

Region (approximate Brodmann area)          | Side | x^a | y^a | z^a | No. of voxels | P^b     | Condition of signal increase^c
Middle temporal gyrus (21)                  | R    | 58  | −31 | 4   | 12            | 0.00001 | Familiar
Posterior cingulate gyrus (30/31)           | L    | −6  | −58 | 15  | 8             | 0.00001 | Unfamiliar
                                            | R    | 14  | −50 | 15  | 2             | 0.00002 |
Postcentral gyrus (3)                       | L    | −49 | −19 | 26  | 8             | 0.00003 | Familiar
Supramarginal gyrus (40)                    | R    | 32  | −28 | 31  | 6             | 0.00004 | Unfamiliar
Middle occipital (extrastriate) cortex (18) | R    | 12  | −78 | −2  | 6             | 0.00002 | Unfamiliar
Premotor cortex (6)                         | R    | 35  | 0   | 26  | 3             | 0.0003  | Unfamiliar
Insula                                      | L    | −29 | −3  | 20  | 3             | 0.00002 | Unfamiliar
                                            | R    | 35  | 0   | 20  | 3             | 0.0005  |
Dorsolateral prefrontal cortex (45)         | L    | −32 | 31  | 15  | 2             | 0.0005  | Familiar
Medial prefrontal cortex (9)                | L    | −6  | 50  | 15  | 2             | 0.00003 | Familiar

a Talairach co-ordinates refer to the voxel with the maximum fundamental power quotient (FPQ) in each regional cluster.
b All such voxels were identified by a one-tailed test of the null hypothesis that median FPQ is not determined by experimental design; the probability threshold for activation was P ≤ 0.004.
c Signal increase was detected during presentation of either familiar or unfamiliar faces.

3.2.2. Happy facial expression perception task

The main regional foci of generic activation in this task included the left anterior cingulate gyrus (BA 24), bilateral medial frontal cortex, bilateral posterior cingulate gyri (BA 23/30/31), left supramarginal gyrus (BA 40), right putamen, left caudate nucleus and right dorsolateral prefrontal cortex (BA 46). Phase analysis revealed that the signal increase in all of the above brain regions other than the left caudate nucleus occurred during presentation of happy rather than neutral facial expressions (Fig. 1B and Table 2A).

Table 2. Generically activated brain regions

A: Happy vs. neutral facial expressions

Region (approximate Brodmann area)   | Side | x^a | y^a | z^a | No. of voxels | P^b     | Condition of signal increase^c
Anterior cingulate gyrus (24)        | L    | −3  | 44  | 4   | 14            | 0.00001 | Happy
Posterior cingulate gyrus (23/30/31) | R    | 14  | −56 | 20  | 13            | 0.00001 | Happy
                                     |      | 3   | −61 | 15  | 11            | 0.00001 |
                                     | L    | −14 | −56 | 15  | 4             | 0.0001  |
Supramarginal gyrus (40)             | L    | −43 | −14 | 15  | 9             | 0.00001 | Happy
                                     |      | −40 | −17 | 20  | 4             | 0.0004  |
Medial frontal cortex (32)           | R/L  | 0   | 39  | 9   | 7             | 0.00001 | Happy
Putamen                              | R    | 23  | −14 | 9   | 6             | 0.00005 | Happy
                                     |      | 26  | −19 | 9   | 3             | 0.0004  |
Caudate nucleus                      | L    | −20 | −11 | 20  | 4             | 0.0003  | Neutral

B: Sad vs. neutral facial expressions

Region (approximate Brodmann area)   | Side | x^a | y^a | z^a | No. of voxels | P^b     | Condition of signal increase^c
Supramarginal gyrus (40)             | L    | −40 | −17 | 20  | 8             | 0.0001  | Neutral
                                     |      | −49 | −33 | 20  | 4             | 0.00006 |
Dorsolateral prefrontal cortex (45)  | R    | 46  | 31  | 15  | 7             | 0.00007 | Neutral
Middle occipital cortex (18)         | L    | −20 | −89 | −7  | 5             | 0.0001  | Neutral
                                     |      | −12 | −75 | −7  | 2             | 0.0004  |
                                     |      | −20 | −78 | −7  | 2             | 0.0004  |
                                     |      | −9  | −83 | 4   | 2             | 0.0004  |

a Talairach co-ordinates refer to the voxel with the maximum fundamental power quotient (FPQ) in each regional cluster.
b All such voxels were identified by a one-tailed test of the null hypothesis that median FPQ is not determined by experimental design; the probability threshold for activation was P ≤ 0.004.
c Signal increase was detected during presentation of either the emotional (happy or sad) or the neutral facial expressions.

An example of a fitted time series obtained in this task is shown for the signal increase in the bilateral posterior cingulate gyri (Fig. 2A).

Fig. 2. The median values of the standardised amplitudes of sine and cosine waves at both the fundamental and the first harmonic frequencies were computed for each generically activated brain region. Multiplied by the appropriate columns of the design matrix, these parameters defined a fitted time series for each activated brain region (see Section 2). (a) The fitted time series demonstrates the pattern of signal intensity change at the fundamental frequency (broken line) and the additional first harmonic modulation (solid line) in the bilateral posterior cingulate gyri (BA 23, 30 and 31) in the happy facial expression perception task. (b) The fitted time series demonstrates the pattern of signal intensity change at the fundamental frequency (broken line) and the additional first harmonic modulation (solid line) in the left supramarginal gyrus (BA 40) in the sad facial expression perception task.

3.2.3. Sad facial expression perception task

The main regional foci of generic activation in this task included the left supramarginal gyrus (BA 40), right dorsolateral prefrontal cortex (BA 45) and left middle occipital cortex (BA 18). Phase analysis revealed that the signal increase in all of these brain regions occurred during presentation of neutral rather than sad facial expressions (Fig. 1C and Table 2B). An example of a fitted time series obtained in this task is shown for the signal increase in the left supramarginal gyrus (Fig. 2B).
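The fitted time series in Fig. 2 are built from sine and cosine regressors at the fundamental and first harmonic frequencies of the periodic (block) design, and the phase analysis attributes each signal increase to the condition in which the fitted fundamental component peaks. The Python sketch below only illustrates that idea: the repetition time, run length, cycle duration and amplitude values are assumptions chosen for illustration, not parameters reported in the paper, and the phase rule is a simplified stand-in for the authors' analysis.

```python
import numpy as np

# Assumed acquisition/design parameters (illustrative only, not from the paper)
TR = 3.0        # repetition time in seconds
N_VOLS = 100    # number of volumes in the run
PERIOD = 60.0   # duration of one ON/OFF cycle of the block design, in seconds

t = np.arange(N_VOLS) * TR
f0 = 1.0 / PERIOD     # fundamental frequency of the design
f1 = 2.0 * f0         # first harmonic

# Design matrix: sine/cosine pairs at the fundamental and first harmonic
# frequencies, plus a constant column for the baseline signal level.
X = np.column_stack([
    np.sin(2 * np.pi * f0 * t), np.cos(2 * np.pi * f0 * t),
    np.sin(2 * np.pi * f1 * t), np.cos(2 * np.pi * f1 * t),
    np.ones_like(t),
])


def fitted_time_series(amplitudes):
    """Multiply fitted sine/cosine amplitudes by the corresponding
    design-matrix columns to reconstruct a fitted time series (cf. Fig. 2)."""
    return X @ amplitudes


def condition_of_signal_increase(amplitudes, on="happy", off="neutral"):
    """Crude phase read-out at the fundamental frequency: the component
    a*sin(w*t) + b*cos(w*t) peaks at t = phi / w with phi = arctan2(a, b).
    If that peak falls in the first half of the cycle (the ON epoch under
    the assumed design), attribute the signal increase to the ON condition,
    otherwise to the OFF condition."""
    a, b = amplitudes[0], amplitudes[1]
    phi = np.arctan2(a, b)
    peak_time = (phi / (2 * np.pi * f0)) % PERIOD
    return on if peak_time < PERIOD / 2 else off


# Example with made-up amplitudes for a single region:
beta = np.array([0.8, 0.3, 0.15, -0.05, 100.0])
ts = fitted_time_series(beta)   # fitted signal, one value per volume
print(condition_of_signal_increase(beta, "happy", "neutral"))
```

Under these assumptions, a region whose fitted fundamental component peaks during the emotional-face epoch would be reported as showing a signal increase in the emotional condition, which mirrors how the entries in Tables 1 and 2 are labelled "Familiar/Unfamiliar" or "Happy/Neutral".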