Title

Modification of N170 by different emotional expression of schematic faces

Article code: 37946, English PDF, 7 pages
Source

Publisher: Elsevier - Science Direct

Journal: Biological Psychology, Volume 76, Issue 3, October 2007, Pages 156–162

Keywords

Emotional expression of faces; Event-related potentials; N170; N400

Abstract

The N170 is widely regarded as a face-sensitive potential, having its maximum at occipito-temporal sites, with right-hemisphere dominance. However, it is debatable whether the N170 is modulated by different emotional expressions of a face. The aim of this study was to analyze the N170 elicited by schematic happy and angry faces when the emotional expression is semantically processed. To investigate the influence of different emotional expressions of schematic faces, we used a prime-probe procedure with the N400 effect as an indicator of semantic processing. Eighteen subjects were presented with the German word for “happiness” or “anger”, followed by a happy or an angry face. The word–face pair could be congruent or incongruent in emotional meaning. Subjects were instructed to compare the emotional meaning of the words and faces and to count the congruent trials. Event-related potentials were recorded from 124 sites. Congruent faces elicited a smaller negativity in the N400 time range than incongruent faces, indicating that the facial emotional expression was cognitively processed. The face-sensitive N170 was most pronounced at posterior and occipital sites, and N170 amplitudes were larger following angry than happy faces. It is concluded that different emotional expressions of schematic faces can modulate the N170.

Introduction

Recognizing a stimulus as a human face is indicated by a negative deflection in the 120–220 ms range with an average latency of 170 ms, thus labeled N170 (e.g. Bentin et al., 1996). It is most pronounced at posterior-lateral sites with right-hemisphere dominance (Rossion et al., 1999 and Eimer and Holmes, 2002). Because it was not found when non-facial stimuli such as cars or houses were presented as control stimuli, it was concluded that the N170 is a face-sensitive potential, indicating a process of structural encoding of faces prior to their recognition (Eimer, 2000 and Sagiv and Bentin, 2001). Studies performed with various kinds of facial stimuli (real, morphed, schematic, and distorted faces) demonstrated that the N170 amplitude is sensitive to certain facial features such as completeness (e.g. Eimer, 2000) or inversion (e.g. Itier and Taylor, 2002). However, results on a possible influence of different emotional expressions on the N170 are still contradictory.

A majority of studies found the face-sensitive N170 to be unaffected by facial emotional expression. Eimer et al. (2003) presented their subjects with photographs of faces showing six basic emotions (anger, disgust, fear, happiness, sadness, and surprise). Compared to a neutral face, none of these faces elicited greater N170 amplitudes at T5 and T6, where the N170 was expected to be maximal. Directing attention either to photographs of fearful and neutral faces or to photographs of houses shown together with them, Holmes et al. (2003) found the face-sensitive N170 at T5 and T6 to be enhanced when faces were attended, but neither its amplitude nor its latency was affected by emotional facial expression. Ashley et al. (2004) presented their subjects with happy, fearful, disgusted and neutral faces, also using houses as control stimuli. The N170 was reactive to orientation (upright vs. inverted) but not selective for any specific emotional expression, thus being sensitive only to structural encoding.
A similar conclusion was reached by Balconi and Lucchiari (2005), who probed the effect of natural and morphed facial stimuli on the N170. A comparison of fearful, happy and sad faces with an emotionally neutral expression as control yielded no N170 modulation by facial expression or its emotional content. The authors thus restrict the N170 to the role of an ERP marker for the structural encoding module proposed by Bruce and Young (1986). Holmes et al. (2005) presented original and degraded photographs of fearful and neutral faces, compared with those of houses. At electrodes P7 and P8, N170 amplitudes were enhanced for faces relative to houses, irrespective of whether they were presented as original or degraded, but showed no systematic difference between fearful and neutral faces. No significant effects of facial emotional expression (scared, sad, angry and happy compared to neutral photographs) on N170 amplitude or latency at T5 or T6 were observed in children and adults (with Asperger's syndrome and normal controls) by O'Connor et al. (2005). In their discussion, these authors acknowledge the possibility that photographs of "neutral" faces may recruit emotion-processing circuitry as well, since they are social stimuli and are often ambiguous in emotional meaning. This may have worked against establishing significant N170 differences between neutral and emotional facial expressions, especially when photographs of real faces were used as stimuli. One possible explanation for the absence of an emotion effect could be that the N170 occurs too early to be sensitive to the emotion and valence of facial expression. However, this is not very likely, since an even earlier ERP component, a fronto-central positivity starting between 80 and 150 ms after stimulus onset, was found to be larger after fearful compared to neutral faces (Batty and Taylor, 2003, Eimer and Holmes, 2002, Eimer et al., 2003 and Holmes et al., 2005).
From an ethological point of view, facial emotional expressions should be processed as early as possible during face recognition, since they may constitute biologically salient stimuli, enabling more rapid communication than language (Batty and Taylor, 2003). For example, an angry facial expression may sensitize the organism to an environmental threat (Izard, 1979). Such biological significance was recently confirmed by Kolassa and Miltner (2006), who presented angry, happy, and neutral faces to spider phobics, social phobics, and controls. Subjects had to identify either the gender of the face or the expressed emotion. When the emotion was in focus, social phobics showed larger N170 amplitudes over right temporo-parietal sites than controls and spider phobics when identifying an angry face. In fact, there is further empirical evidence for an N170 modulation by the emotional expression of real and morphed faces. Campanella et al. (2002) used a priming procedure to investigate the influence of the emotional meaning of faces on the N170. Faces were presented pairwise, with the second face being identical or morphed, displaying either the same or a different emotional meaning (fearful or happy). No N170 differences were obtained for the first stimulus of the pair, but the N170 following the second face was enlarged when it displayed a different emotional meaning, showing its maximum at T5 and T6. Differences in the N170 between emotions were also observed by Batty and Taylor (2003). They presented photographs of faces expressing six basic emotions (anger, disgust, fear, happiness, sadness, and surprise) plus neutral expressions, using photographs of cars, planes and butterflies as additional control stimuli. Latencies, seen around 140 ms, were shorter for positive than for negative emotions, and larger N170 amplitudes at P7 and P8 were elicited by fearful compared to surprised or neutral faces.
While no lateralization effects were found in these two studies, Stekelenburg and de Gelder (2004), presenting photographs of faces, bodies and shoes, observed a more negative N170 to fearful compared to neutral faces at P7 only, but not at P8. Differences in the N170 due to facial emotional expressions were also obtained with schematic line drawings of faces as stimuli. Eger et al. (2003) used mainly changes in mouth angles and eyebrow tilts to generate positive, negative, or neutral emotional expressions of schematic faces. A non-meaningful arrangement of facial features ("scrambled face") served as masking stimulus in a dichoptic presentation. Identical stimuli for both eyes were presented in the control condition. Compared to checkerboard stimuli, a global field power measure located a face-sensitive negative ERP component, equivalent to the face-related N170, at posterior rather than anterior sites and predominantly over the right hemisphere. The face with a negative emotional expression yielded larger field strength in the N170 component than positive and neutral faces. In an earlier study performed in our laboratory with a small set of electrodes, we found the N170 to be sensitive to facial emotional expression but not to discriminate between different emotions (Boucsein et al., 2001). Schematic faces with angry, happy, and neutral expressions were generated using changes in mouth angles and eyebrow tilts. A re-arrangement of the facial features as a clock with moving hands, paralleling the movement in the faces, and a "scrambled face" served as control stimuli. Compared to these controls, schematic faces yielded a larger ERP component in the N170 range. When the facial expression changed from neutral to either happy or angry, the N170 amplitude was higher than after a change from any emotional to a neutral expression. This emotion sensitivity was most pronounced at temporal sites (T5 and T6), with right-hemisphere dominance.
However, no differences between positive and negative expressions were found. We conducted the present study to find out whether an angry expression of a face can exert an influence on the N170, using a happy face as control. Our hypothesis was that an angry face should elicit a higher N170 amplitude, indicating increased processing, because of its biological salience as a possibly threatening stimulus, compared to a happy face, which requires no such early processing of its emotional content. Since previous studies did not ensure that the emotional content of the facial expression was processed, differing amounts of processing of the emotional meaning of faces may have contributed to the contradictory results. To ensure that the emotional meanings of the faces were cognitively processed, we introduced emotion words as priming stimuli for the facial emotional expressions. Words and expressed emotions could be congruent or incongruent. We used the N400 effect as a manipulation check. The N400 effect was first described as an indicator of the semantic processing of incongruent words (Kutas and Hillyard, 1980). It is calculated as an ERP difference potential between incongruent and congruent stimuli in the 200–500 ms range, which becomes more negative when a stimulus disturbs a previously primed semantic context than when it does not. The N400 effect was used by Münte et al. (1998) for analyzing the cognitive processing of facial stimuli. They compared ERPs elicited by an identity mismatch with those elicited by an emotional expression mismatch. Two different groups of subjects viewed pairwise presented black-and-white photographs of faces and performed either an expression or an identity task. In the expression task, an angry, smiling or surprised face was presented first, followed by an identical or a different expression. The subjects' task was to discriminate between the two conditions.
The comparison between identity and expression tasks revealed differences in the latency of the N400, with the negativity related to identity mismatch occurring earlier (200–400 ms) than the negativity related to expression mismatch (400–500 ms). The spatial distribution of the N400 also differed between the two tasks, revealing a maximum at fronto-central sites in the identity-matching task and a centroparietal maximum in the expression-matching task. Bobes et al. (2000) also used the N400 effect to examine expression and identity processing. External features were removed from black-and-white photographs of faces, which were presented pairwise, with identity and emotion being either congruent or incongruent. As in the study of Münte et al. (1998), the authors reported a significant congruency effect, indicating that the negativity in the relevant time window was more pronounced in case of a mismatch. Another important result of the Bobes et al. (2000) study was that the N400 elicited in the identity-matching task was more pronounced at central and parietal sites, displaced towards the left hemisphere, while in the expression-matching task, the N400 was more pronounced at parietal and occipital sites, displaced towards the right hemisphere. In our view, the N400 effect observed by Münte et al. (1998) and Bobes et al. (2000) does not necessarily indicate cognitive processing of the faces' emotional content. As Olivares et al. (1994) demonstrated by comparing complete with incomplete faces, an incongruency-based N400 can be elicited by deviant low-level physical features such as eye–eyebrow fragments, without any semantic knowledge of the face being involved. Therefore, we did not ask our subjects to compare faces with faces. Instead, we increased the probability that participants attended to the semantic meaning of the angry and happy faces by having them compare the facial expressions with the meaning of the emotion words.
We expected a more pronounced negativity in the N400 range in case of incongruency between the expression of a schematic face and the emotional meaning of the previously presented word, compared to congruent word–face pairs, resulting in a negative difference potential when subtracting the congruent from the incongruent condition. To facilitate semantic processing of the faces' emotional meaning, we instructed our subjects to count the congruent word–face pairs. Provided that an N400 effect was observed, any difference in the N170 range between happy and angry faces would indicate that face processing at this early stage can already be modified by facial emotional expression. We expected the N170 amplitude to be higher for an angry than for a happy face, because of its greater biological salience.
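The difference-potential computation described above (average the ERPs per condition, subtract congruent from incongruent, then take the mean amplitude in a latency window) can be sketched in plain Python. This is a minimal illustration, not the authors' analysis pipeline: the function names, sampling rate, and epoch timing are assumptions.

```python
# Minimal sketch of an incongruent-minus-congruent difference wave.
# SFREQ and EPOCH_START_MS are illustrative assumptions, not values
# taken from the study.
SFREQ = 500            # samples per second (assumed)
EPOCH_START_MS = -100  # epoch starts 100 ms before stimulus onset (assumed)

def grand_average(trials):
    """Point-by-point mean across trials (each trial: list of microvolt samples)."""
    n = len(trials)
    return [sum(samples) / n for samples in zip(*trials)]

def difference_wave(incongruent_trials, congruent_trials):
    """Incongruent-minus-congruent ERP difference potential."""
    inc = grand_average(incongruent_trials)
    con = grand_average(congruent_trials)
    return [i - c for i, c in zip(inc, con)]

def mean_amplitude(wave, start_ms, end_ms):
    """Mean amplitude of a wave within a latency window given in milliseconds."""
    to_idx = lambda ms: int((ms - EPOCH_START_MS) * SFREQ / 1000)
    window = wave[to_idx(start_ms):to_idx(end_ms)]
    return sum(window) / len(window)
```

Under this sketch, the N400 effect for one electrode would be `mean_amplitude(difference_wave(inc, con), 200, 500)`, with a more negative value indicating a stronger effect.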

Results

3.1. N400 effect

In case of incongruency between the expression of a schematic face and the emotional meaning of the previously presented word, a negative difference potential, differing significantly from zero, was found in the N400 range (Table 1). It occurred all over the scalp with latencies between 274 and 350 ms.

Table 1. Significances for the N400 effect

Area            F(1, 17)   p
Prefrontal      107.183    <0.0001
Frontal         123.435    <0.0001
Central          88.407    <0.0001
Centroparietal   75.188    <0.0001
Parietal         82.835    <0.0001
Occipital        86.758    <0.0001

F-values and probabilities for testing the mean amplitude of the N400 difference potential against zero for each area.

A more pronounced difference potential was found in anterior areas for midline electrodes, with a highly significant main effect for the factor "electrode" (F(7, 119) = 13.965; p ≤ .001). The highest amplitude was observed at electrode Fz, the smallest at electrode Oz. Although there was no significant effect of the factor "emotion" (F(1, 17) = 1.544; p = .224), this effect is displayed in Fig. 2 separately for both emotional expressions, to demonstrate that it was present for both. No additional significant main effects or interactions emerged, and no significant effects were found for N400 latencies.

Fig. 2. Mean amplitudes of difference waves obtained by subtracting congruent conditions from their related incongruent conditions at midline electrodes, displayed for both emotions.

3.2. Effects on the N170

We found a negative component in the N170 range at electrodes P1/P2, P1a/P2a, P3/P4, P5/P6, P5a/P6a, P7/P8, P7a/P8a, P9/P10, P9a/P10a, PO3/PO4 and PO7/PO8, with a significant main effect for the factor "hemisphere" (F(1, 17) = 5.709; p < .05) in terms of higher amplitudes at right-hemispheric sites.

3.3. Effects of emotion

There was a significant main effect for the factor "emotion" (F(1, 17) = 6.473; p < .05), with higher amplitudes after the presentation of angry compared to happy faces (Fig. 3). This effect was found at all sites. With respect to a priming effect in terms of amplitude differences between the congruent and incongruent conditions, no differences could be observed within each emotion. There were neither significant main effects nor significant interactions for N170 latency.

Fig. 3. Mean amplitudes of the N170 for all electrodes.

Fig. 4. Grand averages of the N170 at electrodes P5a/P6a, elicited by the presentation of schematic faces. The N170 for the emotion "anger" is displayed on the right side, and the N170 for the emotion "happiness" on the left side.
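Table 1 reports F(1, 17) values for testing the mean N400 difference amplitude against zero across the 18 subjects. With a single group, such an F statistic equals the square of a one-sample t statistic, which the following sketch illustrates; the function names and the sample values in the usage note are hypothetical, not data from the study.

```python
# Hypothetical sketch of a one-sample test of per-subject mean
# difference amplitudes against zero, as used for Table 1.
import math

def one_sample_t(values):
    """t statistic for testing the mean of `values` against zero."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean / math.sqrt(var / n)

def f_from_t(t):
    """Equivalent F(1, n-1) value, matching the format of Table 1."""
    return t * t
```

For example, three invented subject means of -2.0, -1.0 and -3.0 microvolts give t ≈ -3.46 and thus F(1, 2) ≈ 12.0; the reported values in Table 1 would arise the same way from 18 subject means per area.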