Selective face processing and the effect of pleasant and unpleasant emotional expression on ERP correlates
Publisher : Elsevier - Science Direct
Journal : International Journal of Psychophysiology, Volume 49, Issue 1, July 2003, Pages 67–74
Abstract Previous studies have revealed that decoding of facial-expressions starts very early in the brain (≈180 ms post-stimulus) and might be processed separately from the basic stage of face perception. In order to explore event-related brain potentials (ERPs) related to the decoding of facial-expressions and the effect of the emotional valence of the stimulus, we analyzed 18 normal subjects. Faces with five basic emotional expressions (fear, anger, surprise, happiness, sadness) and a neutral stimulus were presented in random order. The results demonstrated that an emotional face elicited a negative peak at approximately 230 ms (N230), distributed mainly over posterior sites for each emotion. The electrophysiological activity observed may represent specific cognitive processing underlying the decoding of emotional facial-expressions. Nevertheless, differences in peak amplitude were observed for high-arousal negative expressions compared with the positive (happiness) and low-arousal (sadness) expressions. N230 amplitude increased in response to anger, fear and surprise, suggesting that subjects’ ERP variations are affected by the experienced emotional intensity, related to the arousal and unpleasant value of the stimulus.
Introduction The face is an important social stimulus in human interactions. From face-stimuli we are not only able to derive information concerning a person's likely age, sex and so on, but we are also able to interpret the meaning of their facial-expressions. By analyzing the relative shape or posture of facial features we can categorize a person as looking happy, sad or angry. Neuroimaging studies have demonstrated that the visual presentation of emotionally charged stimuli activates not only emotion-specific brain areas but also areas in the extrastriate cortex more than a neutral stimulus does (Fredrikson et al., 1995, Linkenkaer Hansen et al., 1998, Marinkovic et al., 2000 and Morrison et al., 1998). This activity of the extrastriate areas is functionally interconnected with the activation of the amygdala, which is crucial in emotional processing (Adolphs et al., 1998, Calder et al., 1996, Davidson, 2001 and LeDoux, 1996). Moreover, recent electroencephalographic studies have supported the hypothesis that the process of facial-expression recognition starts very early in the brain, by approximately 180 ms after stimulus onset, only slightly later than the face-selective activity reported between 120 and 170 ms (Bentin et al., 1996, Boetzel and Grusser, 1989, Linkenkaer Hansen et al., 1998, Maurer et al., 2002 and Streit et al., 2000). The first perceptive stage, in which the subject completes the ‘structural codes’ of the face, is thought to be processed separately from complex facial information such as emotional meaning (Lane et al., 1998, Pizzagalli et al., 1999 and Junghöfer et al., 2001). Therefore, in addition to a ‘structural code’, the existence of an ‘expression code’, implicated in the decoding of emotional facial-expressions, has been postulated (Bruce and Young, 1998, Ellis and Young, 1998 and Young, 1998).
Most studies considered face-specific brain potentials but did not specifically analyze the emotional content of the face-stimuli (Eimer, 1998 and Eimer and McCarthy, 1999). Vanderploeg et al. (1987) reported that the visual presentation of emotional facial-expressions elicited more negative amplitudes during 230–400 ms than neutrally rated stimuli did. Similarly, Marinkovic and Halgren (1998) observed that the presentation of emotional facial-expressions evoked a larger lateral occipito-temporal negativity during 200–400 ms, with a peak at approximately 240 ms, than a neutral face. Another study investigated the influence of facial-expressions and blurred faces on event-related potential (ERP) measures, finding no differences between conditions (emotional vs. blurred faces) at 120 and 170 ms after stimulus onset, but significant differences in amplitude between 180 and 300 ms (Streit et al., 2000). Sato et al. (2001) demonstrated that faces with emotions (both fear and happiness) elicited a larger negative peak at approximately 270 ms than neutral faces over the posterior temporal area, covering a broad range of posterior visual areas. However, there were no differences between negative and positive emotions. In addition, one recent study specifically analyzed the effect of emotional facial-expressions on ERPs (Herrmann et al., 2002). The study compared expressions with different emotional valence (sad, happy and neutral), and, whereas it revealed a similar negative peak variation, it failed to find emotion-specific ERP correlates for the three emotions. From these previous results it appeared that structural information is processed separately from emotional information, but no specific ERP profile characterizes a single emotional expression.
Nevertheless, as shown by other studies of impairments of facial-expression recognition, the semantic value of expressions has an effect on stimulus elaboration, with category-specific deficits in the recognition of emotional expressions (i.e. fear but not happiness) after brain injury to the amygdala (Adolphs et al., 1994, Davidson, 2001, Scott et al., 1997 and Young et al., 1996). An important and currently debated issue is, therefore, the possible effect of the emotional valence of the stimulus on ERP correlates. Emotionally expressive faces have been shown to influence a number of processes. Depending on the emotion, they elicit differential effects on sympathetic dermal and cardiovascular reactions in the viewer (Lang et al., 1993), facial EMG (Dimberg, 1997), skin reaction (Esteves et al., 1994), amygdalar activation in functional imaging studies (Morrison et al., 1998 and Whalen et al., 1998), as well as ERPs (Junghöfer et al., 2001 and Morita et al., 2001). The present study was designed to clarify whether the face-specific brain potential is modified by the emotional valence of the face-stimuli. Moreover, previous studies considered only a limited number of emotions, usually comparing positive and negative ‘basic’ emotions such as happiness and sadness. We extended the range of emotional expressions, considering two features of the stimulus: arousal (high vs. low) and hedonic valence (positive vs. negative) (Frijda et al., 1989, Scherer, 1980 and Scherer, 1994). As suggested by the ‘functional model’ (Smith and Lazarus, 1990), we supposed that subjects might be more emotionally involved by an angry expression (a high-arousal emotion) than by a sad one (low-arousal), and that they might have a more intense emotional reaction while viewing a negative rather than a positive emotion (Lang et al., 1990, Junghöfer et al., 2001 and Wild et al., 2001).
More generally, the ‘functional’ model supposes that each emotional expression represents the subject's response to a particular kind of significant event, a particular kind of harm or benefit, that motivates coping activity (Frijda, 1994). Negative high-arousal emotions (like anger, fear and surprise) are expressions of a situation perceived as threatening and of the subject's inability to face up to the event (high arousal). On the contrary, negative low-arousal emotions (like sadness) represent a negative situation and, at the same time, the subject's deactivation of an active response (low arousal). Finally, positive high-arousal emotions (like happiness) express effectiveness in managing an external stimulus, and its positive value. For this reason, facial-expressions are an important key to explaining the emotional situation, and they can produce different reactions in a viewer. As a whole, the ‘significance’ of emotional expressions for the subject and their low/high threatening power should influence both the physiological level (i.e. the response to the stimulus in terms of skin conductance or arousal) and the cognitive level (the mental response in terms of evaluation), with corresponding effects on ERP correlates.
3. Results To analyze the effect of the different facial-expressions, the peak amplitude and latency measurements were entered into separate two-way repeated-measures ANOVAs with stimulus category and electrode site as repeated-measures factors. For the peak measurement, the ANOVA showed a significant main effect of category (F(5, 17)=12.09, P<0.001) but not of electrode site (F(9, 17)=1.43, P=0.82). No category-by-electrode interaction was observed. The grand-average ERPs for each emotion at the frontal (Fz) and posterior (Pz) electrodes (the most representative sites) are shown in Fig. 1 and Fig. 2. Fig. 1. Grand-averaged waveforms at the Fz electrode site for the six face-expressions. Fig. 2. Grand-averaged waveforms at the Pz electrode site for the six face-expressions. As shown in the figures, a peak at approximately 230 ms was elicited by the emotional expressions at anterior and posterior scalp sites. Nevertheless, a post hoc analysis (Dunnett method) indicated that the neutral stimulus showed an ERP profile different from the emotional expressions, with a less negative peak at 238 ms latency. Moreover, whereas anger, fear and surprise did not differ from each other, happiness and sadness had a more positive peak than the three high-arousal expressions (respectively, for the comparison happiness/fear F(1, 17)=6.45, P=0.001; happiness/anger F(1, 17)=9.19, P<0.001; happiness/surprise F(1, 17)=5.12, P=0.03; sadness/fear F(1, 17)=6.29, P=0.001; sadness/anger F(1, 17)=10.12, P<0.001; sadness/surprise F(1, 17)=5.67, P=0.04). In order to evaluate the frontal–posterior distribution of the peak variations more precisely, a more restricted comparison was carried out, considering the anterior and posterior locations (Fz vs. Pz).
The repeated-measures ANOVA showed, in addition to the main effect of category, a significant category-by-localization interaction (F(5, 17)=8.63, P=0.003), with a more posteriorly distributed peak for each of the emotional expressions, but not for the neutral stimulus (as shown in Table 1). For the latency measure, no significant differences were observed among ERP profiles for the main effect of category (mean latencies: fear M=237 ms, anger M=236 ms, surprise M=239 ms, happiness M=230 ms, sadness M=227 ms and neutral M=236 ms). Thus, the negative variation is temporally similar across the six face-expressions.
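The analysis described above rests on two measurements per condition: the grand-average waveform across trials, and the amplitude and latency of the most negative deflection in a post-stimulus window around 230 ms. The following is a minimal illustrative sketch of that measurement step, not the authors' actual pipeline; the synthetic waveforms, the 180–300 ms search window, and all function names are assumptions for the example.

```python
# Hypothetical sketch: grand-averaging ERP waveforms and extracting the
# amplitude and latency of a negative peak (N230-like) in a fixed window.
# Data and parameters are synthetic and illustrative, not from the study.

def grand_average(trials):
    """Average a list of equal-length single-trial waveforms (in microvolts)."""
    n = len(trials)
    return [sum(samples) / n for samples in zip(*trials)]

def negative_peak(waveform, times_ms, window=(180, 300)):
    """Return (amplitude, latency_ms) of the most negative sample
    within the given post-stimulus window."""
    lo, hi = window
    candidates = [(v, t) for v, t in zip(waveform, times_ms) if lo <= t <= hi]
    amp, lat = min(candidates)  # most negative amplitude; earliest sample on ties
    return amp, lat

# Synthetic demo: 1 ms sampling, a negative deflection centred near 230 ms
times = list(range(0, 400))
trial_a = [-(5.0 if 220 <= t <= 240 else 1.0) for t in times]
trial_b = [-(7.0 if 225 <= t <= 235 else 1.0) for t in times]

avg = grand_average([trial_a, trial_b])
amp, lat = negative_peak(avg, times)
print(round(amp, 1), lat)  # prints: -6.0 225
```

Per-subject values obtained this way for each stimulus category and electrode site would then be the inputs to the repeated-measures ANOVAs reported in the Results.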