Download ISI English Article No. 37672
Article Title (translated from Persian)

Event related potentials and the perception of intensity in facial expressions

English Title
Event related potentials and the perception of intensity in facial expressions
Article Code | Year of Publication | Pages (English article)
37672 | 2006 | 8 pages, PDF
Source

Publisher: Elsevier - Science Direct

Journal : Neuropsychologia, Volume 44, Issue 14, 2006, Pages 2899–2906

Translated Keywords
Facial expression; Basic emotions; Fear; Disgust; Anger; Intensity; Mirror neurons
English Keywords
ERP; Facial expression; Basic emotions; Fear; Disgust; Anger; Intensity; N170; Mirror neurons
Article Preview

Event related potentials and the perception of intensity in facial expressions

English Abstract

It is well known from everyday experience that facial expressions of emotion can vary greatly in intensity, ranging, for example, from mild anger to rage, or from uneasiness and mild fear to angst and panic. However, the effect of different intensities of facial expressions of emotion on event-related potentials (ERPs) has not yet been studied. We therefore tested 16 healthy participants with a gender decision task on male and female faces displaying angry, disgusted and fearful facial expressions varying in intensity (50%, 100%, 150%). Analysis of the ERP data showed a significant increase in the amplitude of the N170 with intensity, but not with type of emotion. The intensity-induced negative variation was most pronounced between 200 and 600 ms at electrodes P9 and P10. For this time segment, there was a clear linear relationship between intensity and the degree of negative deflection. A dipole source localisation of the intensity effect using the difference waveform (150% minus 50% intensity) revealed two symmetrically positioned generators within the inferior temporo-occipital lobe. An emotion-specific effect for disgust was also found at temporal electrode sites (FT7 and FT8) at around 350–400 ms. The results are summarised in a two-phase model of emotion recognition, which posits an initial monitoring process that codes the saliency of incoming facial information; in a second step, the specific emotional content of faces is decoded in emotion-specific recognition systems.

English Introduction

Introduction

The human face is an important source of social signals. It reveals the individual's identity and expresses, if not intentionally controlled, the inner feelings of our counterparts. The importance of facially transmitted signals in guiding interpersonal behaviour is reflected in the complex functional architecture of the underlying psychological processes, which is based on a widely distributed neural network specifically dedicated to decoding this information. One of the most influential models of face processing (Bruce & Young, 1986) proposes an initial structural encoding process, followed by separable pathways for processing identity and facial expressions of emotion. While identity processing in this model is highly elaborated and fractionated into distinct sub-processes, emotion recognition is represented only as a single, undifferentiated process. Neuropsychological research in the past decade, however, has added substantially to our understanding of both the psychological sub-processes and the neural substrates underlying facial emotion recognition. Deficits in recognising fearful facial expressions after damage to the amygdala were first described by Adolphs, Tranel, Damasio, and Damasio (1994). These initial findings have since been replicated by numerous neuropsychological studies of people with lesions or functional deficits of the amygdala (Broks et al., 1998; Calder et al., 1996; Meletti et al., 2003; Sato et al., 2002; Sprengelmeyer et al., 1999). Functional imaging studies have further shown that recognition of fearful faces relies on a spatially distributed neural network involving the superior colliculi, thalamic relay nuclei, striate and extrastriate regions, and the amygdala (e.g. Breiter et al., 1996; Fischer et al., 2003; Morris et al., 1996). Within this network, a fast sub-cortical processing route targeting the amygdala and a slow thalamo-cortical processing route have been proposed.
The fast processing route forms part of an evolutionarily old system which can respond rapidly, automatically, and without conscious awareness to signals of threat and danger (LeDoux, 1996). Evidence for this fast route in humans comes from both a single-case study (De Gelder, Vroomen, Pourtois, & Weiskrantz, 1999) and functional imaging studies (Morris et al., 1998; Morris, De Gelder, Weiskrantz, & Dolan, 2001). A different pattern of results comes from studies of the recognition of facial expressions of emotion in people with pre-clinical as well as clinical Huntington's disease (Gray, Young, Barker, Curtis, & Gibson, 1997; Hennenlotter et al., 2004; Sprengelmeyer et al., 1996; Sprengelmeyer, Schroeder, Young, & Epplen, 2006; Sprengelmeyer et al., 1997b; Wang, Hoosain, Yang, Meng, & Wang, 2003). Participants with this disorder were particularly impaired in recognising facial expressions of disgust. Other disorders such as Parkinson's disease (Sprengelmeyer et al., 2003), Tourette's syndrome, obsessive-compulsive disorder (Sprengelmeyer et al., 1997a), and Wilson's disease (Wang et al., 2003) have also been associated with deficits in facial disgust recognition. Furthermore, functional imaging studies (Hennenlotter et al., 2004; Phillips et al., 1997; Sprengelmeyer, Rausch, Eysel, & Przuntek, 1998) reported the involvement of the basal ganglia and insula in recognising facial expressions of disgust. In contrast to fear, however, there is no evidence for a fast processing route for disgust. While the association of the amygdala and of insular-striatal regions with recognition of fear and disgust is supported by numerous studies, only one study links the nucleus accumbens with recognition of facial expressions of anger (Calder, Keane, Lawrence, & Manes, 2004). However, neuropsychological and functional imaging studies cannot reveal the time course of face processing.
To investigate these aspects in detail, various ERP studies have been conducted. The most prominent deflection of face-related potentials is the N170, first described by Bentin, Allison, Puce, Perez, and McCarthy (1996) and Bötzel, Schulze, and Stodieck (1995). Although questioned in the past (Rossion, Curran, & Gauthier, 2002; see Bentin & Carmel, 2002 for a response), the N170 is now thought to represent the face-specific structural encoding process hypothesised by the Bruce and Young model. Other studies have looked specifically at the ERP modulation associated with processing of facial expressions of emotion. Eimer and Holmes (2002) reported a positive fronto-central ERP component within 200 ms after stimulus onset when comparing neutral with fearful facial expressions. Batty and Taylor (2003) investigated the effect of happy, surprised, fearful, sad, disgusted and angry, compared to neutral, facial expressions on ERPs, and found an overall emotion effect on the N170 and emotion-specific modulation of ERPs in the 220–450 ms time window at fronto-central sites. An emotion-specific N230 at posterior sites to happy, fearful, sad, angry, and surprised compared to neutral faces was reported by Balconi and Pozzoli (2003). Interpretation of these data is straightforward as long as it is done within a static framework of 'basic emotions'. If so, the results clearly indicate emotion-specific processing of facial expressions as early as 200 ms after stimulus onset. Facial expressions, however, differ not only in the kind of emotion but also in saliency, that is, how intensely a particular emotion is displayed. Given that ERP responses reflect both kinds of information, the question arises where and when this information is processed.
The existing ERP literature cannot answer this question beyond pure speculation, since intensity in facial expressions has never been controlled for; the reported ERP effects could therefore indicate emotion-specific processing, processing of intensity, or a mixture of both. To address this neglected issue, the present study investigates the effect of different intensities of emotional facial expressions on ERPs. In addition, by using the neuropsychologically well-researched basic emotions fear, disgust, and anger, the study also looks for ERP components associated with cognitive processing within emotion-specific face recognition systems.

English Results

Results

3.1. Behavioural data

Only trials with a correct response and a reaction time (RT) between 200 and 2000 ms were included in the analyses of RT and error rate. Statistical analyses were performed by means of Huynh-Feldt corrected repeated measures analyses of variance (ANOVA) including the within-subject variables expression (anger, disgust, fear) and intensity (50%, 100%, and 150%). Mean reaction times and error rates are shown in Table 1. A two-way ANOVA including the factors emotion and intensity did not reveal any significant effects, neither in reaction time (Fs < 1.9, ps > 0.17) nor in error rate (Fs < 2.6, ps > 0.06).

Table 1. Mean correct reaction time and error rate for the different intensities (50%, 100%, and 150%) and facial expressions (anger, disgust, fear)

            Reaction time (ms)         Error rate (%)
Intensity   Anger   Disgust   Fear     Anger   Disgust   Fear
50%         475     469       475      3.4     2.0       1.8
100%        478     473       481      3.0     2.0       3.2
150%        484     476       477      2.7     3.0       3.5

3.2. Event-related brain potentials

ERP activity was quantified by mean amplitude measures in successive time segments: 90–110 ms (P1), 160–180 ms (N170), 200–250 ms, 250–300 ms, 300–350 ms, 350–400 ms, and 400–600 ms. The two early intervals were chosen around the peaks of the P1 and N170 components; we opted for 20 ms intervals (instead of the larger intervals used for later latencies) because of the shorter duration (higher frequency) of these early components. All signals were averaged separately for each experimental condition and aligned to a 200 ms baseline starting 200 ms before stimulus onset. Statistical analyses were performed on the ERP data by means of Huynh-Feldt corrected repeated measures ANOVAs for each time segment, including the factors expression, intensity, and electrode site (64 locations). Note that because the average reference sets the mean activity across all electrodes to zero, any condition effect is only meaningful in interaction with electrode site.
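The trial selection and mean-amplitude windowing described above can be sketched in a few lines of numpy. This is a minimal sketch, not the authors' actual analysis code: the array layout, function names, and the 1000 Hz sampling rate are illustrative assumptions.

```python
import numpy as np

FS = 1000          # assumed sampling rate (Hz), not stated in the excerpt
BASELINE_MS = 200  # epochs are aligned to a 200 ms pre-stimulus baseline

def select_trials(rt_ms, correct):
    """Keep only correct trials with RT between 200 and 2000 ms."""
    rt_ms = np.asarray(rt_ms)
    return np.asarray(correct, dtype=bool) & (rt_ms >= 200) & (rt_ms <= 2000)

def mean_amplitude(epochs, win_ms, fs=FS, baseline_ms=BASELINE_MS):
    """Mean amplitude in a time window given in ms relative to stimulus onset.

    epochs: array of shape (n_trials, n_channels, n_samples), with the time
    axis starting at -baseline_ms. Each epoch is first baseline-corrected by
    subtracting the mean of the 200 ms pre-stimulus interval, then averaged
    within the requested window.
    """
    n_base = int(baseline_ms * fs / 1000)
    corrected = epochs - epochs[..., :n_base].mean(axis=-1, keepdims=True)
    i0 = n_base + int(win_ms[0] * fs / 1000)
    i1 = n_base + int(win_ms[1] * fs / 1000)
    return corrected[..., i0:i1].mean(axis=-1)

# The analysis windows used in the paper:
WINDOWS = [(90, 110), (160, 180), (200, 250), (250, 300),
           (300, 350), (350, 400), (400, 600)]
```

Per-condition mean amplitudes computed this way would then feed the repeated measures ANOVA with factors expression, intensity, and electrode site.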
Therefore, any condition effect reported here is an interaction with electrode site. The results of this analysis are shown in Table 2. A clear effect of intensity was found in the N170 and all subsequent analysis intervals. Post hoc analyses revealed that this effect was largest over parieto-occipital scalp regions. Fig. 2 depicts the average waveforms for the three intensities at electrodes P9 and P10 (top panel). To better visualise the intensity effect, two difference waves (100% minus 50%, and 150% minus 50%) were calculated; these are shown in Fig. 2 (bottom panel). Fig. 3 (top) shows the distribution of this intensity effect across scalp locations. A spatio-temporal dipole source model based on the difference wave (150% minus 50%) was determined for an analysis interval of 200–300 ms relative to stimulus onset using brain electrical source analysis (BESA) software. An initial principal component analysis (PCA) revealed that a single principal component accounted for over 97% of the variance in this time interval. Therefore, one pair of single equivalent dipoles, symmetrical in location, was fitted using a four-shell spherical head model. The Talairach coordinates of these dipoles were x = 46.3, y = −63.6, z = −6.7 and x = −46.3, y = −63.6, z = −6.7, respectively, placing them in Brodmann area 19/37. The results are shown in Fig. 3 (bottom).

Table 2. F-values and significance levels for the ANOVA of the ERP time segments, including the factors electrode site (S), emotion (E), and intensity (I)

Factor      d.f.        P1       N170     200–250 ms  250–300 ms  300–350 ms  350–400 ms  400–600 ms
S           63, 945     10.3***  42.0***  6.3***      8.4***      13.4***     23.6***     39.8***
E × S       126, 1890   1.2      1.2      0.9         1.2         1.1         1.5*        1.1
I × S       126, 1890   1.1      3.1***   3.9***      4.9***      3.3***      2.4**       2.8***
E × I × S   252, 3780   1.0      0.9      0.9         1.0         1.0         1.1         0.9

* p < 0.05. ** p < 0.01. *** p < 0.001.
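The difference-wave computation and the PCA check that precedes dipole fitting can be illustrated with a short numpy sketch. The function names and the time-centering choice are assumptions for illustration; the paper's actual source analysis was performed in the BESA software.

```python
import numpy as np

def difference_wave(erp_a, erp_b):
    """Condition difference wave (e.g. 150% minus 50% intensity).

    erp_a, erp_b: grand-mean ERPs of shape (n_channels, n_samples).
    """
    return erp_a - erp_b

def first_pc_variance(diff, i0, i1):
    """Fraction of variance in the analysis interval [i0, i1) explained by
    the first spatial principal component, computed via the singular values
    of the time-centered channels x samples segment. A value near 1 suggests
    a single dominant source configuration, motivating a single symmetric
    dipole pair (the paper reports > 97% for 200-300 ms).
    """
    segment = diff[:, i0:i1]
    centered = segment - segment.mean(axis=1, keepdims=True)
    s = np.linalg.svd(centered, compute_uv=False)
    return s[0] ** 2 / np.sum(s ** 2)
```

For a segment dominated by one fixed scalp topography modulated over time, `first_pc_variance` returns a value close to 1, which is the condition under which fitting a single symmetric dipole pair is reasonable.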
Fig. 2. Top panel: grand mean event-related potentials at electrode sites P9 and P10 superimposed for the three intensities (50%, 100%, and 150%) of expressed emotion. Bottom panel: ERP difference waveforms showing the effects of intensity at electrode sites P9 and P10. The difference waves for the intensities (100% minus 50%) and (150% minus 50%) are superimposed.

Fig. 3. Top panel: voltage maps (110° projection, spherical spline interpolation) showing the topographical distribution of the intensity effects across the scalp at different time points relative to stimulus onset. Bottom panel: dipole source localisation results for the intensity effect using the difference waveform (150% minus 50% intensity).

As shown in Table 2, there was a significant effect of emotion in the 350–400 ms time segment. Post hoc analyses (Bonferroni-corrected pairwise t-tests) revealed this effect to be maximal over temporal electrode sites (FT7 and FT8), due to a larger negativity for disgust compared to anger and fear (see Fig. 4).

Fig. 4. Grand mean event-related potentials at electrode sites FT7 and FT8 for disgust, anger and fear.