Download English ISI Article No. 37709

Article Title (English)
Interaction of facial expressions and familiarity: ERP evidence
Article code: 37709
Publication year: 2008
English article length: 12 pages (PDF)
Source

Publisher: Elsevier - Science Direct (ScienceDirect)

Journal : Biological Psychology, Volume 77, Issue 2, February 2008, Pages 138–149

Keywords
Face recognition; Emotional expressions
Article Preview

English Abstract

There is mounting evidence that under some conditions the processing of facial identity and facial emotional expressions may not be independent; however, the nature of this interaction remains to be established. By using event-related brain potentials (ERP) we attempted to localize these interactions within the information processing system. During an expression discrimination task (Experiment 1) categorization was faster for portraits of personally familiar vs. unfamiliar persons displaying happiness. The peak latency of the P300 (trend) and the onset of the stimulus-locked LRP were shorter for familiar than unfamiliar faces. This implies a late perceptual but pre-motoric locus of the facilitating effect of familiarity on expression categorization. In Experiment 2 participants performed familiarity decisions about portraits expressing different emotions. Results revealed an advantage of happiness over disgust specifically for familiar faces. The facilitation was localized in the response selection stage as suggested by a shorter onset of the LRP. Both experiments indicate that familiarity and facial expression may not be independent processes. However, depending on the kind of decision different processing stages may be facilitated for happy familiar faces.

English Introduction

The recognition of facial expressions appears to be independent of a person's identity or familiarity. Thus, we can recognize facial expressions of both familiar and unfamiliar people and, conversely, we do not need to analyse the expression of a face in order to recognize the person. However, there is some evidence for an interaction between the processing of facial expressions and identity. For example, we sometimes have to look twice to recognize someone familiar who displays a facial expression we have never seen before, or we may be confused when a seemingly unknown person smiles at us. The present study seeks to corroborate the evidence for such interactions between facial expression and facial familiarity and attempts to determine the underlying mechanisms as well as the functional loci of possible interactions in the information-processing chain.

1.1. Independence of facial familiarity and facial expression

In their functional model of face recognition, Bruce and Young (1986) assume the independence of the recognition of facial expressions and facial identity. According to their model, familiarity is assessed by face recognition units (FRUs), which are independent of expression analysis. Both processes are assumed to occur in separate, parallel pathways. For example, Ellis et al. (1990) showed that the initial classification of familiar faces according to occupation primed the decision in a subsequent familiarity decision task but not in an expression decision task. Bobes et al. (2000) showed different topographical distributions of event-related potentials (ERPs) for a familiarity and an expression matching task, suggesting distinct neural subsystems subserving the two processes. In addition, a double dissociation of face and expression recognition has been reported by Tranel et al. (1988), who studied three patients with prosopagnosia, the inability to identify faces, who could nevertheless recognize facial expressions. Conversely, Young et al. (1993) reported on a patient with a selective deficit in processing facial expressions vis-à-vis intact recognition of facial familiarity. These results suggest independent functions of recognizing facial familiarity and facial expression. However, there is some recent evidence that at some level(s) the processes of expression and identity recognition may in fact interact.

1.2. Interaction of facial familiarity and facial expression

In their model of a distributed human neural system for face perception, Haxby et al. (2000) identified a core system, subserving the visual analysis of identity as well as of changeable aspects of faces such as facial expression, located in occipitotemporal cortex with projections to the fusiform gyrus and superior temporal sulcus. This core system is supplemented by an extended system responsible for related aspects of face perception, such as directing attention or semantic information processing. According to this model, functional interactions between different processes might be possible even though they are based on separate brain systems (see Posamentier and Abdi, 2003, for a review).

The processing of facial expressions starts as early as 80 ms after stimulus onset (Eger et al., 2003), which is even earlier than the N170 component of the ERP, thought to reflect structural face encoding (Eimer, 2000b). Therefore, it stands to reason that the information extracted from expressive faces may modulate early structural face encoding processes (Bruce and Young, 1986). Indeed, Caharel et al. (2005) and Batty and Taylor (2003) found increased N170 amplitudes for negative when compared to positive or neutral facial expressions. Sprengelmeyer and Jentzsch (2006) reported a modulation of the N170 by the intensity of emotional facial expressions.

It is possible that expressive faces boost attention and arousal via interconnections with the amygdala (Sato et al., 2001) and may also modulate later processing stages. For example, in an fMRI study, Vuilleumier et al. (2002) found increased activation in the amygdala for emotionally expressive faces shown at task-irrelevant locations, independent of spatial attention. In addition, in a gender discrimination task, Krolak-Salmon et al. (2001) reported differential ERP activity between 250 and 750 ms in occipital and occipito-temporal areas related to emotional expression. They took this as support for top-down modulation from limbic (including amygdala) and frontal areas influencing extrastriate visual areas. Ganel et al. (2005) found increased activation for expressive faces in the fusiform face area (FFA) even when identity was attended. The FFA is commonly thought to mediate only the processing of identity; the results of Ganel et al. (2005), however, suggest an overlap of the neural networks subserving the processing of both identity and emotional expression. Thus, emotional stimuli may guide focused attention to the relevant location because the amygdala is part of the attentional system (Eastwood et al., 2001), which in turn may speed up the classification of a face as familiar or not.

Schweinberger and Soukup (1998) addressed the interaction of facial expression and identity with the selective attention paradigm of Garner (1976). With a stimulus set of two individual faces (person A versus B) and two expressions (happy versus sad), the authors were able to show an asymmetric influence of facial identity on the discrimination of facial expressions. Facial expressions were easier to discriminate when they were correlated with the task-irrelevant identity of a face, for example when person A was displayed only with a happy expression and person B only with a sad expression, than when there was no such correlation. No effect of an irrelevant correlation with facial expression was seen in the identity discrimination task.

In a study by Baudouin et al. (2000b), participants had to discriminate neutral from happy facial expressions. It was argued that because expression discrimination is a relatively fast process, an interaction between facial familiarity and the discrimination task would emerge only if this process is made relatively difficult and slow. To this aim, faces were displayed with short rather than long presentation times (15 ms versus 400 ms) or with the mouth concealed rather than visible. Specifically in the hard conditions, expression discrimination was facilitated for famous as compared to unfamiliar faces. The authors concluded that facial familiarity increases "perceptual fluency", facilitating recognition of facial expressions under difficult conditions.

There is also evidence that facial expressions influence the perception and recognition of familiarity. A smile as compared to a neutral expression may increase the subjective familiarity of both unfamiliar and familiar faces (Baudouin et al., 2000a). Endo et al. (1992) found that the recognition of personally familiar faces was facilitated when they displayed a neutral as compared to happy or angry expressions.
In contrast, famous faces were recognized faster with happy expressions. The authors argued that a neutral expression is more frequently seen in personally familiar faces, whereas famous faces are more often seen with a happy expression. Using faces morphed with respect to familiarity and expression, Kaufmann and Schweinberger (2004) showed that famous faces are recognized faster when displaying moderately positive expressions, whereas the recognition of unfamiliar faces was unaffected by expression.

Together, the studies outlined above suggest an interaction between the perception of facial expressions and facial familiarity in one or the other direction. However, some studies suffer from methodological shortcomings. Schweinberger and Soukup (1998) used a very small stimulus set involving only two different individuals. In the study of Baudouin et al. (2000b), the concealed mouth probably altered the recognition of facial expression and disrupted normal holistic processing. The perceptual variation of a concealed mouth may also have affected the two expressions differently: the mouth region may be more important for recognizing happiness than neutral expressions (Calder et al., 2001). In addition, the recognition of familiar people relies more on internal facial features than does the recognition of unfamiliar faces (Ellis et al., 1979). Hence, the interaction of the hard/easy condition and familiarity in the expression discrimination task may have been due to differential effects of the perceptual manipulation on the familiarity and the expression dimensions.

1.3. Objectives

The first aim of the present study was to provide further evidence for the presence of interactions between facial familiarity and facial expressions. Our second aim was to elucidate the mechanisms underlying such interactions. Here we re-examined these questions within a standard paradigm, with improved stimulus material, and by means of recording event-related brain potentials. A two-choice RT task was used in which participants either discriminated facial expressions or facial familiarity. In the expression discrimination task, facial familiarity was varied independently of expression; that is, half of the presented portraits belonged to personally familiar and half to unfamiliar persons. In the familiarity discrimination task the other dimension, facial expression, was varied independently.

The stimulus set used in this study consisted of portraits of personally familiar and age- and gender-matched unfamiliar persons, displaying neutral expressions, happiness, or disgust. We used personally familiar faces rather than celebrities from the public domain because their representations in memory should be more consistent and more robust (Caharel et al., 2006; Tong and Nakayama, 1999). Thus, we expected an enhancement of the hypothesized interaction between facial expression and facial familiarity. Most studies searching for an interaction have used either unfamiliar (Schweinberger and Soukup, 1998) or famous faces (Baudouin et al., 2000b), and different degrees of familiarity could be one reason for inconsistent results. Only Endo et al. (1992) and Caharel et al. (2005) used personally familiar faces; however, the number of stimuli was small (e.g., only the mother's face and the participant's own face in the Caharel study), making interpretation difficult. With a larger set of personally familiar faces we expected to optimize the chances of finding any interactions in the processing of facial familiarity and facial expression.

Event-related potentials were recorded in order to draw conclusions about the temporal characteristics of the functional processing stages that might be affected by the hypothesized interaction. Several distinct components were used to pinpoint the functional locus of these putative interactions. The face-sensitive N170 component of the ERP is associated with the formation of a visual representation of a face-like stimulus; it may reflect the functional process of structural face encoding (Bentin et al., 1996; Eimer, 2000b; Rossion et al., 2000) as conceptualized by Bruce and Young (1986). The P300 component may be related to the perceptual evaluation or classification of task-relevant stimuli (Johnson, 1986; McCarthy and Donchin, 1983). The lateralized readiness potential (LRP; Coles, 1989) reflects the activation of a specific response following more abstract response selection (de Jong et al., 1988). Osman et al. (1995) proposed using the LRP to divide the information processing between stimulus and overt response into two intervals. The first interval, from stimulus presentation until the beginning of response activation, is best measured in the stimulus-synchronized LRP (S-LRP); it is informative about the time demands of the processes taking place up to the completion of response selection (Leuthold et al., 1996). The second interval, from the onset of the LRP until the overt response, is measured in LRPs averaged time-locked to the response (LRP-R) and indicates the time demands of motor processes beyond central response selection (Masaki et al., 2004).

In contrast to the assumptions of the functional model of face recognition by Bruce and Young (1986), the main objective of the present study was to find interactions between the recognition of facial expressions and of facial familiarity. The first experiment examined effects of task-irrelevant facial familiarity in an expression discrimination task with personally familiar and unfamiliar faces. In the second experiment we investigated effects of task-irrelevant facial expression in a familiarity discrimination task on the same stimulus set. These experiments went beyond previous studies addressing the question of an interaction between facial identity and expression by using more faces with a clearly defined and high degree of familiarity and by using several functionally distinct ERP components in order to localize any effects.
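For orientation, the sketch below illustrates the double-subtraction procedure conventionally used to derive the LRP (Coles, 1989): the difference between the electrodes over left and right motor cortex (C3, C4) is averaged separately for left-hand and right-hand response trials and then combined, so that only activity tied to preparing the responding hand survives the averaging. The function names, array shapes, and the fixed-threshold onset criterion are illustrative assumptions and not details taken from the present experiments.

import numpy as np

def compute_lrp(c3_left, c4_left, c3_right, c4_right):
    """Double-subtraction LRP (Coles, 1989).

    Each argument is an (n_trials, n_samples) array of voltages from
    electrodes over left (C3) and right (C4) motor cortex, split by
    responding hand. Argument names and shapes are illustrative
    assumptions, not taken from the present study.
    """
    # Contralateral-minus-ipsilateral difference, averaged over trials:
    left_hand = (c4_left - c3_left).mean(axis=0)     # left-hand responses
    right_hand = (c3_right - c4_right).mean(axis=0)  # right-hand responses
    # Averaging both hands cancels lateralized activity unrelated to
    # response preparation; correct response activation appears as a
    # negative-going deflection.
    return 0.5 * (left_hand + right_hand)

def lrp_onset(lrp, times, threshold=-0.5e-6):
    """First sample at which the LRP crosses a fixed negative threshold.

    A fixed amplitude criterion (here -0.5 microvolts, with data in
    volts) is only one of several possible onset measures; the study
    itself may have used a different scoring method (e.g. jackknifing),
    so treat this as a placeholder.
    """
    below = np.flatnonzero(lrp <= threshold)
    return times[below[0]] if below.size else None

In this scheme, the S-LRP interval corresponds to the time between stimulus onset and the estimated LRP onset in stimulus-locked averages, whereas the LRP-R interval runs from the onset in response-locked averages to the overt key press.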

English Conclusion
