Chinese Facial Emotion Recognition Database (CFERD): a computer-generated 3-D paradigm for measuring the recognition of facial emotional expressions at different intensities
Article code | Publication year | English article length
---|---|---
37963 | 2012 | 5-page PDF
Publisher: Elsevier - Science Direct
Journal: Psychiatry Research, Volume 200, Issues 2–3, 30 December 2012, Pages 928–932
English Abstract
The Chinese Facial Emotion Recognition Database (CFERD), a computer-generated three-dimensional (3D) paradigm, was developed to measure the recognition of facial emotional expressions at different intensities. The stimuli consisted of 3D colour photographic images of six basic facial emotional expressions (happiness, sadness, disgust, fear, anger and surprise) and neutral Chinese faces. The purpose of the present study is to describe the development and validation of the CFERD with nonclinical healthy participants (N = 100; 50 men; age ranging between 18 and 50 years), and to generate a normative data set. The results showed that the sensitivity index d′ [d′ = Z(hit rate) − Z(false alarm rate), where Z(p), p ∈ [0,1], is the inverse of the standard normal cumulative distribution function] across all emotions was 0.94. Happiness was the most readily detected emotion, while surprise and sadness were less easily detected. In general, this study replicated previous findings on the recognition accuracy of emotional expression obtained with Western faces. However, our paradigm extends the previous work by including a wider sensitivity range to differentiate subtle perception of emotion intensities. The CFERD will be a useful tool for emotion recognition assessment in affective neurosciences research, especially for Chinese samples and cross-cultural studies.
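As a quick check of the reported index, the overall d′ can be reproduced from the overall hit and false-alarm rates given in the Results section below (a worked example; the intermediate values are rounded, and Z is taken as the inverse of the standard normal cumulative distribution function):

d′ = Z(0.375) − Z(0.104) ≈ (−0.319) − (−1.259) ≈ 0.94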
English Introduction
Introduction

The recognition of emotions from others' faces is a universal and fundamental skill for social interaction (Ekman, 1992). Facial expressions have been used to investigate the neural substrates of emotional perception and emotional control. Over the past few decades, numerous studies have investigated how the types and strength of emotions are recognised from facial expression stimuli (Adolphs, 2002). In addition, studies have examined why and how individuals with specific psychiatric or neurological illnesses (e.g., autistic spectrum disorder, affective disorder, schizophrenia, Huntington's disease or Alzheimer's disease) are impaired in their ability to recognise facial expressions of emotion (Adolphs, 2002).

Many studies have been conducted to illuminate the universality of emotions and the exact characteristics that distinguish basic emotions (Ekman, 1999). In general, there is evidence that six universal basic emotions exist: happiness, anger, sadness, fear, disgust and surprise. These basic emotions have been used in studies investigating the sensitivity and differentiation of emotion perception from facial expressions (Young et al., 2002).

In studies involving face perception, 2D static face images are often used. For example, the set of faces from the Pictures of Facial Affect database (Ekman and Friesen, 1976) consists of photographs of five males and six females displaying each of the six basic facial expressions and a neutral expression. Although this database has proved to be an invaluable tool for quantifying emotion, it includes only a small number of faces. Other databases available to researchers include the Facial Recognition Technology (FERET) database, Richard's MIT database, the Yale Face Database, the Korean Face Database (KFDB), the Japanese Female Facial Expression (JAFFE) Database and the Chinese Academy of Sciences—Pose, Expression, Accessory and Lighting (CAS-PEAL) database (Gross, 2005). These databases are often composed of pictures with distinctive facial features, external facial paraphernalia (e.g., hair) and rather coarse gradation of emotional strength.

Although substantial research has documented the universality of several emotional expressions, other work has shown evidence for cross-cultural differences in facial emotion recognition (Elfenbein and Ambady, 2002 and Freeman et al., 2009). Some attention has been directed towards examining cultural differences (Ducci et al., 1982, Matsumoto, 1989, Mesquita and Frijda, 1992 and Russell, 1994). Although accuracy levels for most groups in these studies were significantly above the levels expected from chance guessing, European and American participants generally scored higher in studies using American expressions than did Asian or African participants (Ekman, 1972 and Izard, 1971). A more recent study also demonstrated an overgeneralisation effect related to the racial matching between the facial stimuli and the perceivers: White perceivers' stereotyping of Blacks and Asians has been shown to influence this emotional perception (Zebrowitz et al., 2010). In a study investigating cross-cultural patterns in dynamic ratings of positive and negative natural emotional behaviour (Sneddon et al., 2011), the results indicate substantial agreement across cultures in the valence and patterns of ratings of natural emotional situations, but participants from different cultures showed systematic variation in the intensity rating of emotion.
Furthermore, there is evidence that native Japanese in Japan and Caucasians in the United States showed greater amygdala activation to fear expressed by members of their own cultural groups (Chiao et al., 2008). Thus, it seems that although emotion recognition is universal across cultures, there is also subtle cultural variation in recognition accuracy and intensities.

Increasing research has been dedicated to delineating the psychophysical relationship between the continua of emotional stimuli and the responses, neural processing and impairments of emotion recognition. For example, the amygdala and some of its functionally connected structures mediate specific neural responses to fearful expressions (Morris et al., 1998 and Whalen et al., 2004), and the early visual processing of emotional faces can be influenced by amygdalar activity (Morris et al., 1998). In addition, the processing of visual and emotional information has been found to be largely attributable to the perception of the eyes and brows (Radua et al., 2010). Furthermore, processing of emotional faces was associated with increased activation in a number of visual, limbic, temporoparietal and prefrontal areas, the putamen and the cerebellum, and selective differences between the neural networks underlying the basic emotions were found in limbic and insular brain regions (Fusar-Poli et al., 2009).

Those studies mostly applied still black-and-white 2D photographs of facial expressions as experimental stimuli (Adolphs, 2002 and Elfenbein and Ambady, 2002). Studies have applied stimuli of varying quality obtained under differing conditions, mostly involving posed emotions. The facial orientation of these 2D photographs is fixed and hard to manipulate across viewpoints (from right to left, as well as upward and downward). Furthermore, most study paradigms had participants identify or recognise the categories of emotion illustrated in the photographs, rather than perform the more sensitive task of rating their certainty of perceiving emotional expressions at different intensities. There is evidence that some facial expressions (e.g., happiness) tend to reach ceiling level easily and provide low differential power when typical faces with intended emotions are used (Hess and Blairy, 2001 and Palermo and Coltheart, 2004). In addition, those facial stimuli are typically restricted in ethnicity and age range, and Chinese faces are relatively sparsely represented (e.g., in the FERET Database, Richard's MIT database and the Yale Face Database). Although the recently developed CAS-PEAL database provides a large-scale face database of Chinese individuals, its expression variation concerning facial affect is limited to smiling, frowning and surprise (Gao et al., 2008). Therefore, there is a growing need for a computer-generated 3D paradigm based on the facial expressions of Chinese individuals.

Thus, we developed the Chinese Facial Emotion Recognition Database (CFERD), a colour image database of facial expressions of Chinese people that can be used as a refined measuring tool for the study of affective behaviour and affective neurosciences. Moreover, we applied the computer-generated 3D paradigm to measure the recognition of facial emotional expressions at different intensities. The purpose of the present study, therefore, is to describe the development and validation of the CFERD with nonclinical healthy participants and to generate a normative data set.
English Results
3. Results

A total of 100 healthy nonclinical subjects, 50 males and 50 females, participated in the validation study. The average age of the participants was 30.8 years (S.D. = 6.8), and the average years of education was 15.9 (S.D. = 2.1). The correct response rates for each emotion are shown in Fig. 3. Among the seven types of facial expressions, the happy expression was recognised most easily (d.f. = 6, F = 161.31, P < 0.001) (Table 1). In contrast, subjects performed worse on the recognition of fear and surprise.

Fig. 3 also shows the percentage of responses for each facial expression type, indicating the types of errors made by the participants. The most frequent errors for happiness, surprise and sadness were to label them as 'neutral'. Fear was most frequently mislabelled as 'surprise' or 'sadness'. Anger was most frequently confused with 'disgust', while disgust was most frequently confused with 'anger'. Neutral was most frequently misclassified as 'happiness' or 'sadness'. It is noteworthy that very few faces were incorrectly labelled as 'fear'.

Fig. 3. Percentages of responses for each emotion.

Table 1. One-way analysis of variance (ANOVA) on mean percent correct scores of facial expressions.

Expression | N | Mean | Std. deviation | Std. error | 95% CI lower bound | 95% CI upper bound
---|---|---|---|---|---|---
Neutral | 80 | 0.4350 | 0.16543 | 0.01850 | 0.3982 | 0.4718
Happiness | 80 | 0.6475 | 0.09997 | 0.01118 | 0.6253 | 0.6697
Anger | 80 | 0.5250 | 0.10879 | 0.01216 | 0.5008 | 0.5492
Surprise | 80 | 0.2256 | 0.10584 | 0.01183 | 0.2021 | 0.2492
Sadness | 80 | 0.3069 | 0.10180 | 0.01138 | 0.2842 | 0.3295
Fear | 80 | 0.2181 | 0.09880 | 0.01105 | 0.1961 | 0.2401
Disgust | 80 | 0.2944 | 0.10524 | 0.01177 | 0.2710 | 0.3178
Total | 560 | 0.3789 | 0.18868 | 0.00797 | 0.3633 | 0.3946

d.f. = 6, F = 161.31, p < 0.001.

Table 2 shows the hit rate, false alarm rate and d′ of each emotion. Overall, the sensitivity index d′ across all emotions was 0.94. Happiness was the most readily detected emotion, while surprise and sadness were the least easily detected.

Table 2. Hit rate, false alarm rate, and d′ of each emotion.

Expression | Hit rate | False alarm rate | d′
---|---|---|---
Happiness | 0.648 | 0.108 | 1.62
Anger | 0.525 | 0.109 | 1.29
Surprise | 0.226 | 0.079 | 0.66
Sadness | 0.307 | 0.128 | 0.63
Fear | 0.218 | 0.067 | 0.72
Disgust | 0.294 | 0.081 | 0.86
Neutral | 0.433 | 0.154 | 0.85
All | 0.375 | 0.104 | 0.94

d′ = Z(hit rate) − Z(false alarm rate), where Z(p), p ∈ [0,1], is the inverse of the standard normal cumulative distribution function.

Correct response rates for each emotion at the 10 intensity levels are shown in Fig. 4. Broadly, the results suggest that recognition accuracy for facial emotions increased as the intensity of the expression increased.

Fig. 4. Correct response rate of each emotion at 10 intensities.
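The d′ values in Table 2 follow directly from the listed hit and false-alarm rates. The following is a minimal sketch (not part of the original paper) that recomputes them, using scipy.stats.norm.ppf as the inverse of the standard normal cumulative distribution function:

```python
# Minimal sketch: recompute the sensitivity index d' reported in Table 2
# from the hit and false-alarm rates, assuming Z is the inverse of the
# standard normal cumulative distribution function.
from scipy.stats import norm

# Hit rate and false-alarm rate per expression, taken from Table 2.
rates = {
    "Happiness": (0.648, 0.108),
    "Anger":     (0.525, 0.109),
    "Surprise":  (0.226, 0.079),
    "Sadness":   (0.307, 0.128),
    "Fear":      (0.218, 0.067),
    "Disgust":   (0.294, 0.081),
    "Neutral":   (0.433, 0.154),
    "All":       (0.375, 0.104),
}

def d_prime(hit_rate: float, false_alarm_rate: float) -> float:
    """d' = Z(hit rate) - Z(false alarm rate), with Z the inverse normal CDF."""
    return norm.ppf(hit_rate) - norm.ppf(false_alarm_rate)

for expression, (hit, fa) in rates.items():
    print(f"{expression:9s}  d' = {d_prime(hit, fa):.2f}")
# The printed values match Table 2, e.g. Happiness ~1.62 and All ~0.94.
```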