Download English ISI Article No. 37671
Article Title

Gender and parental status affect the visual cortical response to infant facial expression

Article code: 37671
Publication year: 2006
English article length: 13 pages (PDF)
Source

Publisher: Elsevier - Science Direct

Journal: Neuropsychologia, Volume 44, Issue 14, 2006, Pages 2987–2999

Keywords

ERPs; Emotions; Amygdala; FFA; STS; Hemispheric asymmetry

Abstract

This study sought to determine the influence of gender and parental status on the brain potentials elicited by viewing infant facial expressions. We recorded ERPs during a judgement task on happy/distressed infant expressions to investigate whether viewer gender or parental status affects the visual cortical response at various stages of perceptual processing. ERPs were recorded in 38 adults (male/female, parents/non-parents) during processing of infant facial expressions that varied in valence and intensity. All infants were unfamiliar to the viewers. The lateral occipital P110 response was much larger in women than in men, regardless of facial expression, indicating a gender difference in early visual processing. The occipitotemporal N160 response provided the first evidence of discrimination between expressions of discomfort and distress and showed a significant gender difference within the parent group, suggesting a strong interactive influence of genetic predisposition and parental status on the responsivity of visual brain areas. The N245 component exhibited complete coding of the intensity of facial expression, including positive expressions. At this processing stage the cerebral responses of female and male non-parents were significantly smaller than those of parents and insensitive to differences in the intensity of infant suffering. Smaller P300 amplitudes were elicited in fathers than in mothers, especially in response to infant expressions of suffering. No major group differences were observed in cerebral responses to happy or comfortable expressions. These findings suggest that mere familiarity with infant faces does not explain the group differences.

Introduction

Newborn babies and infants communicate their needs and physiological states (such as pain) mainly through crying and facial expression. Infant facial expression thus represents an important means of non-verbal communication between parents and their infants. It is well known that facial expression processing is carried out by a network of occipitotemporal regions within the ventral visual stream (e.g., Haxby, Hoffman, & Gobbini, 2000; Kanwisher, McDermott, & Chun, 1997; Puce, Allison, Asgari, Gore, & McCarthy, 1996). However, little is known about how biological or cultural factors (such as gender or parental status) and cerebral plasticity may affect these structures' responsiveness to facial expression.

Several recent studies have investigated the brain responses of mothers and fathers to familiar and unfamiliar children's cries. In one study (Purhonen et al., 2001a; Purhonen, Paakkonen, Ypparila, Lehtonen, & Karhu, 2001b), auditory ERPs evoked by emotional and neutral auditory stimuli were recorded in a group of mothers 2–5 days after childbirth and in control women who were not in the state of early motherhood. Infant cries served as the emotional stimulus: for the experimental group, the cry of their own infant; for the controls, the cry of an unknown infant. The auditory N1 response to both emotional and neutral stimuli was significantly larger in the recent mothers than in the control women, suggesting an overall increase in arousal in women who had given birth in the preceding days. However, neither cry familiarity nor emotional valence produced any effect.

Seifritz et al. (2003) reported fMRI data on differences in brain activation between men and women (parents or non-parents) listening to familiar and unfamiliar infants crying and laughing. Results showed an effect of both gender and parental status. Infant cries evoked stronger activation in the amygdala and interconnected limbic regions in parents than in non-parents. However, women but not men (irrespective of parental status) showed a deactivation of the anterior cingulate cortex in response to both infant crying and laughing. This gender effect was interpreted as reflecting women's preference for certain sensory stimuli, in this case infant vocalizations. The parental status effect, on the other hand, was interpreted as an indication of neuroplastic changes in the brain resulting from parenting experience, likely subserving the biological need for parental care.

Other recent neuroimaging studies (Bartels & Zeki, 2004; Leibenluft, Gobbini, Harrison, & Haxby, 2004; Nitschke et al., 2004) have recorded brain activation in mothers viewing pictures of their own children (thought to elicit so-called "maternal love"). The results show activation of brain areas linked to affect (amygdala) and, in particular, to positive emotion (orbitofrontal cortex and connected regions of the pleasure/reward circuitry, such as the periaqueductal gray). A few studies have investigated adult judgements of the emotional valence of the expressions of unfamiliar, unrelated infants. However, no study has investigated the brain responses evoked by viewing unfamiliar infants, responses that are thought to be instinctual and tied to species preservation.
The overall aim of the present study was to investigate the cerebral response (synchronized bioelectrical activity, recorded as the electroencephalogram [EEG] and ERPs) of adults viewing pictures of infants with emotional facial expressions. We sought to characterize this response in several ways. First, we sought to determine its sensitivity to the gender and parental status of the viewer. This information would in turn help to differentiate between genetic and experience-dependent effects on the responsiveness of visual brain areas to infant facial expression. Any impact of parental status on visual processing during the initial perceptual decoding stages (within the first 100–200 ms) would suggest experience-dependent neuroplasticity, even if biologically regulated ("parental behaviour", e.g., Ramirez, Bardi, French, & Brent, 2004). An influence of gender, on the other hand, would suggest a pre-existing (genetically induced, although culturally modulated) difference across individuals. We also predicted that if gender or parental status influenced the neural response to viewing an unknown infant's face, this would be observed first as a difference in the amplitude (or latency) of the occipitotemporal N1 component of the event-related potentials (ERPs), and possibly as a difference in even earlier visual responses.

Previous ERP and magnetoencephalography (MEG) studies provide evidence that the N1 response is associated with the structural encoding of faces (Bentin, Deouell, & Soroker, 1999; Halgren, Raij, Marinkovic, Jousmaki, & Hari, 2000; Liu, Higuchi, Marantz, & Kanwisher, 2000; Pizzagalli et al., 2002; Rossion et al., 2000; Sagiv & Bentin, 2001; Watanabe, Kakigi, Koyama, & Kirino, 1999). The N1 response shows a specific sensitivity to face inversion, suggesting holistic processing of this complex visual pattern (e.g., Rossion et al., 2000; Rossion, Joyce, Cottrell, & Tarr, 2003; Rousselet, Mace, & Fabre-Thorpe, 2004; Sagiv & Bentin, 2001). Functional neuroimaging studies have identified an area of the ventral occipitotemporal cortex around the lateral fusiform gyrus, a possible face perception area, as the likely generator of the N1 (e.g., Henson et al., 2003; Itier & Taylor, 2004; Kanwisher et al., 1997). This region seems to respond preferentially to faces (Haxby et al., 1999, 2000; Puce et al., 1996) and to be sensitive to face inversion and even to viewpoint (frontal versus three-quarter view) (Pourtois, Schwartz, Seghier, Lazeyras, & Vuilleumier, 2005b). However, there is no general agreement that the FFA is specifically devoted to face processing. For example, Gauthier, Tarr, Anderson, Skudlarski, and Gore (1999) compared fMRI activation during perception of upright versus inverted faces and greebles and found activation of the fusiform face area (FFA) for both types of stimuli, with activation increasing with expertise. They therefore concluded that the FFA is activated during object recognition as a function of expertise with the visual stimuli, not only in relation to face processing. Haxby et al. (2000) and Haxby, Hoffman, and Gobbini (2002), on the basis of the available literature and their own fMRI studies, described the face perception mechanism as a distributed but hierarchically organized network of occipitotemporal regions.
The core system consists of the extrastriate visual cortex, mediating the visual analysis of face structure, and the superior temporal sulcus (STS), mediating the analysis of changeable face characteristics such as gaze, expression, and lip movements. The interconnection of the STS and the amygdala may be crucial for the emotional evaluation of facial expressions. Overall, face processing seems to elicit stronger activation in the right hemisphere, as suggested by human (Hsiao, Hsieh, Lin, & Chang, 2005; Pegna, Khateb, Michel, & Landis, 2004; Rossion et al., 2003) and animal studies (Pinsk, DeSimone, Moore, Gross, & Kastner, 2005). Analysis of the emotional content of faces seems to take place concurrently with, rather than subsequent to, the completion of the structural encoding of faces, as the Bruce and Young (1986) model would predict. Indeed, many ERP and MEG studies have found that affective information modulates the brain's response to human faces as early as 120–150 ms after the stimulus (Batty & Taylor, 2003; Halgren et al., 2000; Pizzagalli, Lehmann, Koenig, Regard, & Pascual-Marqui, 2000; Pizzagalli et al., 2002). Furthermore, neuroimaging studies have shown increased activation of the fusiform gyrus during processing of emotional versus neutral faces (e.g., Ganel, Valyear, Goshen-Gottstein, & Goodale, 2005; Hariri, Bookheimer, & Mazziotta, 2000), supporting the hypothesis that the emotional coding of expression occurs during the processing of structural face information. During later processing (230–240 ms) the coding of emotional facial expression becomes more sophisticated, distinguishing not only positive from negative expressions but also specific emotions such as fear, happiness, and disgust, as evidenced by differences in the amplitude of the N230–250 visual response (Batty & Taylor, 2003; Liddell, Williams, Rathjen, Shevrin, & Gordon, 2004; Streit, Wolwer, Brinkmeyer, Ihl, & Gaebel, 2000).

Second, we sought to compare the visual processing of four different facial expressions (pleasure, comfort, discomfort, and pain) distinguished by both polarity (positive versus negative) and emotional intensity (strong versus weak). Third, we sought to investigate hemispheric differences in brain activation according to gender (as suggested by the fMRI study of Lee et al., 2002) and emotion polarity. Neuropsychological models predict that positive, approach-related emotions are lateralized to the left hemisphere and negative, withdrawal-related emotions to the right hemisphere (see also Canli, Desmond, Zhao, Glover, & Gabrieli, 1998).
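Since all of the study's dependent measures are ERP components, it may help to make concrete how an ERP waveform is obtained: EEG epochs time-locked to stimulus onset are averaged, so that activity phase-locked to the stimulus survives while ongoing background EEG cancels out. The following Python sketch illustrates this with purely synthetic data; the sampling rate, trial count, and the "N160-like" waveform are invented for illustration and do not represent the authors' recording pipeline.

```python
# Minimal illustration of ERP formation by trial averaging (synthetic data).
import numpy as np

rng = np.random.default_rng(0)
sfreq = 500                                  # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1 / sfreq)      # -100 to +500 ms around stimulus onset

# Hypothetical evoked component: a negativity peaking near 160 ms ("N160-like")
component = -2.5 * np.exp(-((times - 0.16) ** 2) / (2 * 0.02 ** 2))  # in uV

# Each simulated trial = the component plus large-amplitude background "EEG"
n_trials = 80
epochs = component + rng.normal(0, 5.0, size=(n_trials, times.size))

erp = epochs.mean(axis=0)        # averaging across trials yields the ERP
peak = erp.argmin()              # most negative sample = the component peak
print(f"Peak amplitude {erp[peak]:.2f} uV at {times[peak] * 1000:.0f} ms")
```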

Results

3.1. Behavioural data

RTs evoked by facial expressions of strong intensity (pleasure and distress) were significantly faster (628 ms) than those evoked by facial expressions of weaker intensity (discomfort and comfort, 699 ms) (F(1, 36) = 266.21, p < 0.0001). RTs were also significantly affected by emotion polarity (F(1, 36) = 37.66, p < 0.0001): RTs to negative emotional states were much faster (647 ms) than RTs to positive states (681 ms). Response speed differed markedly according to emotion category, with the fastest RTs for expressions of distress and the slowest for expressions of comfort (Table 2).

Table 2. Mean viewer (n = 38) RTs (ms) to the four types of facial expression.

  Facial expression    Mean    S.D.
  Pleasure             644     61.18
  Comfort              717     62.05
  Discomfort           681     62.42
  Distress             613     61.11

ANOVA also showed a significant interaction between parental status, polarity, and intensity of facial expression (F(1, 36) = 3.91, p < 0.05). Parents had slower RTs than non-parents for all types of expression (Fig. 2). Post hoc comparisons indicated a significant group difference (parents versus non-parents) in the responses to pleasure, comfort, and distress, for which parents showed a delay of 20–30 ms. Parents were especially slower in responding to weakly negative expressions, with a difference of approximately 55 ms.

Fig. 2. Reaction times (ms) of parents (n = 18) and non-parents (n = 20) to different facial expressions. Significance refers to group differences.
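For illustration, the 2 × 2 within-subject structure of this RT analysis (polarity × intensity) can be sketched in Python as follows. The data are simulated, with cell means loosely based on Table 2; note that the study's full design also crossed the between-subject factors gender and parental status, which statsmodels' AnovaRM does not support, so this sketch does not reproduce the reported F tests.

```python
# Sketch of a polarity x intensity repeated-measures ANOVA on simulated RTs.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(1)

# hypothetical per-condition mean RTs (ms), loosely based on Table 2
cell_means = {("positive", "strong"): 644,   # pleasure
              ("positive", "weak"): 717,     # comfort
              ("negative", "weak"): 681,     # discomfort
              ("negative", "strong"): 613}   # distress

rows = [{"subject": s, "polarity": pol, "intensity": inten,
         "rt": cell_means[(pol, inten)] + rng.normal(0, 60)}
        for s in range(38)                   # 38 viewers, as in the study
        for (pol, inten) in cell_means]

df = pd.DataFrame(rows)
print(AnovaRM(df, depvar="rt", subject="subject",
              within=["polarity", "intensity"]).fit())
```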
3.2. ERP data

Fig. 3 shows the grand average ERPs for the four types of facial expression recorded at anterior and posterior scalp sites in female and male viewers. A considerable effect of gender on sensitivity to the emotional states conveyed by infant facial expressions is evident (see also Table 3 for all significant effects involving the gender and parental status factors).

Fig. 3. Grand averages of ERPs recorded at left and right lateral occipital (OL, OR), central (C3, C4), parietal (P3, P4), and prefrontal (PF1, PF2) sites in women (A) (n = 19) and men (B) (n = 19) in response to four types of infant facial expression.

Table 3. Statistically significant effects identified by ANOVA for each ERP component of interest. All effects involve the gender and/or parenthood factors (as main effects or interactions).

  ERP component   ANOVA factor(s)                               F-value   p-value
  P110            Gender                                        5.51      <0.025
                  Hemisphere × gender × parenthood              4.63      <0.038
  N160            Gender × parenthood                           4.82      <0.0351
                  Gender × hemisphere                           5.01      <0.0318
  N245            Parenthood × facial expression                4.53      <0.071
  P300            Gender × parenthood                           7.02      <0.0121
  Occipital P3    Gender × parenthood                           7.64      <0.0092
                  Parenthood × facial expression                3.56      <0.0168
                  Parenthood × facial expression × hemisphere   2.92      <0.0376
  Central P3      Gender × parenthood                           4.45      <0.0423

3.2.1. P110

ANOVA of P110 amplitude revealed a strong lateralization of this early response to the right visual cortex, as indicated by the significant effect of hemisphere (F(1, 34) = 10.46, p < 0.003). On average, the P110 was larger at the right (10.37 μV) than at the left (8.98 μV) lateral occipital site. Overall, it was also much larger in women (11.05 μV, S.E. = 0.83) than in men (8.30 μV, S.E. = 0.83), irrespective of parenthood, as indicated by the significant gender main effect (F(1, 34) = 5.51, p < 0.025) and as shown in the topographic maps in Fig. 4. The significant three-way interaction of hemisphere, gender, and parental status (F(1, 34) = 4.63, p < 0.04) revealed an additional effect of parental status restricted to women and to P1 lateralization. Indeed, mothers were the only group in which the response was bilateral (OL = 10.75 μV [S.E. = 1.3], OR = 10.76 μV [S.E. = 1.4]), whereas the P110 component was smaller over the left than the right hemisphere in non-parent women and in men (on average, OL = 8.08 μV, OR = 9.93 μV).

Fig. 4. Time series of isocolour topographic maps of brain activity recorded in women and men (irrespective of infant facial expression) for the P110 component. Scalp potential values were computed between 80 and 120 ms with a step of 10 ms. The P110 response was earlier, larger, and less lateralized in women than in men.

3.2.2. N160

The amplitude of the N160 component was very sensitive to the category of facial expression (F(3, 102) = 4.26, p < 0.0071). Post hoc comparisons showed that the N160 elicited by strongly negative expressions (distress and pain) was significantly larger (−2.46 μV) than that elicited by weakly negative expressions (−1.45 μV), while no significant difference was found between positive emotional states. ANOVA also yielded a significant interaction of gender and parental status (F(1, 34) = 4.82, p < 0.0351). Post hoc comparisons revealed a significant difference in N160 amplitude between female and male parents (mothers versus fathers: 0.47 versus −4.0 μV), whereas female and male non-parents did not differ in their response to infant faces at this stage of processing. Furthermore, ANOVA revealed a significant interaction of gender and hemisphere (F(1, 34) = 5.01, p < 0.0318). Fig. 5 shows the scalp distribution of voltage recorded in the N160 latency range according to gender, with the corresponding ERP waveforms. The main gender difference is a stronger lateralization toward the right hemisphere in males: post hoc comparisons showed that left-hemisphere activation in males (−1.77 μV) and females (−2.21 μV) fully overlapped, whereas the N160 recorded in males was significantly larger over the right hemisphere (−3.37 μV) than that observed in females (−0.62 μV).

Fig. 5. Back view of voltage isocolour topographic maps of brain activity recorded in women and men (irrespective of infant facial expression). Scalp potential values were computed at the N160 peak (157–160 ms).
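The N160 lateralization findings rest on comparing the component's amplitude at homologous left and right electrodes within its latency window. Below is a schematic of such a measurement; the waveforms are synthetic and the amplitudes invented, not taken from the study.

```python
# Sketch: mean amplitude in a component's latency window at left/right sites,
# and the resulting right-minus-left asymmetry. Synthetic data throughout.
import numpy as np

rng = np.random.default_rng(2)
sfreq = 500
times = np.arange(-0.1, 0.5, 1 / sfreq)
window = (times >= 0.150) & (times <= 0.170)        # N160-like window, 150-170 ms

def mean_amplitude(erp, mask):
    """Mean voltage (uV) of an ERP waveform inside a latency window."""
    return erp[mask].mean()

shape = np.exp(-((times - 0.16) ** 2) / (2 * 0.02 ** 2))
erp_left = -1.8 * shape + rng.normal(0, 0.1, times.size)    # toy OL waveform
erp_right = -3.4 * shape + rng.normal(0, 0.1, times.size)   # toy OR waveform

amp_l = mean_amplitude(erp_left, window)
amp_r = mean_amplitude(erp_right, window)
print(f"OL = {amp_l:.2f} uV, OR = {amp_r:.2f} uV, "
      f"asymmetry (R - L) = {amp_r - amp_l:.2f} uV")
```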
3.2.3. N245

The N2 component, which appears at the latency where initial coding of positive facial expressions occurs, was analysed with two separate five-way repeated-measures ANOVAs, one for central and one for prefrontal electrode sites. At the central sites, facial expression produced a highly significant effect (F(3, 102) = 6.28, p < 0.0006). Post hoc comparisons showed that the N2 voltage elicited by strongly positive expressions (pleasure = −2.43 μV) was less negative than that elicited by weakly positive (comfort = −3.20 μV, p < 0.02) and weakly negative expressions (discomfort = −3.45 μV, p < 0.003); weakly positive and weakly negative expressions likewise elicited more negative N2 responses than strongly negative expressions (−2.27 μV, p < 0.001). Thus, significant differences were observed between expressions of strong versus weak intensity. Simple effects analysis showed an interaction of parenthood and facial expression. The N2 component in parents was strongly sensitive to facial expression (F(3, 48) = 4.53, p < 0.071): their N2 response to weakly negative expressions (discomfort = −3.48 μV) differed significantly (p < 0.0005) from their N2 response to strongly negative expressions (distress or pain = −1.8 μV). N2 responses in non-parents were not significantly modulated by emotional expression (Fig. 6).

Fig. 6. Grand averages of the ERPs recorded at left and right central sites (C3, C4) of non-parents (upper) and parents (lower) following presentation of infant facial expressions of discomfort (dashed line) and distress (solid line).

At prefrontal sites the N2 component was very sensitive to facial expression (F(3, 102) = 11.03, p < 0.0001). This component was more negative in response to expressions of weak intensity (comfort = −1.24 μV, discomfort = −1.55 μV) than of strong intensity (pleasure = −0.13 μV, distress = 0.040 μV). No effect of parenthood or gender was observed over the prefrontal area.

3.2.4. P300

As is evident from the ERP waveforms (Fig. 3A and B), the P300 component was extremely sensitive to facial expression (F(3, 102) = 3.34, p < 0.0001). At this latency each emotional expression elicited a P300 of unique mean area. Distress elicited by far the largest response (6.36 μV), significantly greater than that elicited by pleasure (5.13 μV). Overall, expressions of strong intensity induced a larger P300 than did emotions of weaker intensity: the weakly negative expression (discomfort, 4.27 μV) elicited a smaller P300 than pleasure but a larger one than comfort (3.29 μV). ANOVA revealed a significant gender by parenthood interaction (F(1, 34) = 7.02, p < 0.0121). As can be observed in Fig. 7, mothers showed a larger P300 (7.02 μV) than any other group and, notably, a much larger P300 than fathers (3.17 μV).

Fig. 7. Grand averages of the ERPs recorded at left and right lateral occipital sites following presentation of infant facial expressions of strongly negative emotion, according to viewer group. Smaller P300 amplitudes were recorded in fathers than in mothers, especially for infant expressions of suffering.

ANOVA also revealed a significant interaction between facial expression and electrode: at central sites the P300 discriminated emotions of strong and weak intensity more finely. The P300 response to distress and pleasure was significantly larger at central (distress = 7.08 μV, pleasure = 5.5 μV) than at occipital sites (distress = 5.6 μV, pleasure = 4.7 μV), whereas the response to expressions of weaker intensity did not differ between the two scalp sites. There was also a significant interaction of parenthood, facial expression, electrode, and hemisphere (F(3, 102) = 2.92, p < 0.0376). To better understand this complex interaction, and given the significance of the electrode factor, separate four-way repeated-measures ANOVAs were performed on the P300 values recorded at occipital and central sites, using the same factors as the overall ANOVA with the exception of electrode.
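The decomposition strategy just described, running separate ANOVAs per scalp site once an interaction with the electrode factor is found, can be sketched as follows. The data are simulated, and the within-subject design is reduced to a single expression factor for brevity, whereas the study's follow-up ANOVAs were four-way.

```python
# Sketch: follow-up repeated-measures ANOVAs run separately per scalp site.
import numpy as np
import pandas as pd
from statsmodels.stats.anova import AnovaRM

rng = np.random.default_rng(3)
rows = [{"subject": s, "site": site, "expression": expr,
         "p300": rng.normal(5.0, 1.5)}          # invented amplitudes (uV)
        for s in range(38)
        for site in ("occipital", "central")
        for expr in ("pleasure", "comfort", "discomfort", "distress")]
df = pd.DataFrame(rows)

# one follow-up ANOVA per scalp site, mirroring the strategy in the text
for site, sub in df.groupby("site"):
    res = AnovaRM(sub, depvar="p300", subject="subject",
                  within=["expression"]).fit()
    print(f"--- {site} ---")
    print(res)
```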
3.2.5. Occipital P300

The occipital P300 was greatly affected by facial expression (F(3, 102) = 18.23, p < 0.0001). As in the previous analysis, strongly negative facial expressions elicited the largest P300 (5.64 μV), followed by strongly positive (4.77 μV), weakly negative (4.21 μV), and weakly positive or neutral expressions (3.13 μV). At occipital sites there was a highly significant interaction of gender and parenthood (F(1, 34) = 7.64, p < 0.0092): mothers showed a larger P300 (7.46 μV) than fathers (2.78 μV) and non-parents. Furthermore, a significant interaction between parenthood and facial expression (F(3, 102) = 3.56, p < 0.0168) indicated that the mean P300 was larger in parents (irrespective of gender) than in non-parents for all facial expressions, but particularly for strongly negative ones (parents = 6.90 μV; non-parents = 4.39 μV). Tukey post hoc comparisons showed that the non-parents' P300 was of equal amplitude for all emotions (distress = 4.39 μV, discomfort = 4.33 μV, pleasure = 4.10 μV) except comfort (2.44 μV), whereas the parents' P300 was consistently larger for expressions of pleasure (5.44 μV) than of comfort (3.82 μV), and larger still for distressed faces (6.90 μV) (see the waveforms in Fig. 8).

Fig. 8. Grand averages of the ERPs recorded at left and right lateral occipital (OL, OR), central (C3, C4), parietal (P3, P4), and prefrontal (PF1, PF2) sites in parents (n = 18) (A) and non-parents (n = 20) (B) in response to four types of facial expression. Note that while the ERPs of non-parents indicate a gross discrimination between neutral and emotional expressions, those of parents exhibit a finer discrimination of facial expression and a stronger empathic response graded by the intensity of the emotion displayed by the infants.

The interaction of parenthood, facial expression, and hemisphere (F(3, 102) = 2.92, p < 0.0376) was also significant. Non-parents showed greater right-hemisphere involvement in the P300 response to negative emotions (distress: left = 4.21 μV, right = 4.57 μV; discomfort: left = 3.77 μV, right = 4.41 μV), whereas parents showed equal involvement of both hemispheres in the processing of these expressions.
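The Tukey post hoc comparisons reported for the occipital P300 can be sketched with statsmodels' pairwise_tukeyhsd. The amplitudes below are simulated around the condition means reported above, and the repeated-measures structure is ignored for simplicity, so this illustrates the form of the test rather than the paper's exact procedure.

```python
# Sketch: Tukey HSD comparisons across the four expression conditions,
# using simulated occipital P300 amplitudes (uV).
import numpy as np
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(4)
means = {"distress": 5.64, "pleasure": 4.77, "discomfort": 4.21, "comfort": 3.13}

amps, labels = [], []
for expr, mu in means.items():
    amps.extend(rng.normal(mu, 1.0, 38))     # 38 simulated viewers per condition
    labels.extend([expr] * 38)

print(pairwise_tukeyhsd(np.asarray(amps), np.asarray(labels)))
```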
3.2.6. Central P300

The P300 component recorded at central sites was also very sensitive to facial expression (F(3, 102) = 35.33, p < 0.0001). Again, strongly negative facial expressions elicited the largest P300 (7.08 μV), followed by strongly positive (5.49 μV), weakly negative (4.33 μV), and weakly positive/neutral expressions (3.45 μV). The central P300 was also subject to a gender by parental status interaction (F(1, 34) = 4.45, p < 0.0423): post hoc comparisons showed that mothers demonstrated a larger central P300 (6.57 μV) than all other groups, including fathers (3.57 μV), irrespective of facial expression. Unlike the occipital P300, the central P300 was right-lateralized (left = 4.87 μV; right = 5.31 μV), as confirmed by a significant hemisphere effect (F(1, 34) = 3.93, p < 0.05), suggesting a role of the right hemisphere in the processing of infants' facial expressions.
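As a closing illustration, a two-level hemisphere effect like the one reported for the central P300 amounts to a paired left-versus-right comparison within viewers; a paired t-test is the two-level analogue of the ANOVA main effect used in the paper. The values below are simulated around the reported means (4.87 versus 5.31 μV) and are not the study's data.

```python
# Sketch: paired left-vs-right comparison of per-viewer P300 amplitudes.
import numpy as np
from scipy.stats import ttest_rel

rng = np.random.default_rng(5)
n = 38                                        # number of viewers, as in the study
left = rng.normal(4.87, 1.2, n)               # simulated left-site amplitudes (uV)
right = left + rng.normal(0.44, 0.8, n)       # right sites ~0.44 uV larger on average

t, p = ttest_rel(right, left)
print(f"right - left = {np.mean(right - left):.2f} uV, "
      f"t({n - 1}) = {t:.2f}, p = {p:.4f}")
```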