
International Journal of Psychological Research

Print version ISSN 2011-2084

int.j.psychol.res. vol.14 no.1, Medellín, Jan./Jun. 2021. Epub June 16, 2021

https://doi.org/10.21500/20112084.5032 

Research articles

Relationship Between Gender and Performance on Emotion Perception Tasks in a Latino Population

Relación entre género y desempeño en tareas de percepción de emociones en una población latina

Alvaro Cavieres1   

Rocío Maldonado1 

Amy Bland2 

Rebecca Elliott3 

1Departamento de Psiquiatría, Universidad de Valparaíso, Chile.

2Department of Psychology, Manchester Metropolitan University, UK.

3Neuroscience and Psychiatry Unit, Division of Neuroscience and Experimental Psychology, University of Manchester, UK.


Abstract.

Basic emotions are universally recognized, although differences across cultures and between genders have been described. We report results on two emotion recognition tasks in a sample of healthy adults from Chile. Methods: 192 volunteers (mean age 31.58 years, SD 8.36; 106 women) completed the Emotional Recognition Task, in which they were asked to identify a briefly displayed emotion, and the Emotional Intensity Morphing Task, in which they viewed faces with increasing or decreasing emotional intensity and indicated when they either detected or no longer detected the emotion. Results: All emotions were recognized at above-chance levels. The only sex differences observed were that men performed better at identifying anger (p = .0485) and responded more slowly to fear (p = .0057) than women. Discussion: These findings are consistent with some, though not all, prior literature on emotion perception. Crucially, we report data on emotion perception in a healthy adult Latino population for the first time, contributing to the emerging literature on cultural differences in affective processing.

Keywords: Facial Expression; Emotions; Sex Difference; Adult

Resumen.

Las emociones básicas son reconocidas universalmente, aunque se han descrito diferencias entre culturas y géneros. Reportamos resultados en dos tareas de reconocimiento de emociones, en una muestra de adultos sanos de Chile. Métodos: 192 voluntarios (31.58 años, d.e. 8.36; 106 mujeres) completaron la Emotional Recognition Task, en la que se pidió identificar una emoción exhibida brevemente, y la Emotional Intensity Morphing Task, en la que vieron caras con aumento o disminución de la intensidad emocional e indicando cuando detectaron o dejaron de detectar la emoción. Resultados: Todas las emociones fueron reconocidas en niveles superiores al azar. Las únicas diferencias por género, estadísticamente significativas, se encontraron en los hombres, identificando mejor el enojo (p = .0485) y reaccionando más lentamente al miedo (p = .0057). Discusión: nuestro estudio, además de confirmar hallazgos previos y discrepar con otros, agrega datos previamente inexistentes sobre la percepción emocional en una población latina adulta saludable.

Palabras Clave: Expresión facial; Emociones; diferencia de sexo; adulto.

1. Introduction

Humans can recognize the emotions of others and adjust their behavior accordingly from their first year of life (Hertenstein & Campos, 2004). Facial expressions play an important role in non-verbal communication, allowing almost immediate transmission of crucial information between individuals (Blair, 2003). Therefore, being able to perceive and recognize these expressions is considered essential for the experience of empathy (Gery et al., 2009), engaging in prosocial behavior (Marsh et al., 2007) and maintaining adequate psychosocial functioning (Niedenthal & Brauer, 2012).

Two main dimensions of the emotional perception of faces have been identified: valence and arousal (Russell, 1980; Schubert, 1999). While valence can be measured on a linear scale with pleasant/positive emotions at one end and unpleasant/negative emotions at the other, arousal represents the degree to which a face brings the observer to a state of heightened alertness (Vesker et al., 2018). While some authors have proposed a more efficient processing of positive emotional expressions (Leppänen & Hietanen, 2004), others report precisely the opposite findings (Vaish et al., 2008). However, most empirical research on the ability to identify the six basic expressions (happiness, anger, disgust, fear, sadness, and surprise) show that happiness is the easiest and fear the hardest to recognize, with the other expressions falling in between (Calvo & Lundqvist, 2008; Palermo & Coltheart, 2004; Recio et al., 2013).

The amygdala enables preferential processing of emotionally salient stimuli. Perception of potentially threatening faces is believed to be facilitated via the amygdala’s influence on cortical sensory processing, allowing fast and automatic behavioural responses and increasing the state of arousal of the individual. However, some studies show that activation of the amygdala can be dependent on attention in some circumstances, leading to the conclusion that the amygdala receives input about the emotional significance of stimuli both from cortical and subcortical pathways (Phelps & LeDoux, 2005).

Many studies have asked participants to identify or discriminate fully formed expressions. However, it is also possible to use morphing procedures to generate images of faces whose emotional expressions vary systematically in intensity. In general, emotion recognition performance decreases as the intensity of the expression decreases (Calder et al., 2000; Hess et al., 1997). Morphed faces can also be used to identify the intensity thresholds at which an expression becomes detectable or can no longer be identified (Fiorentini et al., 2012). Arguably, manipulating the intensity of expressions in this way allows for a more ecologically valid evaluation and increases the clinical utility of the measurements (Delicato, 2020).

It is frequently claimed that women are better than men at decoding emotions (Hall et al., 2000; Kret & de Gelder, 2012). Apart from social (Deaux & Major, 1987) and evolutionary explanations (Hampson et al., 2006), there are hypothesized differences between genders in patterns of brain activation associated with face processing (Lee et al., 2005) and reports of differential attention to the eyes (Hall et al., 2010). Gender differences in facial emotion recognition have been described in healthy individuals, with various studies (Donges et al., 2012; Hall & Matsumoto, 2004) and a meta-analysis (Thompson & Voyer, 2014) showing enhanced performance in women compared to men. Women have been reported to have a superior ability to decode non-verbal messages and to recognize emotions from facial expressions, including under conditions of limited stimulus information, for example, when facial expressions are displayed for less than a second or are subtle (Hoffmann et al., 2010). Women have also been found to have significantly lower response latencies than men when recognizing facial expressions (Wingenbach et al., 2018). However, the literature is not consistent, with some studies failing to observe these gender differences (Andric et al., 2016; Dores et al., 2020; Rahman et al., 2004).

It is widely accepted that the basic emotions defined by Ekman (1972), that is, happiness, sadness, fear, disgust, anger, contempt, and surprise, are universally recognized (Cordaro et al., 2019; Elfenbein & Ambady, 2002), although people tend to be better at recognizing expressions in faces of their own race than in those of members of other races (Yan et al., 2016). However, there may be important differences in emotion recognition across cultures. For instance, women from France, but not from Brazil, performed better than men in a recognition task (de Souza et al., 2018); Tu et al. (2018) concluded that East Asians perceive a different dimensionality of emotions than Western-based definitions; and Mishra et al. (2018), comparing a Dutch and an Indian sample, found subtle but significant differences in ratings of emotional parameters. Combining behavioral and computational analyses of eye movements, Jack et al. (2009) question the universality of human facial expressions of emotion by showing that Eastern observers persistently fixate on the eye region, while Westerners distribute their fixations evenly across the face.

Here we report the results of the comparison between genders in two emotion recognition tasks in a sample of healthy adults from Valparaíso, Chile: we examined their performance in the Emotional Recognition Task (ERT) and the Emotional Intensity Morphing Task (EIMT), both extracted from the EMOTICOM Neuropsychological Test Battery (Bland et al., 2016). This allows for comparison of performance using fully formed expressions and with varying intensity of emotions using validated tasks with excellent reliability (Bland et al., 2016).

The effect of gender on emotion processing seems to be culture-specific and requires further study. Based on published results from other populations, we expected to find better emotion perception in women than in men in our sample of Latino participants. Finally, we compared our results with those obtained in the validation study of the tasks, conducted in the UK (Bland et al., 2016).

2. Methods

The Ethics Committee of the Universidad de Valparaíso, Chile, approved the study protocol (028/2017). All participants provided written informed consent after the study procedures were explained.

2.1 Participants

Participants were healthy volunteers of both sexes, aged between 18 and 50 years. A total of 192 volunteers were included (mean age 31.58 years; SD 8.36; 106 women). Exclusion criteria were a history of psychiatric disorders, significant somatic illness, brain trauma, use of psychotropic medication, or a significant lifetime history of drug abuse. The absence of a current psychiatric disorder was confirmed with the Mini-International Neuropsychiatric Interview (MINI; Sheehan et al., 1998) and the Brief Symptom Inventory (BSI; Derogatis, 1983). Participants were reimbursed for their time.

2.2 Design

Eligible participants were invited to attend a one-hour appointment. Participants completed the two tasks programmed in PsychoPy (Peirce, 2007) on a touchscreen laptop (Dell Inspiron 11). The tasks were administered in a quiet testing room over 20 min.

2.3 Tasks

2.3.1 Emotional Recognition Task (ERT), Full Faces Version

Emotional faces (happiness, sadness, anger, fear) were briefly presented on the screen (250 ms), and participants were asked to identify the emotion. The facial stimuli were composite images generated from 20 individual white male and female faces of Caucasian origin, each showing a facial expression for each emotion. These images were used as endpoints to generate a linear morph sequence consisting of images that change incrementally from 0 (ambiguous) to 10 (full emotional expression). In each trial, a face is displayed on the screen for 250 ms, followed by a screen with four buttons from which participants choose happy, sad, anger, or fear. Participants touch the button on the screen to indicate their answer and have up to 10 seconds to do so before the response options are removed from the screen. Each intensity level is displayed twice for each of the four emotions, for a total of 80 trials. Time to administer: 12 min. Outcome measures: accuracy scores and reaction times were calculated for each facial emotion (happiness, sadness, anger, and fear). We also calculated an affective bias score by subtracting accuracy for sad faces from accuracy for happy faces.
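As an illustration of these outcome measures, the following sketch computes per-emotion accuracy, median reaction time, and the affective bias score from trial-level data; the data frame and its column names are hypothetical conveniences for the example, not part of the original task code.

```python
import pandas as pd

# Hypothetical trial-level ERT data: one row per trial.
# Column names ("participant", "emotion", "correct", "rt") are assumptions.
trials = pd.DataFrame({
    "participant": [1, 1, 1, 1, 2, 2, 2, 2],
    "emotion": ["happiness", "sadness", "anger", "fear"] * 2,
    "correct": [1, 1, 0, 1, 1, 0, 1, 1],             # 1 = correct identification
    "rt": [1.1, 1.8, 1.9, 1.6, 1.3, 1.7, 1.5, 1.4],  # seconds
})

# Accuracy (%) and median reaction time per participant and emotion.
summary = trials.groupby(["participant", "emotion"]).agg(
    accuracy=("correct", lambda x: 100 * x.mean()),
    median_rt=("rt", "median"),
)

# Affective bias = accuracy for happy faces minus accuracy for sad faces.
accuracy = summary["accuracy"].unstack("emotion")
affective_bias = accuracy["happiness"] - accuracy["sadness"]
print(summary)
print(affective_bias)
```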

2.3.2 Emotional Intensity Morphing Task

This task assesses the point of emotional intensity at which participants can recognize a facial emotion. Participants view faces that either increase or decrease in emotional intensity and are instructed to respond when they either (a) detect the presence of the emotion or (b) no longer detect it. The emotion to be detected was made explicit to participants. The task includes five different emotions: happiness, sadness, anger, fear, and disgust. There are 15 levels of intensity ranging from 1 (neutral) to 15 (maximum intensity), each displayed for 500 ms. Half of the faces are female and half are male, all of Caucasian origin. The faces were displayed in series of ten emotions of increasing intensity (starting at neutral) followed by ten of decreasing intensity (ending at neutral). Each emotion was displayed four times, completing four series (40 faces in total). The order of the emotions was random, and emotions were separated from each other by a reminder of the instruction (3 s). Time to administer: 5 min. Outcome measures: the point of detection was calculated as the level of intensity of the facial expression needed to detect (increasing) or no longer detect (decreasing) each emotion.
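For clarity, the sketch below shows one way such a detection point could be derived from a response log: the intensity level at the moment of response, averaged over the repetitions of each emotion and direction. The data structure and field names are assumptions made for this example, not the task's actual implementation.

```python
# Hypothetical EIMT response log; each entry records the intensity level
# (1-15) at which the participant pressed the button, or None if they did not.
trials = [
    {"emotion": "happiness", "direction": "increasing", "response_level": 9},
    {"emotion": "happiness", "direction": "increasing", "response_level": 8},
    {"emotion": "happiness", "direction": "decreasing", "response_level": 11},
    {"emotion": "happiness", "direction": "decreasing", "response_level": 12},
]

def detection_point(trials, emotion, direction):
    """Average intensity level at which the emotion was detected (increasing
    series) or no longer detected (decreasing series)."""
    levels = [t["response_level"] for t in trials
              if t["emotion"] == emotion
              and t["direction"] == direction
              and t["response_level"] is not None]
    return sum(levels) / len(levels) if levels else None

print(detection_point(trials, "happiness", "increasing"))   # 8.5
print(detection_point(trials, "happiness", "decreasing"))   # 11.5
```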

2.4 Analysis

All analyses were carried out using Stata 15. Statistical significance was set at α = .05 (95% confidence level). Normality was verified with the Shapiro-Wilk test. Accuracy rates, reaction times, and detection points are described as means and standard deviations or as medians and interquartile ranges, depending on whether or not the data were normally distributed. The statistical significance of differences in accuracy rates, reaction times, and detection points between emotions was determined with the Wilcoxon test.

For both tasks, associations between normally distributed scores and the gender of responders were evaluated with Student's t-test; otherwise, the Mann-Whitney test was used. Correlations between scores and age (included as a covariable) were analyzed with Spearman's test.
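To make this test-selection logic concrete, here is a minimal Python sketch of the same procedure on simulated data (the reported analyses were run in Stata 15, so this is only an illustrative equivalent with hypothetical variable names and values).

```python
import numpy as np
from scipy import stats

# Simulated scores for one measure (e.g. accuracy for a single emotion).
rng = np.random.default_rng(0)
score_women = rng.normal(70, 10, size=106)
score_men = rng.normal(68, 10, size=86)
age = rng.integers(18, 51, size=192)
scores_all = np.concatenate([score_women, score_men])

# Shapiro-Wilk normality check decides between Student's t and Mann-Whitney.
_, p_norm_women = stats.shapiro(score_women)
_, p_norm_men = stats.shapiro(score_men)
if p_norm_women > .05 and p_norm_men > .05:
    stat, p = stats.ttest_ind(score_women, score_men)           # Student's t
else:
    stat, p = stats.mannwhitneyu(score_women, score_men,
                                 alternative="two-sided")        # Mann-Whitney
print(f"gender comparison: statistic = {stat:.3f}, p = {p:.4f}")

# Correlation between scores and age: Spearman's rho.
rho, p_rho = stats.spearmanr(scores_all, age)
print(f"age correlation: rho = {rho:.3f}, p = {p_rho:.4f}")
```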

3. Results

Emotional Recognition Task (ERT): Accuracy and reaction time did not follow a normal distribution. Accuracy rates, in descending order, were as follows: happiness had a median of 90% (IQR 80-95); fear a median of 80% (IQR 75-85); sadness a median of 70% (IQR 60-80); and anger a mean of 57.5% (SD 13.99). Median reaction times were 1.27 s (IQR 1.04-1.55) for happiness, 1.60 s (IQR 1.41-1.92) for fear, 1.77 s (IQR 1.52-2.05) for sadness, and 1.76 s (IQR 1.46-2.12) for anger. Comparisons of accuracy rates and reaction times show clear differences between emotions, with happiness being the most easily recognized (Table 1). Statistically significant differences between genders were found only in accuracy for anger (p = .0485), where men performed better than women, and in reaction time to sadness (p = .0057), where women responded faster (Table 2). Age correlated negatively with recognition of anger (Rho = -.175, p = .015) and fear (Rho = -.282, p < .001), and positively with reaction time for fear (Rho = .265, p < .001). When this analysis was conducted separately by gender, the correlation between age and accuracy for anger (Rho = -.257, p = .008) and fear (Rho = -.337, p < .001) held only in women, whereas the correlation between age and reaction time for fear held only in men (Rho = .371, p < .001). The mean affective bias was 19.53 (SD 21.28), with no statistically significant differences by gender (Table 3).

Table 1 Comparison of means in Accuracy rates and Reaction Times for different expressions in the Emotional Recognition Task (ERT) 

             Sadness                  Anger                    Fear
             Acc         RT           Acc         RT           Acc          RT
Happiness    Z = 9.80    Z = 10.43    Z = 11.92   Z = 10.62    Z = 6.46     Z = 9.12
             p < .01     p < .001     p < .001    p < .001     p < .001     p < .001
Sadness                               Z = 6.50    Z = 0.12     Z = 6.91     Z = 4.27
                                      p < .001    p = .90      p < .001     p < .001
Anger                                                          Z = 11.21    Z = 3.76
                                                               p < .001     p < .001

Note. The Wilcoxon test was used in all comparisons. Acc = Accuracy; RT = Reaction Time.

Table 2: Accuracy rates and reaction time in the Emotional Recognition Task (ERT) 

                 Total             Women             Men               p
Acc Happiness    Median: 90        Median: 90        Median: 90        .2964 (1)
                 IQR: 80-95        IQR: 80-95        IQR: 80-95        Z = 1.044
Acc Sadness      Median: 70        Median: 70        Median: 70        .8267 (1)
                 IQR: 60-80        IQR: 60-85        IQR: 60-80        Z = -.219
Acc Anger        Mean: 57.5        Mean: 55.7        Mean: 59.71       .0485 (2)*
                 SD: 13.99         SD: 14.61         SD: 12.93         t = 1.99, df = 190
Acc Fear         Median: 80        Median: 82.5      Median: 80        .6259 (1)
                 IQR: 75-85        IQR: 75-85        IQR: 75-85        Z = .487
RT Happiness     Median: 1.27      Median: 1.25      Median: 1.28      .9583 (1)
                 IQR: 1.04-1.55    IQR: 1.04-1.56    IQR: 1.04-1.53    Z = .052
RT Sadness       Median: 1.77      Median: 1.73      Median: 1.88      .0057 (1)
                 IQR: 1.52-2.05    IQR: 1.45-1.96    IQR: 1.57-2.1     Z = 1.763
RT Anger         Median: 1.76      Median: 1.73      Median: 1.77      .3917 (1)
                 IQR: 1.46-2.12    IQR: 1.45-2.09    IQR: 1.5-2.14     Z = .857
RT Fear          Median: 1.60      Median: 1.57      Median: 1.65      .0539 (1)
                 IQR: 1.46-1.92    IQR: 1.38-1.8     IQR: 1.45-2.08    Z = 1.927
Affective Bias   Mean: 19.53       Mean: 18.44       Mean: 20.87       .4330 (2)
                 SD: 21.28         SD: 21.93         SD: 20.48         t = .7858, df = 190

Note. (1) Mann-Whitney test; (2) t-test. Acc = Accuracy; RT = Reaction Time; IQR = Interquartile Range; SD = Standard Deviation. *p < .05.

Emotional Intensity Morphing Task: Data were normally distributed. Mean detection points for increasing emotional intensity were 5.2 (SD 1.09) for disgust; 8.89 (SD 2.35) for happiness; 10.09 (SD 2.1) for anger; 10.6 (SD 2.12) for sadness; and 10.63 (SD 2.28) for fear. With decreasing emotional intensity, mean detection points were as follows: 6.07 (SD 1.167) for disgust; 11.05 (SD 2.03) for sadness; 11.1 (SD 2.43) for happiness; 11.43 (SD 2.07) for anger; and 11.48 (SD 1.93) for fear (Table 4). Pairwise comparisons of the emotional recognition point between all expressions, at increasing and decreasing intensity, are shown in Table 5 and reveal that all emotions differed significantly (Bonferroni-corrected threshold p = .0025, i.e., α = .05 divided by the 20 pairwise comparisons; all significant p-values were below this limit). There were no statistically significant effects of gender or associations with age for the detection points of any emotion in the increasing intensity condition. With decreasing intensity, there was a positive correlation between age and the detection points for sadness (p = .007, coef. 0.21) and fear (p = .047, coef. 0.15): older participants recognised these emotions at a higher intensity than younger participants.

Table 3 Correlation between accuracy rate, reaction time and age of participants in the Emotional Recognition Task (ERT) 

                    Total               Male                Female
                    Rho        p        Rho        p        Rho        p
Acc. Happiness      -.012      .867     .012       .914     -.038      .696
Acc. Sadness        -.092      .202     -.075      .492     -.113      .249
Acc. Anger          -.175      .015*    -.106      .334     -.257      .008
Acc. Fear           -.282      <.001*   -.217      .044     -.337      <.001*
RT Happiness        .035       .628     .055       .613     .021       .828
RT Sadness          .042       .568     .130       .234     -.066      .489
RT Anger            .026       .720     .123       .260     -.050      .607
RT Fear             .265       <.001*   .371       <.001*   .159       .104
Affective Bias      .066       .367     .093       .393     .046       .641

Note. Spearman's test was used. Acc = Accuracy; RT = Reaction Time. *p < .05.

Table 4 Average detection point with increasing and decreasing intensity of expression in the Emotional Intensity Morphing Task 

Increasing Intensity Decreasing Intensity
Mean SD Mean SD
Fear 10.63 2.18 11.48 1.93
Sadness 10.60 2.12 11.05 2.03
Disgust 5.20 1.09 6.07 1.10
Anger 10.09 2.10 11.43 2.07
Happiness 8.89 2.35 11.10 2.43

Note. SD = Standard Deviation; IQR = Interquartile Range.

Comparison with the validation study: In the Emotional Recognition Task (ERT), the UK sample had significantly greater accuracy for sadness (p < .001) and marginally greater accuracy for fear (p = .049), with no effects of gender. In the Emotional Intensity Morphing Task, the Chilean sample was slightly less sensitive to fear (p < .04) but much more sensitive to disgust (p < .001).

4. Discussion

Our results did not confirm our hypothesis that Latino women would perform better than men in tasks that require emotion perception. With respect to accuracy, the only significant difference was in anger recognition (p = .049), with men performing better than women in the Emotional Recognition Task (ERT). A possible explanation for this finding is that men are more strongly encouraged to manifest aggressive behavior. With respect to response time, women were faster than men at recognizing fear. This heightened sensitivity has been explained by both biological and cultural factors (Campbell et al., 2002).

Affective bias is the tendency to differentially prioritise the processing of negative relative to positive events, and it is commonly observed in clinical and non-clinical populations; however, why such biases develop is not known (Pulcu & Browning, 2017). As expected, our results confirmed this tendency, with values nominally larger than those in the validation study of the ERT in the UK (Bland et al., 2016), but there were no differences between genders, supporting our general conclusion.

All the emotions used in the tasks were recognized at high percentages. This may be important, since women are supposed to have an advantage especially in more difficult tests. The Emotional Intensity Morphing Task (EIMT) allowed us to test this emotional sensitivity hypothesis, according to which women should be more sensitive than men to subtle cues in emotional expressions (Fischer et al., 2018). However, in agreement with more recent publications, we found no differences between the sexes when the intensity of the emotion increased or decreased (Johnson et al., 2007; Woolley et al., 2015).

Previous reports of female superiority may have been influenced by factors such as the specific emotions tested, their valence, the sex and ethnicity of the actor, and the age of the subjects, among others (Thompson & Voyer, 2014). A critical review of the literature (Forni-Santos & Osório, 2015) concludes that women tend to perform better than men when all emotions are considered as a set. Regarding specific emotions, there seem to be no gender-related differences in the recognition of happiness, whereas results are quite heterogeneous for the remaining emotions, especially sadness, anger, and disgust (Forni-Santos & Osório, 2015).

Although people are expected to be less accurate at recognizing emotions expressed by individuals from a different cultural background, all emotions used in the study were recognized at percentages well above chance levels, confirming the universality of the basic expressions identified by Ekman (Calvo & Lundqvist, 2008; Palermo & Coltheart, 2004; Recio et al., 2013). Also, in line with previous results (Calvo & Nummenmaa, 2009), happiness was recognised more accurately and more quickly than all other emotions in our study, possibly indicating a shorter processing period (Recio et al., 2013), and also suggesting that negative expressions, having more characteristics in common, could be confused with each other (Hoffmann et al., 2010; Montagne et al., 2005). Anger, on the other hand, was the least well recognized of all the emotions. In the second task, we used computer-manipulated images, so that subjects observed a dynamic stimulus with an increase or decrease in the intensity of the expressed emotion, improving the ecological validity of the measurements. Again, happiness was among the most readily detected emotions, being detected at a lower intensity than all other emotions except disgust (Table 4).

Table 5 Comparison of the emotional recognition point for different expressions in the Emotional Intensity Morphing Task (at increasing and decreasing intensity) 

             Sadness                  Disgust                   Anger                      Happiness
             Incr       Decr          Incr        Decr          Incr        Decr           Incr        Decr
Fear         t = .24    t = 2.89      t = 33.70   t = 41.6      t = 3.73    t = .359       t = 9.71    t = 2.21
             p = .80    p < .001      p < .001    p < .001      p < .001    p = .72        p < .001    p = .03
Sadness                               t = 38.06   t = 35.87     t = 3.73    t = 2.70       t = 10.24   t = -.32
                                      p < .001    p < .001      p < .001    p < .001       p < .001    p = .75
Disgust                                                         t = 35.12   t = -41.22     t = 23.20   t = -30.14
                                                                p < .001    p < .001       p < .001    p < .001
Anger                                                                                      t = -8.62   t = 2.19
                                                                                           p < .001    p = .03

Note. T-tests for paired data were used in all comparisons; df = 163 for all comparisons. Incr. = Increasing intensity; Decr. = Decreasing intensity.

Emotions had to be identified in the first task, but they were made explicit in the second. It is nonetheless interesting that happiness was the most easily recognized expression among the static pictures and one of the easiest in the decreasing, though not in the increasing, intensity condition. Overall, subjects detected disgust at a lower intensity, and continued to detect it for longer, than any other emotion. This may be clinically important, since impaired recognition of disgust has been described as one of the first symptoms of neurodegenerative disease (Johnson et al., 2007; Woolley et al., 2015).

It is possible that the ethnicity of the actors could have influenced the results, but the high levels of recognition by the Chilean subjects make this less probable.

Along with general similarities, there are also interesting differences between our results and those of the validation study. Specifically, the UK sample had significantly greater accuracy for sadness, while the Chilean sample was much more sensitive to disgust. It is possible that cultural factors that inhibit or encourage the expression of facial emotions in a given social context may influence their recognition (Engelmann & Pogosyan, 2013).

The study has several limitations. Our original intention was to examine differences between genders in a group of healthy adults, so we limited inclusion to a rather narrow age range (18-50 years). However, this does not allow for further examination of the unexpected effects that age had on some of the results. Although the tasks we employed have already been used in studies across different populations, we feel that cultural differences may be better explored with faces of the same ethnicity as the participants. Also, the use of more spontaneous expressions could increase the ecological validity of the results. Overall, our study, in addition to confirming some previous findings and disagreeing with others, adds previously unavailable data on emotion perception in a healthy adult Latino population.

5. Conclusions

Facial emotion recognition is a key tool for establishing successful relationships with others. This ability depends on the proper functioning of the visuospatial system and on the capacity to unconsciously simulate and imitate the motor aspects involved in the expressions seen. It is not fully acquired until after the first decade of life and depends on both innate and cultural factors; the universality of emotion expression and recognition therefore remains a controversial issue with respect to both gender and ethnic differences.

Traditionally, Latin-American culture has been associated with a more positive valuation of emotional expression as a form of communication, especially among women. However, our results did not show any superiority of women over men in emotion perception in our sample of a healthy adult Latino population. It is possible that previous reports of gender differences may have been influenced by methodological factors, including cultural differences, so these aspects deserve further study.

References

Andric, S., Maric, N. P., Knezevic, G., Mihaljevic, M., Mirjanic, T., Velthorst, E., & van Os, J. (2016). Neuroticism and facial emotion recognition in healthy adults. Early Intervention in Psychiatry, 10 (2), 160-164. https://doi.org/10.1111/eip.12212. [ Links ]

Blair, R. J. R. (2003). Facial expressions, their communicatory functions and neuro-cognitive substrates. Philosophical Transactions of the Royal Society B: Biological Sciences, 358 (1431), 561-572. https://doi.org/10.1098/rstb.2002.1220. [ Links ]

Bland, A. R., Roiser, J. P., Mehta, M. A., Schei, T., Boland, H., Campbell-Meiklejohn, D. K., Emsley, R. A., Munafo, M. R., Penton-Voak, I. S., Seara-Cardoso, A., Viding, E., Voon, V., Sahakian, B. J., Robbins, T. W., & Elliott, R. (2016). EMOTICOM: A neuropsychological test battery to evaluate emotion, motivation, impulsivity, and social cognition. Frontiers in Behavioral Neuroscience, 10, Article 25. https://doi.org/10.3389/fnbeh.2016.00025. [ Links ]

Calder, A. J., Rowland, D., Young, A. W., Nimmo-Smith, I., Keane, J., & Perrett, D. I. (2000). Caricaturing facial expressions. Cognition, 76 (2), 105-146. https://doi.org/10.1016/S0010-0277(00)00074-3. [ Links ]

Calvo, M. G., & Lundqvist, D. (2008). Facial expressions of emotion (KDEF): Identification under different display-duration conditions. Behavior Research Methods, 40 (1), 109-115. https://doi.org/10.3758/BRM.40.1.109. [ Links ]

Calvo, M. G., & Nummenmaa, L. (2009). Eye-movement assessment of the time course in facial expression recognition: Neurophysiological implications. Cognitive, Affective and Behavioral Neuroscience, 9 (4), 398-411. https://doi.org/10.3758/CABN.9.4.398. [ Links ]

Campbell, R., Elgar, K., Kuntsi, J., Akers, R., Terstegge, J., Coleman, M., & Skuse, D. (2002). The classification of “fear” from faces is associated with face recognition skill in women. Neuropsychologia, 40 (6), 575-584. https://doi.org/10.1016/S0028-3932(01)00164-6. [ Links ]

Cordaro, D. T., Sun, R., Kamble, S., Hodder, N., Monroy, M., Cowen, A., Bai, Y., & Keltner, D. (2019). The Recognition of 18 Facial-Bodily Expressions Across Nine Cultures. Emotion, 20 (7), 1292-1300. https://doi.org/10.1037/emo0000576. [ Links ]

Deaux, K., & Major, B. (1987). Putting Gender Into Context: An Interactive Model of Gender-Related Behavior. Psychological Review, 94 (3), 369-389. https://doi.org/10.1037/0033-295X.94.3.369. [ Links ]

Delicato, L. S. (2020). A robust method for measuring an individual’s sensitivity to facial expressions. Attention, Perception, and Psychophysics, 82 (6),2924-2936. https://doi.org/10.3758/s13414-020-02043-w. [ Links ]

Derogatis, L. R. (1983). The Brief Symptom Inventory: An Introductory Report. Psychological Medicine, 13 (3), 595-605. https://doi.org/10.1017/S0033291700048017. [ Links ]

de Souza, L. C., Bertoux, M., de Faria, Â. R. V., Corgosinho, L. T. S., Prado, A. C. D. A., Barbosa, I. G., Caramelli, P., Colosimo, E., & Teixeira, A. L. (2018). The effects of gender, age, schooling, and cultural background on the identification of facial emotions: A transcultural study. International Psychogeriatrics, 30 (12), 1861-1870. https://doi.org/10.1017/S1041610218000443. [ Links ]

Donges, U., Kersting, A., & Suslow, T. (2012). Women’s greater ability to perceive happy facial emotion automatically: Gender differences in affective priming. PLoS ONE, 7 (7), e41745. https://doi.org/10.1371/journal.pone.0041745. [ Links ]

Dores, A. R., Barbosa, F., Queirós, C., Carvalho, I. P., & Griffiths, M. D. (2020). Recognizing emotions through facial expressions: A large-scale experimental study. International Journal of Environmental Research and Public Health, 17 (20), Article 7420. https://doi.org/10.3390/ijerph17207420. [ Links ]

Ekman, P. (1972). Universals and Cultural Differences in Facial Expressions of Emotion. In J. Cole (Ed.), Nebraska Symposium on Motivation (Vol. 19) (pp. 207-282). University of Nebraska Press. [ Links ]

Elfenbein, H. A., & Ambady, N. (2002). On the universality and cultural specificity of emotion recognition: A meta-analysis. Psychological Bulletin, 128 (2), 203-235. https://doi.org/10.1037/0033-2909.128.2.203. [ Links ]

Engelmann, J. B., & Pogosyan, M. (2013). Emotion perception across cultures: The role of cognitive mechanisms. Frontiers in Psychology, 4, Article 118. https://doi.org/10.3389/fpsyg.2013.00118. [ Links ]

Fiorentini, C., Schmidt, S., & Viviani, P. (2012). The identification of unfolding facial expressions. Perception, 41 (5), 532-555. https://doi.org/10.1068/p7052. [ Links ]

Fischer, A. H., Kret, M. E., & Broekens, J. (2018). Gender differences in emotion perception and self-reported emotional intelligence: A test of the emotion sensitivity hypothesis. PLoS ONE, 13 (1), e0190712. https://doi.org/10.1371/journal.pone.0190712. [ Links ]

Forni-Santos, L., & Osório, F. L. (2015). Influence of gender in the recognition of basic facial expressions: A critical literature review. World Journal of Psychiatry, 5 (3), 342-351. https://doi.org/10.5498/wjp.v5.i3.342. [ Links ]

Gery, I., Miljkovitch, R., Berthoz, S., & Soussignan, R. (2009). Empathy and recognition of facial expressions of emotion in sex offenders, non-sex offenders and normal controls. Psychiatry Research, 165 (3), 252-262. https://doi.org/10.1016/j.psychres.2007.11.006. [ Links ]

Hall, J. A., Carter, J. D., & Horgan, T. G. (2000). Gender differences in nonverbal communication of emotion. In A. Fischer (Ed.), Gender and Emotion (pp. 97-117). Cambridge University Press. https://doi.org/10.1017/cbo9780511628191.006. [ Links ]

Hall, J. A., & Matsumoto, D. (2004). Gender differences in judgments of multiple emotions from facial expressions. Emotion, 4 (2), 201-206. https://doi.org/10.1037/1528-3542.4.2.201. [ Links ]

Hall, J. K., Hutton, S. B., & Morgan, M. J. (2010). Sex differences in scanning faces: Does attention to the eyes explain female superiority in facial expression recognition? Cognition and Emotion, 24 (4), 629-637. https://doi.org/10.1080/02699930902906882. [ Links ]

Hampson, E., van Anders, S. M., & Mullin, L. I. (2006). A female advantage in the recognition of emotional facial expressions: Test of an evolutionary hypothesis. Evolution and Human Behavior, 27 (6), 401-416. https://doi.org/10.1016/j.evolhumbehav.2006.05.002. [ Links ]

Hertenstein, M. J., & Campos, J. J. (2004). The retention effects of an adult’s emotional displays on infant behavior. Child Development, 75 (2), 595-613. https://doi.org/10.1111/j.1467-8624.2004.00695.x. [ Links ]

Hess, U., Blairy, S., & Kleck, R. E. (1997). The intensity of emotional facial expressions and decoding accuracy. Journal of Nonverbal Behavior, 21 (4), 241-257. https://doi.org/10.1023/A:1024952730333. [ Links ]

Hoffmann, H., Kessler, H., Eppel, T., Rukavina, S., & Traue, H. C. (2010). Expression intensity, gender and facial emotion recognition: Women recognize only subtle facial emotions better than men. Acta Psychologica, 135 (3), 278-283. https://doi.org/10.1016/j.actpsy.2010.07.012. [ Links ]

Jack, R. E., Blais, C., Scheepers, C., Schyns, P. G., & Caldara, R. (2009). Cultural Confusions Show that Facial Expressions Are Not Universal. Current Biology, 19 (18), 1543-1548. https://doi.org/10.1016/j.cub.2009.07.051. [ Links ]

Johnson, S. A., Stout, J. C., Solomon, A. C., Langbehn, D. R., Aylward, E. H., Cruce, C. B., Ross, C. A., Nance, M., Kayson, E., Julian-Baros, E., Hayden, M. R., Kieburtz, K., Guttman, M., Oakes, D., Shoulson, I., Beglinger, L., Duff, K., Penziner, E., & Paulsen, J. S. (2007). Beyond disgust: Impaired recognition of negative emotions prior to diagnosis in Huntington’s disease. Brain, 130 (7), 1732-1744. https://doi.org/10.1093/brain/awm107. [ Links ]

Kret, M. E., & de Gelder, B. (2012). A review on sex differences in processing emotional signals. Neuropsychologia, 50 (7), 1211-1221. https://doi.org/10.1016/j.neuropsychologia.2011.12.022. [ Links ]

Lee, T. M. C., Liu, H. L., Chan, C. C. H., Fang, S. Y., & Gao, J. H. (2005). Neural activities associated with emotion recognition observed in men and women. Molecular Psychiatry, 10 (5), 450-455. https://doi.org/10.1038/sj.mp.4001595. [ Links ]

Leppänen, J. M., & Hietanen, J. K. (2004). Positive facial expressions are recognized faster than negative facial expressions, but why? Psychological Research, 69 (1-2), 22-29. https://doi.org/10.1007/s00426-003-0157-2. [ Links ]

Marsh, A. A., Kozak, M. N., & Ambady, N. (2007). Accurate identification of fear facial expressions predicts prosocial behavior. Emotion, 7 (2), 239-251. https://doi.org/10.1037/1528-3542.7.2.239. [ Links ]

Mishra, M. V., Ray, S. B., & Srinivasan, N. (2018). Cross-cultural emotion recognition and evaluation of Radboud faces database with an Indian sample. PLoS ONE, 13 (10), e0203959. https://doi.org/10.1371/journal.pone.0203959. [ Links ]

Montagne, B., Kessels, R. P. C., Frigerio, E., de Haan, E. H. F., & Perrett, D. I. (2005). Sex differences in the perception of affective facial expressions: Do men really lack emotional sensitivity? Cognitive Processing, 6 (2), 136-141. https://doi.org/10.1007/s10339-005-0050-6. [ Links ]

Niedenthal, P. M., & Brauer, M. (2012). Social functionality of human emotion. Annual Review of Psychology, 63, 259-285. https://doi.org/10.1146/annurev.psych.121208.131605. [ Links ]

Palermo, R., & Coltheart, M. (2004). Photographs of facial expression: Accuracy, response times, and ratings of intensity. Behavior Research Methods, Instruments, and Computers, 36 (4), 634-638. https://doi.org/10.3758/BF03206544. [ Links ]

Peirce, J. W. (2007). PsychoPy-Psychophysics software in Python. Journal of Neuroscience Methods, 162 (1-2), 8-13. https://doi.org/10.1016/j.jneumeth.2006.11.017. [ Links ]

Phelps, E. A., & LeDoux, J. E. (2005). Contributions of the amygdala to emotion processing: From animal models to human behavior. Neuron, 48 (2), 175-187. https://doi.org/10.1016/j.neuron.2005.09.025. [ Links ]

Pulcu, E., & Browning, M. (2017). Affective bias as a rational response to the statistics of rewards and punishments. ELife, 6, e27879. https://doi.org/10.7554/eLife.27879. [ Links ]

Rahman, Q., Wilson, G. D., & Abrahams, S. (2004). Sex, sexual orientation, and identification of positive and negative facial affect. Brain and Cognition, 54 (3), 179-185. https://doi.org/10.1016/j.bandc.2004.01.002. [ Links ]

Recio, G., Schacht, A., & Sommer, W. (2013). Classification of dynamic facial expressions of emotion presented briefly. Cognition and Emotion, 27 (8), 1486-1494. https://doi.org/10.1080/02699931.2013.794128. [ Links ]

Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39 (6), 1161-1178. https://doi.org/10.1037/h0077714. [ Links ]

Schubert, E. (1999). Measuring emotion continuously: Validity and reliability of the two-dimensional emotion-space. Australian Journal of Psychology, 51 (3), 154-165. https://doi.org/10.1080/00049539908255353. [ Links ]

Sheehan, D. V., Lecrubier, Y., Sheehan, K., Amorim, P., Janavs, J., Weiller, E., Hergueta, T., Baker, R., & Dunbar, G. (1998). The Mini-International Neuropsychiatric Interview (M.I.N.I.): The Development and Validation of a Structured Diagnostic Psychiatric Interview for DSM-IV and ICD-10. The Journal of Clinical Psychiatry, 59 (20), 22-33. [ Links ]

Thompson, A. E., & Voyer, D. (2014). Sex differences in the ability to recognise non-verbal displays of emotion: A meta-analysis. Cognition and Emotion, 28 (7), 1164-1195. https://doi.org/10.1080/02699931.2013.875889. [ Links ]

Tu, Y. Z., Lin, D. W., Suzuki, A., & Goh, J. O. S. (2018). East Asian Young and Older Adult Perceptions of Emotional Faces From an Age- and Sex-Fair East Asian Facial Expression Database. Frontiers in Psychology, 9, Article 2358. https://doi.org/10.3389/fpsyg.2018.02358. [ Links ]

Vaish, A., Grossmann, T., & Woodward, A. (2008). Not All Emotions Are Created Equal: The Negativity Bias in Social-Emotional Development. Psychological Bulletin, 134 (3), 383-403. https://doi.org/10.1037/0033-2909.134.3.383. [ Links ]

Vesker, M., Bahn, D., Degé, F., Kauschke, C., & Schwarzer, G. (2018). Perceiving arousal and valence in facial expressions: Differences between children and adults. European Journal of Developmental Psychology, 15 (4), 411-425. https://doi.org/10.1080/17405629.2017.1287073. [ Links ]

Wingenbach, T. S. H., Ashwin, C., & Brosnan, M. (2018). Sex differences in facial emotion recognition across varying expression intensity levels from videos. PLoS ONE, 13 (1), e0190634. https://doi.org/10.1371/journal.pone.0190634. [ Links ]

Woolley, J. D., Strobl, E. V., Sturm, V. E., Shany-Ur, T., Poorzand, P., Grossman, S., Nguyen, L., Eckart, J. A., Levenson, R. W., Seeley, W. W., Miller, B. L., & Rankin, K. P. (2015). Impaired recognition and regulation of disgust is associated with distinct but partially overlapping patterns of decreased gray matter volume in the ventroanterior insula. Biological Psychiatry, 78 (7), 505-514. https://doi.org/10.1016/j.biopsych.2014.12.031. [ Links ]

Yan, X., Andrews, T. J., & Young, A. W. (2016). Cultural similarities and differences in perceiving and recognizing facial expressions of basic emotions. Journal of Experimental Psychology: Human Perception and Performance, 42 (3), 423-440. https://doi.org/10.1037/xhp0000114. [ Links ]

Declaration of data availability: All relevant data are within the article and its supporting information files.

How to Cite: Cavieres, A., Maldonado, R., Bland, A., & Elliott, R. (2021). Relationship Between Gender and Performance on Emotion Perception Tasks in a Latino Population. International Journal of Psychological Research, 14 (1), 106-114. https://doi.org/10.21500/20112084.5032

Received: September 27, 2020; Revised: November 24, 2020; Accepted: February 26, 2021

Corresponding author: Alvaro Cavieres. Email: cavieres.alvaro@gmail.com

Conflict of interests:

The authors have declared that there is no conflict of interest.

This is an open-access article distributed under the terms of the Creative Commons Attribution License.