
Autistic traits, alexithymia, and emotion recognition of human and anime faces

Published online by Cambridge University Press:  20 March 2025

Bridger J. Standiford
Affiliation:
Department of Psychological and Social Sciences, Pennsylvania State University, Abington, PA, USA
Kevin J. Hsu*
Affiliation:
Department of Psychological and Social Sciences, Pennsylvania State University, Abington, PA, USA
*
Corresponding author: Kevin J. Hsu; Email: [email protected]

Abstract

Individuals on the autism spectrum or with elevated autistic traits have shown difficulty in recognizing people’s facial emotions. They also tend to gravitate toward anime, a highly visual medium featuring animated characters whose facial emotions may be easier to distinguish. Because autistic traits overlap with alexithymia, or difficulty in identifying and describing feelings, alexithymia might explain the association between elevated autistic traits and difficulty with facial emotion recognition. The present study used a computerized task to first examine whether elevated autistic traits in a community sample of 247 adults were associated with less accurate emotion recognition of human but not anime faces. Results showed that individuals higher in autistic traits performed significantly worse on the human facial emotion recognition task, but no better or worse on the anime version. After controlling for alexithymia and other potentially confounding variables, autistic traits were no longer associated with performance on the facial emotion recognition tasks. However, alexithymia remained a significant predictor and fully mediated the relationship between autistic traits and emotion recognition of both human and anime faces. Findings suggest that interventions designed to help individuals on the autism spectrum with facial emotion recognition might benefit from targeting alexithymia and employing anime characters.

Type
Regular Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
© The Author(s), 2025. Published by Cambridge University Press

Individuals with autism spectrum conditions (ASC) experience clinically significant difficulty in their social or emotional communication, which can include marked impairment in the appropriate use of non-verbal signals (e.g., facial expressions), recognition or sharing of emotions, and responses to the emotions of others (American Psychiatric Association, 2013). Unusually narrow, rigid, or atypical interests are another characteristic of individuals with ASC. Although ASC describe a neurodevelopmental condition and a psychiatric diagnosis, they represent the clinical extreme of a broad spectrum of neurodiversity. Indeed, autistic traits are commonly understood to occur along a continuum not only within clinical populations, but also among the general population (Allison et al., 2012; Baron-Cohen et al., 1997; Ruzich et al., 2016).

Because emotion recognition is key to effective social interactions, research has focused on understanding the characteristics and limitations of this skill in individuals with ASC or elevated autistic traits. On the one hand, many studies have demonstrated that individuals with ASC, or elevated autistic traits, have more difficulty interpreting the facial emotions of others than do those without ASC, or those who are low in autistic traits (Actis-Grosso et al., 2015; Ashwin et al., 2006; Celani et al., 1999; Clark et al., 2008; Golan et al., 2006; Hobson et al., 1989; Ingersoll, 2010; Macdonald et al., 1989; McKenzie et al., 2018; Philip et al., 2010; Poljac et al., 2013; Sucksmith et al., 2013; Wallace et al., 2008).
On the other hand, some research has failed to demonstrate a relationship between ASC or autistic traits and facial emotion recognition (Baron-Cohen et al., 1997; Harms et al., 2010; Jones et al., 2011; Miyahara et al., 2007; Piggot et al., 2004; Tracy et al., 2011; see Yeung, 2022, for a recent review). Uljarevic and Hamilton (2013) conducted a meta-analysis of 48 studies comparing those formally diagnosed with ASC and those without ASC on their ability to accurately identify the emotions of human faces in behavioral tasks. Although the behavioral tasks used to assess emotion recognition varied from study to study, they most commonly involved asking participants to identify the emotions displayed in a series of faces, or to select the faces matching a series of emotions. After correcting for publication bias, the results of this meta-analysis supported the evidence that individuals with ASC show more difficulty with facial emotion recognition (Hedges’ g = 0.41).

Despite previous research converging on the general finding that ASC and elevated autistic traits are associated with deficits in facial emotion recognition, it remains unclear why only some individuals on the autism spectrum experience difficulty with recognizing facial emotions, and why those individuals developed this difficulty. In their meta-analysis, Uljarevic and Hamilton (2013) found that factors such as age, intelligence (IQ), and type of behavioral task used in a study did not explain the deficits in facial emotion recognition found among individuals with ASC. They suggested instead that a lack of attending to some important features of human faces, such as the eyes, might explain these deficits. Indeed, a different line of research has revealed that individuals with ASC or elevated autistic traits may be less oriented to human faces (Black et al., 2017; Dawson et al., 2005; Guillon et al., 2016; Klin et al., 2002; Pelphrey et al., 2002; Vettori et al., 2020) and find viewing them less rewarding (Chevallier et al., 2012; Cuve et al., 2018; Silva et al., 2015; Stuart et al., 2023; Tottenham et al., 2014).

Autism spectrum, anime, and facial emotion recognition

Atherton and Cross (2018) suggested that the reduced saliency of human faces for individuals with ASC or elevated autistic traits, combined with their propensity to develop strong circumscribed interests, results in their tendency to gravitate toward non-human social interests, such as cartoons. In line with this reasoning, Silva et al. (2015) found that children diagnosed with ASC showed a preference for viewing cartoon faces over human faces and were, in fact, avoidant of human faces. Jones et al. (2011) reported survey data indicating that adolescents with ASC spend a large amount of time viewing screen-based media. Mazurek et al. (2012) and Shane and Albert (2008) similarly noted that parents of children with ASC reported electronic screen engagement, and especially animated television shows and movies, as their child’s most common leisure activity. Finally, individuals with ASC or elevated autistic traits have specifically shown a strong preference for anime (Atherton et al., 2023), a style of Japanese animation that has rapidly grown in popularity since the introduction of Pokémon in the 1990s (Allison, 2006; Napier, 2006; Otmazgin, 2014). Anime is a specific type of cartoon with a distinct art style originating in Japan, featuring vibrant characters with exaggerated facial expressions. Cartoons include anime and a range of other art styles, but the term usually refers to Western animation.
One study analyzed the circumscribed interests that tend to characterize individuals on the autism spectrum, revealing that anime was the most frequently reported topic of interest (South et al., 2005). Another study examining the media use of adolescents with ASC found that websites focused on anime were the second most popular among all the websites that they visited, after websites discussing video games (Kuo et al., 2014).

The well-documented interest in viewing animated media and the specific preference for anime among individuals on the autism spectrum have been attributed to the use of distinct characters in these media, where facial expressions are exaggerated to the point of caricature (Atherton et al., 2023; Liu et al., 2019). When anime characters show surprise, excitement, or happiness, for example, their eyes widen to comically unrealistic proportions. Similarly, sadness in anime characters is often expressed through a waterfall of tears literally splashing down their faces. These exaggerated displays of emotional expression in the faces of anime characters might make them especially clear and salient even to individuals with ASC or elevated autistic traits. Consistent with this idea, Berthoz et al. (2013), Rump et al. (2009), and Song and Hakoda (2018) found that individuals with ASC were less impaired in their facial emotion recognition when presented with exaggerated, overt emotional expressions, but had more difficulty accurately interpreting subtle facial emotions. Rozema (2015) suggested that the visual tropes used in media like anime to communicate emotions provide clear and repetitive visual cues, allowing individuals on the autism spectrum to recognize and remember underlying patterns of emotional expressions more easily. As a result, individuals with ASC or elevated autistic traits may develop a particular affinity for anime relative to other interests.

While it has been suggested that individuals on the autism spectrum prefer anime because they can more easily understand the often-exaggerated facial emotions of the characters, no previous studies of facial emotion recognition have used anime characters to test this possibility. Furthermore, few previous studies have directly examined facial emotion recognition for animated or cartoon faces in individuals with ASC or elevated autistic traits. In one study, Rosset et al. (2008) found that children with ASC processed cartoon faces like typically developing children processed human faces, and their facial emotion recognition when viewing cartoon faces was not impaired to the same degree as it was when viewing human faces. Brosnan et al.’s (2015) study supported and expanded these findings, showing that adolescents with ASC not only performed better when viewing cartoon faces compared with human faces, but they also outperformed those without ASC in accurately recognizing emotions expressed in static cartoon faces. Similarly, Cross et al. (2019) found that adolescents with ASC showed improvement in facial emotion recognition when human faces were put through an animal filter and thus appeared like anthropomorphic lion or gorilla faces. In another study, Atherton and Cross (2019) showed that the use of animal versus human characters in social stories was related to improved recognition of social norm violations among those high in autistic traits.
Finally, two recent studies examined the extent to which ASC (Cross et al., 2022) or elevated autistic traits (Atherton & Cross, 2022) were related to facial emotion recognition accuracy using both a human and a cartoon version of a test for reading emotions in the eyes. On the human version, adults with ASC performed no differently from those without ASC, whereas adults with elevated autistic traits performed significantly worse than those low in autistic traits. On the cartoon version, however, the group with ASC performed better, and the group with elevated autistic traits performed no differently. These findings suggest that facial emotion recognition deficits in individuals with ASC or high in autistic traits are not global, but specific to the evaluation of human faces.

Autism spectrum, alexithymia, and facial emotion recognition

Alexithymia is a subclinical trait characterized by difficulties in identifying and describing feelings, associating bodily sensations with specific feeling states, and using words to express emotions (Berthoz et al., 2011; Nemiah et al., 1976). Like autistic traits, alexithymia appears to exist along a continuum, with higher levels reflecting more difficulty in the cognitive processing and regulation of emotions (Taylor et al., 1997). Studies have revealed a much higher rate of alexithymia among individuals on the autism spectrum relative to the general population, with estimates for autistic populations ranging from 50% to 85% (Berthoz & Hill, 2005; Berthoz et al., 2013; Hill et al., 2004). For comparison, the prevalence rate of alexithymia in the general population is estimated to be closer to 10% (Linden et al., 1995).

Some researchers have found evidence that the difficulties in emotion recognition observed among individuals on the autism spectrum might not be the result of social deficits characteristic of ASC or autistic traits, but rather the frequently co-existing condition of alexithymia (Bernhardt et al., 2014; Bird & Cook, 2013; Cook et al., 2013; Heaton et al., 2012; Milosavljevic et al., 2016; Oakley et al., 2016; Ola & Gullon-Scott, 2020; Santiesteban et al., 2021; Trevisan et al., 2016). Indeed, one study found that reduced eye gaze to facial emotional expressions (e.g., lower fixation count and duration) was associated more with alexithymia than with either ASC or autistic traits in a sample of adults with and without a formal diagnosis (Cuve et al., 2021). In another study examining the role of alexithymia and anxiety in the relationship between empathic ability and autistic traits, Brett and Maybery (2022) found that autistic traits predicted empathy through alexithymia and anxiety.

Based on their systematic review and meta-analysis of 15 studies, Kinnaird et al. (2019) concluded that it was common but not universal for individuals on the autism spectrum to experience co-occurring alexithymia, finding a mean prevalence rate of 50%. They further suggested that emotion recognition difficulties traditionally associated with ASC or autistic traits might be better explained by alexithymia, following previous research (e.g., Bird & Cook, 2013). Most recently, results from factor analysis of measures widely used to assess alexithymia and autistic traits indicated that these two constructs were distinct, because items from each measure loaded on separate factors (Cuve et al., 2022). Network analyses similarly produced separate clusters comprising only items representing either alexithymic or autistic traits.

Considering that alexithymia might characterize approximately half or more of the autistic population, other researchers have argued that alexithymia is a core feature or consequence of ASC or autistic traits rather than a distinct condition (Gaigg, 2012; Quattrocki & Friston, 2014). They have also pointed to the problems of relying on self-report measures to assess alexithymic and autistic traits, noting that self-report could bias the conclusions drawn from this body of research (Gaigg et al., 2018). The significant overlap between alexithymia and ASC or autistic traits has led to the theory that they share the same underlying developmental process leading to their observed social and emotional deficits. Several neuroimaging studies have contributed evidence to support this theory (Grynberg et al., 2012; Moriguchi et al., 2006; Silani et al., 2008), showing that there are some neural components potentially shared between individuals high in alexithymia and individuals on the autism spectrum, specifically those brain regions responsible for perspective taking and mentalizing (i.e., understanding the mental states of the self and others).

Discussing the complex relationship between alexithymia and ASC, Poquérusse et al. (2018) acknowledged that there is no clear answer as to whether they are co-occurring conditions or whether alexithymia is a common component of ASC. In the absence of consensus on whether ASC and alexithymia (or autistic and alexithymic traits more broadly) may be considered distinct but commonly co-occurring conditions relevant to facial emotion recognition deficits, our hypotheses and study design were guided by an understanding that it would be important to evaluate the role of both autistic and alexithymic traits in facial emotion recognition.

The present research

To address gaps in the previous literature, the present study used a computerized task to examine not only whether individuals higher in autistic traits are less accurate in recognizing the facial emotional expressions of human characters, consistent with past studies, but also whether they do not experience this same difficulty in recognizing emotions when presented with anime faces. In other words, we hypothesized that elevated autistic traits would be associated with lower emotion recognition scores when human faces are used as targets, but not when anime faces are used as targets. Anime faces were chosen because they are drawn with exaggerated facial expressions that may help to reduce some of the visual processing deficits associated with ASC and autistic traits (Atherton et al., 2023; Liu et al., 2019; Rozema, 2015), and because there is a demonstrated preference for anime among individuals in the autistic community (Kuo et al., 2014; South et al., 2005).

A measure of alexithymia was included to examine whether higher scores on this subclinical trait are also related to more difficulty recognizing human and anime facial emotions, and whether alexithymia can fully or partly explain the associations between autistic traits and facial emotion recognition. Although we hypothesized that elevated alexithymia would be associated with lower human facial emotion recognition scores, we did not have a strong prediction about the relationship between alexithymia and anime facial emotion recognition scores. However, we expected that when autistic traits and alexithymia were entered together, each controlling for the other, alexithymia would explain more of the variance in human and anime facial emotion recognition scores than autistic traits.

If individuals higher in autistic traits do not show the same difficulty recognizing the facial emotional expressions of anime characters as they do the expressions of human characters, these findings could inform the development of more effective clinical interventions aimed at improving facial emotion recognition and positive social behaviors among individuals on the autism spectrum. If alexithymia is found to be more relevant than autistic traits in explaining facial emotion recognition, interventions could more specifically target alexithymia.

Method

Participants

Participants were 247 adults (129 male, 115 female, 2 non-binary, and 1 other) ranging in age from 18 to 63 years old (M = 27.65 years, SD = 10.63). They included 125 who were recruited from the psychology participant pool at Pennsylvania State University Abington, and another 122 from the crowdsourcing website Amazon Mechanical Turk (https://www.mturk.com/mturk/welcome), which connects paid volunteers with researchers looking for participants to complete their studies. Participants engaged in this online study on their own computer and at their own convenience by first choosing from a list of available studies either through Amazon Mechanical Turk or the psychology participant pool, and then being directed to a website to complete the study. After completing the study, participants from the psychology participant pool received credits toward a research requirement in their psychology course, and those recruited from Amazon Mechanical Turk were compensated $1.00. A wage analysis of 4,500 studies hosted on Amazon Mechanical Turk between 2015 and 2019 estimated that participants were completing studies for an average of $6.00–$7.00 an hour (Moss et al., 2023). Because our study had a median completion time of about 10 minutes, payment of $1.00 closely approximated the average hourly earnings for participants on Amazon Mechanical Turk. Furthermore, we were limited by funding constraints. This study and its procedures (including compensation) were approved by Pennsylvania State University’s Institutional Review Board.

Participants from Amazon Mechanical Turk were significantly older (M = 35.45 years, SD = 8.90) than those from the psychology participant pool (M = 20.03 years, SD = 5.33), t(197) = 16.46, p < .0001. This age difference was expected, because we intentionally recruited participants from Amazon Mechanical Turk to obtain a more representative sample, knowing that the psychology participant pool is only open to undergraduate students at our campus. While the reliability and validity of data collected on Amazon Mechanical Turk have been supported by some researchers (Buhrmester et al., 2018; Paolacci & Chandler, 2014), others have recently raised concerns about data collection on this platform (Chmielewski & Kucker, 2020; Newman et al., 2021; Webb & Tangney, 2024).

Participants from either recruitment source were excluded from all data analyses if they did not reach the end of the study (n = 18) or completed it in less than five minutes (n = 8). We further excluded those participants who provided the same rating to all 10 items on the Short Autism Spectrum Quotient (n = 5) or to all 23 items on the Revised Toronto Alexithymia Scale (n = 8), because these response patterns suggest that they were not paying close attention to the questions (see next section for descriptions of these measures). These exclusion criteria were not mutually exclusive, and participants excluded for one reason often also qualified for exclusion under another. Interestingly, all participants excluded for not reaching the end of the study or for completing it in less than five minutes were from Amazon Mechanical Turk, while all but two participants excluded for providing the same rating to all items on at least one of the measures were from the psychology participant pool.
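The exclusion rules above can be expressed as a simple screening function. The sketch below is illustrative only, not the authors' code; the record fields (finished, minutes, aq10, tas_r) are hypothetical names for the study data:

```python
def keep_participant(record):
    """Return True if a participant record passes all exclusion checks.

    record: dict with hypothetical fields --
      "finished": bool, reached the end of the study
      "minutes": float, completion time in minutes
      "aq10": list of the 10 AQ-10 item responses
      "tas_r": list of the 23 TAS-R item responses
    """
    if not record["finished"]:          # did not reach the end of the study
        return False
    if record["minutes"] < 5:           # completed in under five minutes
        return False
    # Straight-lining: the same rating given to every item on a measure
    if len(set(record["aq10"])) == 1 or len(set(record["tas_r"])) == 1:
        return False
    return True
```

Because the criteria are checked independently, a participant can fail more than one, which is why the exclusion counts reported above are not mutually exclusive.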

Measures

Short Autism Spectrum Quotient

A short form of the Autism Spectrum Quotient (AQ-10; Allison et al., 2012) was previously developed and validated for measuring autistic traits with 10 items. Example items include statements intended to assess social deficits: “I find it difficult to work out people’s intentions,” or “I find it easy to work out what someone is thinking or feeling just by looking at their face.” Example items also include statements intended to assess unusually narrow, rigid, or atypical interests, among other autistic traits: “I like to collect information about categories of things (e.g. types of car, types of bird, types of train, types of plant)” and “I often notice small sounds when others do not.” For each of the 10 items, participants were asked whether they definitely agree, slightly agree, slightly disagree, or definitely disagree with the statement. On normally coded items, responses of definitely agree or slightly agree earned one point. On reverse coded items, responses of definitely disagree or slightly disagree earned one point. Thus, scores on this measure range from 0 to 10, with higher scores indicating higher levels of autistic traits. The internal consistency of the AQ-10 in this sample was low (α = 0.45) but consistent with previous findings (Jia et al., 2019; Sizoo et al., 2015; Taylor et al., 2020).
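The binary scoring rule described above can be sketched as follows. This is a minimal illustration rather than the authors' scoring code, and the positions of the reverse-coded items passed to the function are hypothetical (see Allison et al., 2012, for the actual key):

```python
AGREE = {"definitely agree", "slightly agree"}
DISAGREE = {"definitely disagree", "slightly disagree"}

def score_aq10(responses, reverse_items):
    """Sum AQ-10 points from 10 response strings.

    Normally coded items earn one point for an agree response;
    reverse-coded items earn one point for a disagree response.
    reverse_items: set of 0-based indices of reverse-coded items.
    """
    total = 0
    for i, response in enumerate(responses):
        if i in reverse_items:
            total += response in DISAGREE
        else:
            total += response in AGREE
    return total  # 0-10; higher scores indicate more autistic traits
```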

Revised Toronto Alexithymia Scale

The Revised Toronto Alexithymia Scale (TAS-R; Taylor et al., 1992) is a 23-item scale used to measure alexithymia, which comprises the inability (or limited ability) of a person to experience, identify, or describe emotions, or to relate them to the stimulus that caused them. Example items include statements such as: “I have feelings that I can’t quite identify,” or “I often don’t know why I am angry.” Participants were asked to indicate on a 5-point Likert scale the extent to which they agree or disagree with each item (1 = strongly disagree, 2 = moderately disagree, 3 = neither agree nor disagree, 4 = moderately agree, or 5 = strongly agree). Of the 23 items, six are reverse coded, such that responses of strongly agree or moderately agree are scored as 1 and 2, respectively, and responses of moderately disagree or strongly disagree are scored as 4 and 5, respectively. A total score on the TAS-R is obtained by taking the sum of responses to all 23 items, resulting in a range of 23 to 115, with higher scores indicating higher levels of alexithymia. Taylor et al. (1992) found that the mean score for individuals clinically judged to meet criteria for alexithymia was 66.4 (SD = 13.4), while the mean score for individuals who were not judged to meet criteria was 56.7 (SD = 12.2). The internal consistency of the TAS-R in this sample was high (α = 0.88).
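The TAS-R scoring just described (flip the six reverse-coded items so that 1 and 5, and 2 and 4, swap, then sum) can be sketched as below. Again this is an illustration, not the published scoring key; the positions of the reverse-coded items are hypothetical:

```python
def score_tas_r(responses, reverse_items):
    """Sum the 23 TAS-R Likert responses (each 1-5) into a total score.

    Reverse-coded items are flipped with 6 - r, so 5 -> 1, 4 -> 2,
    2 -> 4, and 1 -> 5. Totals range from 23 to 115; higher scores
    indicate higher levels of alexithymia.
    reverse_items: set of 0-based indices of the six reverse-coded items.
    """
    total = 0
    for i, r in enumerate(responses):
        total += (6 - r) if i in reverse_items else r
    return total
```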

Facial emotion recognition test

Participants completed a facial emotion recognition test on their computer. First, they were randomly presented with two human and two anime faces to become familiar with the task. This brief practice was then followed by a series of 12 human and 12 anime faces presented in random order. Each face expressed one of six different emotions: happiness, sadness, anger, fear, disgust, or surprise. These emotions were chosen based on previous research, which found that the facial expressions of these six emotions are universally recognizable across cultures (Ekman & Friesen, 1971; Izard et al., 1984; Izard, 1994; Matsumoto, 2001). During the test, each face was displayed for three seconds before disappearing, at which point the participant was asked to choose which of the six emotions was expressed by the face they were just shown. Participants earned one point for each facial expression they identified accurately, for a total of 12 possible points on the human section of the facial emotion recognition test, and 12 possible points on the anime section. The display time of three seconds was chosen after reviewing a previous study that analyzed multiple facial emotion recognition tests and found that two to three seconds was ideal for preventing ceiling effects, wherein neurotypical individuals can too easily receive perfect scores (Wilhelm et al., 2014).
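The scoring of the test (one point per correct identification, tallied separately for the human and anime sections) can be sketched as follows; the trial representation is a hypothetical simplification, not the task software itself:

```python
EMOTIONS = ("happiness", "sadness", "anger", "fear", "disgust", "surprise")

def score_fer(trials):
    """Tally facial emotion recognition scores by section.

    trials: list of (face_type, true_emotion, chosen_emotion) tuples,
    where face_type is "human" or "anime". Each section has 12 trials,
    so each score ranges from 0 to 12.
    """
    scores = {"human": 0, "anime": 0}
    for face_type, truth, choice in trials:
        if truth == choice:  # one point per accurately identified expression
            scores[face_type] += 1
    return scores
```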

The human faces in the facial emotion recognition test were sourced from a previously validated set of faces with differing expressions, the Warsaw Set of Emotional Facial Expression Pictures (Olszanowski et al., 2015). This collection includes front-facing images of 30 White European individuals with acting experience, expressing a variety of emotions. These images were rated by over 1,300 independent judges to determine how well the facial expressions matched the intended emotion. For the present study, we chose the six male and six female human faces that judges determined to most accurately depict each of the six universally recognizable emotions. Thus, the final stimulus set included 12 different human faces: one male and one female face expressing each of the six emotions. For the practice section of the facial emotion recognition test, we used a male human face expressing sadness and a female human face expressing happiness that were different from those included in the final stimulus set.

Because no previously validated set of anime facial expressions existed, images were collected by the first author using Google Images for the sole purpose of this study. Using the search terms anime, manga, facial expressions, emotion, and facial emotional expressions in combination with one another, a preliminary compilation of 55 images was created. Due to constraints in time and funding, we could not recruit a separate sample for a pilot study in which these images of anime faces could be systematically rated on how well they represented the emotions they were supposed to depict. Because we could not validate the anime stimuli with a pilot study, we reviewed and discussed the images among ourselves before choosing the 12 images that were determined to best express each of the six universally recognizable emotions. The final stimulus set included one male and one female anime face expressing happiness, one male and one female anime face expressing sadness, two male and one female anime face expressing anger, two female anime faces expressing fear, one male and one female anime face expressing disgust, and one male anime face expressing surprise. The same male anime character was used to express three of the six emotions (surprise, sadness, and happiness), while the other nine anime characters were unique. For the practice section of the facial emotion recognition test, we used a male anime face expressing happiness and the same male anime face expressing anger; this practice character was also different from those in the final stimulus set. Because anime faces are less familiar to the general population, we provide the complete set of images (both human and anime faces) used in the facial emotion recognition test at the following link: https://osf.io/qyzgd.

Potentially confounding variables

To address potential confounds, participants answered the following two additional questions about their frequency of social interaction and their frequency of anime or manga use, respectively: “Approximately how often do you have face-to-face social interactions with others?” and “Approximately how often do you watch anime or read manga?” Responses to both questions were on an 8-point scale (1 = never, 2 = less than once per year, 3 = once per year, 4 = once per month, 5 = once per week, 6 = 2–3 times per week, 7 = once per day, or 8 = every day, multiple times per day).

Results

Preliminary analyses

Table 1 presents the means, standard deviations, and correlations among the study variables, including age, frequency of social interaction, frequency of anime or manga use, AQ-10, TAS-R, human facial emotion recognition score, and anime facial emotion recognition score. Due to the non-normal distributions for most of the study variables, Spearman’s rank order correlation coefficients were used for tests of associations instead of Pearson’s correlation coefficients.

Table 1. Means, standard deviations, and correlations among study variables

Note. N = 247. All correlations used Spearman’s ρ. AQ-10 = Short Autism Spectrum Quotient (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., Reference Taylor, Bagby and Parker1992). *p < .05, **p < .005, ***p < .0001.

As shown in Table 1, participants reported high frequencies of social interaction (i.e., the relevant distribution was negatively skewed), but moderate and more varied frequencies of anime or manga use. Scores on the AQ-10 also varied, and their distribution was positively skewed. Interestingly, 40 participants (16% of the sample) scored 6 or higher on this scale, which is the cutoff that has been used to refer individuals for a clinical assessment of ASC (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012). Scores on the TAS-R showed considerable variation and a slight negative skew in their distribution. Finally, the distribution of scores on the human section of the facial emotion recognition test was negatively skewed, with most participants correctly identifying the emotions of more than 9 out of 12 faces, while scores on the anime section were more normally distributed.

There was a strong positive correlation between scores on the human section and scores on the anime section of the facial emotion recognition test, rs(245) = .47, p < .0001. Furthermore, the potentially confounding variables of age, frequency of social interaction, and frequency of anime or manga use were significantly correlated not only with TAS-R, but also with scores on the human and anime facial emotion recognition tests (see Table 1). The one exception was that frequency of anime or manga use was negatively, but not significantly, correlated with anime facial emotion recognition scores, rs(245) = −.07, p = .2967.
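As an illustration of this analytic choice, the computation of Spearman's rank-order correlation for skewed variables can be sketched as follows. The data below are simulated stand-ins (the variable names, distributions, and software are assumptions for illustration, not the study's own dataset or code):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulate a positively skewed predictor (like the AQ-10 distribution
# described in the text) and a negatively skewed accuracy score.
aq = rng.exponential(scale=2.0, size=247)
accuracy = 12 - rng.exponential(scale=1.5, size=247)

# Spearman's rank-order correlation operates on ranks, so it is less
# distorted by skewed distributions than Pearson's r would be.
rho, p = stats.spearmanr(aq, accuracy)
print(f"Spearman rho = {rho:.2f}, p = {p:.4f}")
```

Because the statistic depends only on ranks, any monotonic transformation of either variable leaves it unchanged, which is why it suits the non-normal distributions reported above.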

Correlations of AQ-10 and TAS-R with human and anime facial emotion recognition

Table 1 shows that, in accordance with our hypothesis, scores on the AQ-10 were negatively and significantly correlated with performance on the human facial emotion recognition test, rs(245) = −.20, p = .0015, but not with performance on the anime facial emotion recognition test, rs(245) = −.05, p = .3937. Thus, participants with more autistic traits tended to have more difficulty in accurately recognizing emotional expressions in human faces. When anime faces were presented, however, those higher in autistic traits did not show significantly more difficulty with emotion recognition than did those lower in autistic traits. Figure 1 depicts the correlations between scores on the AQ-10 and scores on the human and anime facial emotion recognition tests in the form of two scatterplots.

Figure 1. Correlations of autistic traits with performance on the human (left) and anime (right) facial emotion recognition tests. Note. N = 247. Points represent individual participants. Shaded regions represent 95% confidence intervals. AQ-10 = Short Autism Spectrum Quotient (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012).

Table 1 shows that scores on the TAS-R were negatively and significantly correlated with performance on both the human facial emotion recognition test, rs(245) = −.44, p < .0001, and the anime facial emotion recognition test, rs(245) = −.37, p < .0001. In contrast to the AQ-10, the TAS-R showed much stronger associations with both the human and anime versions of the facial emotion recognition test. Participants with more alexithymia tended to have more difficulty recognizing the emotions in faces regardless of whether they were human or anime, although the correlation was smaller for anime faces. Figure 2 depicts the correlations between scores on the TAS-R and scores on the human and anime facial emotion recognition tests in the form of two scatterplots.

Figure 2. Correlations of alexithymia with performance on the human (left) and anime (right) facial emotion recognition tests. Note. N = 247. Points represent individual participants. Shaded regions represent 95% confidence intervals. TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., Reference Taylor, Bagby and Parker1992).

Multiple regression analyses

Two hierarchical multiple regression analyses were conducted to determine the extent to which autistic traits and alexithymia were unique predictors of performance on both the human and anime facial emotion recognition tests. In the first model of the hierarchical multiple regression, we entered age, frequency of social interaction, and frequency of anime or manga use as predictor variables, because these variables were significantly associated with performance on either or both of the facial emotion recognition tests. In the second model of the hierarchical multiple regression, we entered the two competing variables of interest, AQ-10 and TAS-R, as predictor variables in addition to those included in the first model. The dependent variables for these two hierarchical multiple regression analyses were scores on the human and anime facial emotion recognition tests. Before conducting multiple regression analyses, the variance inflation factors of predictor variables were examined to ensure a lack of multicollinearity among them. Variance inflation factors did not exceed 1.30, which indicated that each predictor variable contained sufficiently unique information over and above that provided by the others.

Table 2 presents the results of the two hierarchical multiple regression analyses among participants. In the first model, age (β = −0.18, p = .0043), frequency of social interaction (β = 0.19, p = .0022), and frequency of anime or manga use (β = −0.30, p < .0001) were unique and significant predictors of human facial emotion recognition scores when controlling for one another. Thus, participants who were older and watched more anime or read more manga scored lower on the human facial emotion recognition test, while those who had more frequent social interactions scored higher. For anime facial emotion recognition scores, only age (β = −0.37, p < .0001) and frequency of social interaction (β = 0.15, p = .0160) were unique and significant predictors controlling for one another. Participants who were older scored lower on the anime facial emotion recognition test, while those who had more frequent social interactions scored higher. Frequency of anime or manga use (β = −0.05, p = .3627), however, was not significantly associated with performance on the anime facial emotion recognition test.

Table 2. Hierarchical multiple regression results for age, frequency of social interaction, frequency of anime or manga use, autistic traits, and alexithymia predicting human and anime facial emotion recognition scores

Note. N = 247. All values are partial regression coefficients (standardized beta weights) or standard errors (as indicated in parentheses) except for those under F, R², and R²adj, which are the F ratio, coefficient of determination, and adjusted coefficient of determination, respectively. AQ-10 = Short Autism Spectrum Quotient (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., Reference Taylor, Bagby and Parker1992). *p < .05, **p < .005, ***p < .0001.

In the second model, with the addition of AQ-10 and TAS-R as predictor variables, age (β = −0.15, p = .0140), frequency of social interaction (β = 0.14, p = .0198), frequency of anime or manga use (β = −0.22, p = .0002), and TAS-R (β = −0.28, p < .0001) were unique and significant predictors of human facial emotion recognition scores when controlling for one another. Thus, participants who were older, watched more anime or read more manga, and experienced more alexithymia scored lower on the human facial emotion recognition test, while those who had more frequent social interactions scored higher. For anime facial emotion recognition scores, only age (β = −0.33, p < .0001) and TAS-R (β = −0.27, p < .0001) were unique and significant predictors controlling for one another. Participants who were older and experienced more alexithymia scored lower on the anime facial emotion recognition test, and their frequency of social interaction (β = 0.11, p = .0786) no longer significantly predicted their performance. Regardless of whether the multiple regression model was predicting performance on the human or anime facial emotion recognition test, autistic traits (β = −0.04, p = .4523 and β = 0.01, p = .8356, respectively) were not significantly predictive after controlling for alexithymia and the other three variables, in contrast to the zero-order correlations.

The first model explained about 20% of the variance in both human [F(3, 243) = 19.92, p < .0001] and anime facial emotion recognition scores [F(3, 243) = 20.48, p < .0001] among participants. The second and final model improved slightly on the first model, explaining about 27% of the variance in human facial emotion recognition scores [F(5, 241) = 18.03, p < .0001] and about 26% of the variance in anime facial emotion recognition scores [F(5, 241) = 17.09, p < .0001] among participants. For human facial emotion recognition scores, TAS-R was the strongest contributor, followed by frequency of anime or manga use. For anime facial emotion recognition scores, age was the strongest contributor to the variance, followed by TAS-R.

Mediation analyses

The zero-order correlation between AQ-10 and human (but not anime) facial emotion recognition scores was significant, as was the zero-order correlation between AQ-10 and TAS-R. After controlling for TAS-R in the second model of the hierarchical multiple regression analyses, however, AQ-10 was no longer significantly associated with performance on the human facial emotion recognition test. These results suggest that TAS-R fully mediated the relationship between AQ-10 and human facial emotion recognition scores, based on the procedures from Baron and Kenny (Reference Baron and Kenny1986). To further investigate this possibility, mediation analyses were conducted to determine the extent to which alexithymia mediated the relationship between autistic traits and performance on both the human and anime facial emotion recognition tests. We tested for mediation of autistic traits and anime facial emotion recognition scores through alexithymia, despite finding no significant zero-order correlation between AQ-10 and anime emotion scores, because many researchers have suggested that mediation is still possible under such conditions (Collins et al., Reference Collins, Graham and Flaherty1998; Preacher & Hayes, Reference Preacher and Hayes2004; Rucker et al., Reference Rucker, Preacher, Tormala and Petty2011; Shrout & Bolger, Reference Shrout and Bolger2002; Zhao et al., Reference Zhao, Lynch and Chen2010).

Figure 3 presents a path diagram for the mediation analysis in which AQ-10 predicted human facial emotion recognition scores through TAS-R. The standardized regression coefficient between AQ-10 and TAS-R was statistically significant (β = 0.31, p < .0001), as was the standardized regression coefficient between TAS-R and human facial emotion recognition scores (β = −0.41, p < .0001). The standardized indirect effect of AQ-10 on human facial emotion recognition scores was therefore (0.31)(−0.41) = −0.13. We tested the significance of this indirect effect using bootstrapping procedures. Unstandardized indirect effects were computed for each of 1,000 bootstrapped samples, and the 95% confidence interval (CI) was computed from the indirect effects at the 2.5th and 97.5th percentiles. The bootstrapped unstandardized indirect effect of AQ-10 on human facial emotion recognition scores was −0.16, 95% CI = [−0.24, −0.09], p < .0001. These results show that despite the lack of a direct effect of AQ-10 on human facial emotion recognition scores (β = −0.02, p = .6913), there was a significant indirect effect of AQ-10 through TAS-R.

Figure 3. Path diagram for the mediation analysis in which autistic traits predicted human facial emotion recognition scores through alexithymia. Note. N = 247. Standardized regression coefficients are depicted for the relationship between autistic traits and performance on the human facial emotion recognition test as mediated by alexithymia. The standardized regression coefficient between autistic traits and human facial emotion recognition scores, controlling for alexithymia, is in parentheses. Brackets indicate 95% confidence intervals for each standardized regression coefficient. AQ-10 = Short Autism Spectrum Quotient (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., Reference Taylor, Bagby and Parker1992). *p < .0001.

Figure 4 presents a path diagram for the mediation analysis in which AQ-10 predicted anime facial emotion recognition scores through TAS-R. The standardized regression coefficient between AQ-10 and TAS-R was statistically significant (β = 0.31, p < .0001), as was the standardized regression coefficient between TAS-R and anime facial emotion recognition scores (β = −0.37, p < .0001). The standardized indirect effect of AQ-10 on anime facial emotion recognition scores was therefore (0.31)(−0.37) = −0.12. We tested the significance of this indirect effect using bootstrapping procedures. Similar to the previous mediation analysis, unstandardized indirect effects were computed for each of 1,000 bootstrapped samples, and the 95% CI was computed from the indirect effects at the 2.5th and 97.5th percentiles. The bootstrapped unstandardized indirect effect of AQ-10 on anime facial emotion recognition scores was −0.16, 95% CI = [−0.25, −0.08], p < .0001. These results show that despite the lack of a direct effect of AQ-10 on anime facial emotion recognition scores (β = 0.07, p = .2971), there was a significant indirect effect of AQ-10 through TAS-R.

Figure 4. Path diagram for the mediation analysis in which autistic traits predicted anime facial emotion recognition scores through alexithymia. Note. N = 247. Standardized regression coefficients are depicted for the relationship between autistic traits and performance on the anime facial emotion recognition test as mediated by alexithymia. The standardized regression coefficient between autistic traits and anime facial emotion recognition scores, controlling for alexithymia, is in parentheses. Brackets indicate 95% confidence intervals for each standardized regression coefficient. AQ-10 = Short Autism Spectrum Quotient (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., Reference Taylor, Bagby and Parker1992). *p < .0001.
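The percentile-bootstrap test of the indirect effect described above can be sketched as follows. The data are simulated stand-ins whose path values only loosely echo the reported coefficients; the helper function, seed, and numpy-based fitting are illustrative assumptions, not the study's analysis code:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 247

# Simulated stand-ins: x for AQ-10, m for TAS-R, y for recognition score.
x = rng.normal(size=n)
m = 0.31 * x + rng.normal(size=n)             # a path (X -> M)
y = -0.41 * m + 0.0 * x + rng.normal(size=n)  # b path (M -> Y), no direct effect

def indirect_effect(x, m, y):
    """Estimate the a*b indirect effect from two least-squares fits."""
    a = np.polyfit(x, m, 1)[0]                # slope of M on X
    X2 = np.column_stack([np.ones(len(x)), m, x])
    b = np.linalg.lstsq(X2, y, rcond=None)[0][1]  # slope of Y on M, given X
    return a * b

# Percentile bootstrap: resample cases with replacement, recompute a*b
# 1,000 times, and take the 2.5th and 97.5th percentiles as the 95% CI.
boot = []
for _ in range(1000):
    idx = rng.integers(0, n, size=n)
    boot.append(indirect_effect(x[idx], m[idx], y[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])
point = indirect_effect(x, m, y)
print(f"indirect effect = {point:.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")
```

The indirect effect is judged significant when the bootstrap CI excludes zero, which is how a mediated path can be detected even when the direct path from predictor to outcome is null.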

Comparing analyses between participants from the psychology participant pool and Amazon Mechanical Turk

Supplementary Tables 1–6 present the results of analyses from the previous sections, separated by whether participants were from the subsample recruited from the psychology participant pool (n = 125) or Amazon Mechanical Turk (n = 122). The same patterns emerged regardless of whether analyses used the full sample (N = 247) or either of the subsamples. Thus, our discussion of findings will only focus on those involving the full sample to avoid repetition.

Discussion

The purpose of this study was to examine whether individuals higher in autistic traits experience more difficulty with recognizing and interpreting the emotional expressions of human faces, while not showing such a deficit for anime faces. Based on previous research that compared the performance of individuals with ASC or elevated autistic traits on human and cartoon versions of facial emotion recognition tests (Atherton & Cross, Reference Atherton and Cross2022; Brosnan et al., Reference Brosnan, Johnson, Grawmeyer, Chapman and Benton2015; Cross et al., Reference Cross, Farha and Atherton2019, Reference Cross, Piovesan and Atherton2022; Rosset et al., Reference Rosset, Rondan, Da Fonseca, Santos, Assouline and Deruelle2008), we hypothesized that participants higher in autistic traits would perform worse at a facial emotion recognition test involving human stimuli but not one involving anime stimuli, due to their simplified and exaggerated expressions. We also hypothesized that participants higher in alexithymia would perform worse at the human facial emotion recognition test, and that alexithymia would be more strongly associated than autistic traits with scores on the human and anime facial emotion recognition tests, following previous studies (e.g., Bird & Cook, Reference Bird and Cook2013; Cook et al., Reference Cook, Brewer, Shah and Bird2013; Cuve et al., Reference Cuve, Castiello, Shiferaw, Ichijo, Catmur and Bird2021).

Our hypotheses were supported by the data. Participants higher in autistic traits performed significantly worse at the facial emotion recognition test featuring human stimuli. This finding supports both the validity of the human facial emotion recognition test and previous research demonstrating that individuals on the autism spectrum tend to have greater difficulty interpreting the meaning of human facial expressions (e.g., McKenzie et al., Reference McKenzie, Murray, Wilkinson, Murray, Metcalfe, O’Donnell and McCarty2018; Poljac et al., Reference Poljac, Poljac and Wagemans2013; Uljarevic & Hamilton, Reference Uljarevic and Hamilton2013).

In contrast to the significant negative correlation between AQ-10 scores and performance at recognizing human facial expressions, there was no significant correlation between AQ-10 scores and performance on the anime facial emotion recognition test. This finding suggests that individuals higher in autistic traits recognize the facial expressions of anime characters no better, but also no worse, than individuals lower in autistic traits. Past work has found similar evidence that the facial emotion recognition of individuals with ASC or elevated autistic traits is less impaired when viewing cartoon as opposed to human faces (Atherton & Cross, Reference Atherton and Cross2022; Brosnan et al., Reference Brosnan, Johnson, Grawmeyer, Chapman and Benton2015; Cross et al., Reference Cross, Farha and Atherton2019, Reference Cross, Piovesan and Atherton2022; Rosset et al., Reference Rosset, Rondan, Da Fonseca, Santos, Assouline and Deruelle2008). We extend this evidence for the first time to anime faces, using a newly developed facial emotion recognition test. The exaggerated facial expressions that are characteristic of anime characters may operate as a protective factor against the deficits in facial emotion recognition typically seen in individuals with ASC or elevated autistic traits (Atherton et al., Reference Atherton, Morimoto, Nakashima and Cross2023; Liu et al., Reference Liu, Chen and Chang2019; Rozema, Reference Rozema2015). For this reason, individuals on the autism spectrum may be especially drawn to anime (Atherton & Cross, Reference Atherton and Cross2018), and they have been shown to have an affinity for this type of media compared with other interests (Kuo et al., Reference Kuo, Orsmond, Coster and Cohn2014; South et al., Reference South, Ozonoff and McMahon2005). In our sample, more autistic traits were weakly correlated with more frequent anime or manga use, although this relationship was not significant.

Consistent with our hypotheses and previous research (e.g., Bird & Cook, Reference Bird and Cook2013), the multiple regression and mediation analyses revealed a potentially important caveat in the associations between autistic traits and facial emotion recognition: the overlapping relevance of alexithymia. In contrast to the zero-order correlation, autistic traits were no longer significantly predictive of facial emotion recognition in human faces after controlling for alexithymia and potential confounds like age, frequency of social interaction, and frequency of anime or manga use. Additionally, alexithymia was more strongly and negatively correlated with performance on the human and anime facial emotion recognition tests than autistic traits, both with respect to their zero-order correlations and the hierarchical multiple regression models in which they controlled for each other. Finally, alexithymia fully mediated the relationship between autistic traits and both human and anime facial emotion recognition scores. We found no direct effect of autistic traits on emotion recognition when viewing either human or anime faces, but there was a significant indirect effect of autistic traits on human and anime facial emotion recognition through alexithymia. These findings suggest that although difficulty recognizing facial emotional expressions, whether in human or anime faces, characterizes individuals high in autistic traits, the overlapping subclinical trait of alexithymia is one possible source of this difficulty, and that this difficulty is attenuated when non-human anime faces are viewed. Future research and clinical interventions might consider the role of alexithymia when it comes to helping autistic individuals with facial emotion recognition.

Issues in the measurement of autism spectrum and facial emotion recognition

While individuals recruited for the current study were not given a formal clinical evaluation or diagnosis, they completed the AQ-10 (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012), a short scale designed to measure autistic traits. Individuals with higher scores have more autistic traits, and those who score 6 or higher might be referred for an assessment to formally diagnose ASC but cannot be assumed to have such a diagnosis. Thus, a strength of our sample is that it consists of individuals from the community with varying degrees of autistic traits, allowing for more statistically powerful, robust, and generalizable tests of how individuals on the autism spectrum process facial emotional expressions. Indeed, our sample provided a good distribution of autistic traits as measured by the AQ-10, consistent with the idea that ASC are best conceptualized as the extreme end of a broader continuum of autistic traits (Allison et al., Reference Allison, Auyeung and Baron-Cohen2012; Baron-Cohen et al., Reference Baron-Cohen, Jolliffe, Mortimore and Robertson1997; Ruzich et al., Reference Ruzich, Allison, Smith, Watson, Auyeung, Ring and Baron-Cohen2016). Further supporting this idea, 40 (16%) of the 247 participants scored 6 or higher on the scale, suggesting that they might be referred for possible clinical diagnosis. Although recruiting individuals with a formal diagnosis of ASC may be applicable for some clinical or research purposes, it cannot account for variability on the autism spectrum or those individuals who may be on the autism spectrum but were never diagnosed. With the AQ-10, autistic individuals unaware of their condition would still be identified due to their heightened autistic traits.

The distributions of participants’ scores on both the human and anime facial emotion recognition tests suggest that they may be useful for future research. These two tests clearly distinguish those who can more easily recognize and identify facial emotional expressions from those who have more difficulty. Furthermore, their construct validity is evidenced by their strong and significant correlations with the measure of alexithymia and with each other. The mean score on the human version of the test was higher than the mean score on the anime version, which may reflect the fact that most individuals are less familiar with anime faces than with human faces. It may also reflect the use of human faces that were sourced from a collection of images that had been previously validated (Olszanowski et al., Reference Olszanowski, Pochwatko, Kuklinski, Scibor-Rylski, Lewinski and Ohme2015), whereas the anime faces used were found on Google Images by the first author. Future attempts to improve on the reliability and validity of these two facial emotion recognition tests would benefit from a more detailed item analysis of how participants performed with respect to the individual faces chosen (see, e.g., Passarelli et al., Reference Passarelli, Masini, Bracco, Petrosino and Chiorri2018), but that endeavor is beyond the scope of this study.

Limitations and future directions

Contrary to two previous studies that found individuals with ASC performed better than those without ASC at recognizing emotions expressed in cartoon faces (Brosnan et al., Reference Brosnan, Johnson, Grawmeyer, Chapman and Benton2015; Cross et al., Reference Cross, Piovesan and Atherton2022), individuals higher in autistic traits in the present study were no better or worse at recognizing the facial expressions of anime characters. A similar lack of difference in facial emotion recognition performance between autistic and neurotypical individuals when viewing cartoon faces was found in two other studies (Atherton & Cross, Reference Atherton and Cross2022; Rosset et al., Reference Rosset, Rondan, Da Fonseca, Santos, Assouline and Deruelle2008), one of which also recruited individuals varying in autistic traits rather than those diagnosed with ASC. Although we described it as a strength of our sample, it is also a limitation that we recruited participants from community populations rather than those who had been clinically diagnosed with ASC. A sample of individuals with ASC would have allowed more direct comparisons with much of the previous research on the autism spectrum and facial emotion recognition, which has focused on clinical populations. Had the study included participants clinically diagnosed with ASC instead of, or in addition to, participants with elevated autistic traits, our results might have been different. Importantly, findings from this sample of individuals varying in autistic traits cannot be generalized to individuals with ASC or to clinical populations, and we cannot make claims about individuals with ASC based on the present study.

Another major limitation that requires our cautious interpretation of findings is the use of the AQ-10 to measure autistic traits in this study. Because we used the short form instead of the full form of the Autism Spectrum Quotient (AQ; Baron-Cohen et al., Reference Baron-Cohen, Wheelwright, Skinner, Martin and Clubley2001), it is not possible to directly compare scores or results related to the AQ-10 from the present study to those using the full form AQ from some of the previous studies that helped provide the empirical basis for this work (Actis-Grosso et al., Reference Actis-Grosso, Bossi and Ricciardelli2015; Atherton & Cross, Reference Atherton and Cross2022; Poljac et al., Reference Poljac, Poljac and Wagemans2013). An additional concern with using the AQ-10 is that it showed low internal consistency as estimated with Cronbach’s alpha, both in our sample and in previous adult samples from the general population (Jia et al., Reference Jia, Steelman and Jia2019; Sizoo et al., Reference Sizoo, Horwitz, Teunisse, Kan, Vissers, Forceville, Van Voorst and Geurts2015; Taylor et al., Reference Taylor, Livingston, Clutterbuck and Shah2020). To be sure, the AQ-10 has been commonly used in research since it was developed, specifically as a more efficient and less time-consuming measure of autistic traits in the general population (e.g., Bertrams & Schlegel, Reference Bertrams and Schlegel2020; Forby et al., Reference Forby, Anderson, Cheng, Foulsham, Karstadt, Dawson, Pazhoohi and Kingstone2023; Gollwitzer et al., Reference Gollwitzer, Martel, McPartland and Bargh2019; Mason et al., Reference Mason, Ronald, Ambler, Caspi, Houts, Poulton, Ramrakha, Wertz, Moffitt and Happé2021; Pazhoohi et al., Reference Pazhoohi, Forby and Kingstone2021; Rudolph et al., Reference Rudolph, Lundin, Åhs, Dalman and Kosidou2018). 
We thus used the AQ-10 in this study because it had been used by other researchers and allowed us to substantially reduce the time it would take for participants to complete the study, an important consideration given the limited funding we had to compensate them. However, we must emphasize the limitations of the AQ-10 and strongly encourage caution when using it in future research, especially because it demonstrates low internal consistency. Results are potentially biased when a measure with low internal consistency like the AQ-10 is used to predict constructs such as facial emotion recognition and compared against a measure with much higher internal consistency like the TAS-R. We recommend that researchers use the full form AQ to avoid these issues with the short form AQ-10.

Similarly, the present study was limited by its use of the TAS-R to measure alexithymia. Although the TAS-R showed good reliability and validity in its initial development (Taylor et al., Reference Taylor, Bagby and Parker1992), it was subsequently revised into the 20-item Toronto Alexithymia Scale (TAS-20; Bagby et al., Reference Bagby, Parker and Taylor1994), which had improved psychometric properties. In contrast to the TAS-R, this newer scale has been much more commonly used to measure alexithymia in empirical research (Bagby et al., Reference Bagby, Parker and Taylor2020; Kooiman et al., Reference Kooiman, Spinhoven and Trijsburg2002), making it more difficult to draw comparisons between the results of this study and many previous studies of alexithymia measured with the TAS-20. Unlike the reasons that motivated our use of the AQ-10, our use of the TAS-R rather than the more commonly used and psychometrically improved TAS-20 was unintentional: when entering the alexithymia items into the survey software used to create this study, we accidentally referenced the Taylor et al. (Reference Taylor, Bagby and Parker1992) article instead of the correct Bagby et al. (Reference Bagby, Parker and Taylor1994) article. Future research should employ the full form AQ and the TAS-20, perhaps in addition to other measures of autistic traits and alexithymia, to test whether our findings can be replicated and whether they are robust.

A final limitation of the study concerns the facial emotion recognition tests. As previously mentioned, mean accuracy for emotion recognition of human faces was higher than mean accuracy for emotion recognition of anime faces, although some participants earned perfect scores in both conditions. Anime is not an especially common interest, and participants varied moderately in their frequency of anime or manga use. It is therefore possible that participants found the expressions of the anime stimuli more difficult to interpret on average, regardless of whether they were high in autistic traits or alexithymia, because of unfamiliarity with this style of media. However, we controlled for frequency of anime or manga use in a series of hierarchical multiple regression models and found no evidence that this frequency was relevant to the associations between autistic traits or alexithymia and anime facial emotion recognition. Alternatively, participants might have found the emotional expressions of anime faces more difficult on average because the anime stimuli, unlike the human stimuli, were not previously validated. Although we were unable to conduct a pilot study to ensure that our selected images accurately depicted the intended facial emotional expressions, we hope that these results encourage future researchers to employ a more rigorous selection process for anime faces used in facial emotion recognition tasks.
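The covariate control described above can be illustrated with a minimal hierarchical regression sketch in Python. The data below are simulated for illustration only (not the study's actual data), and all variable names and effect sizes are hypothetical; the point is simply how the incremental R² of a second step quantifies a covariate's contribution:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 247  # sample size matching the study

# Simulated illustrative data (NOT the study's actual data)
autistic_traits = rng.normal(size=n)
anime_use = rng.normal(size=n)
accuracy = -0.3 * autistic_traits + 0.1 * anime_use + rng.normal(size=n)

def r_squared(X, y):
    """R^2 from an OLS fit, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return 1.0 - resid.var() / y.var()

# Step 1: autistic traits alone; Step 2: add the covariate
r2_step1 = r_squared(autistic_traits[:, None], accuracy)
r2_step2 = r_squared(np.column_stack([autistic_traits, anime_use]), accuracy)
delta_r2 = r2_step2 - r2_step1  # variance uniquely explained by anime use
```

Because ordinary least squares can only improve in-sample fit when a predictor is added, `delta_r2` is non-negative; a negligible `delta_r2` (and a non-significant coefficient) is the pattern consistent with the null covariate result reported above.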

To improve future studies examining the facial emotion recognition of anime characters, it might be helpful to create a validated collection of anime faces with different emotional expressions, comparable to the Warsaw Set of Emotional Facial Expression Pictures (Olszanowski et al., 2015). This project might involve commissioning one or more sufficiently skilled artists to draw multiple original anime characters expressing a variety of facial emotions. Alternatively, one could generate stimuli of anime faces expressing different emotions through the responsible use of artificial intelligence. These images would then need to be examined by a panel of independent judges to determine their validity and effectiveness in conveying the different emotions. Further research could also investigate whether it would be preferable to have images drawn in a consistent visual style or in different styles that encompass the artistic variety found in anime.

The most recent data from the Centers for Disease Control and Prevention indicate that one in 54 children in the United States is diagnosed with ASC (Maenner et al., 2020). It is therefore important for research to examine what might affect or attenuate the social deficits experienced by those on the autism spectrum or with autistic traits, so that more effective interventions can be developed. The current findings inform future interventions in at least two ways. First, our data suggest that individuals on the autism spectrum might benefit from interventions that specifically target alexithymia, because this subclinical trait was implicated as a potential mechanism through which autistic individuals experience deficits in facial emotion recognition. Second, our data suggest that interventions directed at improving social–emotional functioning in autistic populations might consider using anime rather than human characters to improve accessibility and effectiveness.
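The mediation logic behind the first point (in the spirit of the percentile-bootstrap approach of Preacher & Hayes, 2004) can be sketched as follows. The data are simulated for illustration only, and all variable names and effect sizes are hypothetical; a 95% bootstrap confidence interval for the a×b indirect effect that excludes zero is the usual evidence of mediation:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 247  # sample size matching the study

# Simulated illustrative data (NOT the study's actual data)
autistic_traits = rng.normal(size=n)
alexithymia = 0.6 * autistic_traits + rng.normal(size=n)   # a path
recognition = -0.4 * alexithymia + rng.normal(size=n)      # b path; direct effect ~0

def ols_betas(X, y):
    """OLS coefficients, with an intercept column prepended."""
    X = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta

def indirect(x, m, y):
    """a*b indirect effect: a from m ~ x, b from y ~ x + m."""
    a = ols_betas(x[:, None], m)[1]
    b = ols_betas(np.column_stack([x, m]), y)[2]
    return a * b

# Percentile bootstrap of the indirect effect
boot = np.empty(2000)
for i in range(2000):
    idx = rng.integers(0, n, n)
    boot[i] = indirect(autistic_traits[idx], alexithymia[idx], recognition[idx])
ci_lo, ci_hi = np.percentile(boot, [2.5, 97.5])
# A 95% CI excluding zero indicates a reliable indirect (mediated) effect
```

In this simulation the interval falls entirely below zero, mirroring the pattern in which alexithymia carries the association between autistic traits and poorer emotion recognition.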

Supplementary material

The supplementary material for this article can be found at https://dx.doi.org/10.1017/S0954579425000100.

Acknowledgements

This research was supported by funding from the Abington College Undergraduate Research Activities (ACURA). We thank Meghan M. Gillen for reviewing an early version of the manuscript and providing helpful feedback. The complete set of images depicting human and anime faces used in the present study can be found at the following link: https://osf.io/qyzgd.

Funding statement

This research was supported by funding from the Abington College Undergraduate Research Activities (ACURA) program.

Competing interests

The authors declare none.

References

Actis-Grosso, R., Bossi, F., & Ricciardelli, P. (2015). Emotion recognition through static faces and moving bodies: A comparison between typically developed adults and individuals with high level of autistic traits. Frontiers in Psychology, 6, 1570. https://doi.org/10.3389/fpsyg.2015.01570
Allison, A. (2006). Millennial monsters: Japanese toys and the global imagination. University of California Press.
Allison, C., Auyeung, B., & Baron-Cohen, S. (2012). Toward brief "red flags" for autism screening: The short autism spectrum quotient and the short quantitative checklist in 1,000 cases and 3,000 controls. Journal of the American Academy of Child and Adolescent Psychiatry, 51(2), 202–212. https://doi.org/10.1016/j.jaac.2011.11.003
American Psychiatric Association. (2013). Diagnostic and statistical manual of mental disorders (5th ed.). American Psychiatric Association.
Ashwin, C., Chapman, E., Colle, L., & Baron-Cohen, S. (2006). Impaired recognition of negative basic emotions in autism: A test of the amygdala theory. Social Neuroscience, 1(3-4), 349–363. https://doi.org/10.1080/17470910601040772
Atherton, G., & Cross, L. (2018). Seeing more than human: Autism and anthropomorphic theory of mind. Frontiers in Psychology, 9, 528. https://doi.org/10.3389/fpsyg.2018.00528
Atherton, G., & Cross, L. (2019). Animal faux pas: Two legs good four legs bad for theory of mind, but not in the broad autism spectrum. Journal of Genetic Psychology, 180(2-3), 81–95. https://doi.org/10.1080/00221325.2019.1593100
Atherton, G., & Cross, L. (2022). Reading the mind in cartoon eyes: Comparing human versus cartoon emotion recognition in those with high and low levels of autistic traits. Psychological Reports, 125(3), 1380–1396. https://doi.org/10.1177/0033294120988135
Atherton, G., Morimoto, Y., Nakashima, S., & Cross, L. (2023). Does the study of culture enrich our understanding of autism? A cross-cultural exploration of life on the spectrum in Japan and the West. Journal of Cross-Cultural Psychology, 54(5), 610–634. https://doi.org/10.1177/00220221231169945
Bagby, R. M., Parker, J. D. A., & Taylor, G. J. (1994). The twenty-item Toronto Alexithymia Scale: I. Item selection and cross-validation of the factor structure. Journal of Psychosomatic Research, 38(1), 23–32. https://doi.org/10.1016/0022-3999(94)90005-1
Bagby, R. M., Parker, J. D. A., & Taylor, G. J. (2020). Twenty-five years with the 20-item Toronto Alexithymia Scale. Journal of Psychosomatic Research, 131, 109940. https://doi.org/10.1016/j.jpsychores.2020.109940
Baron, R. M., & Kenny, D. A. (1986). The moderator-mediator variable distinction in social psychological research: Conceptual, strategic, and statistical considerations. Journal of Personality and Social Psychology, 51(6), 1173–1182. https://doi.org/10.1037/0022-3514.51.6.1173
Baron-Cohen, S., Jolliffe, T., Mortimore, C., & Robertson, M. (1997). Another advanced test of theory of mind: Evidence from very high functioning adults with autism or Asperger syndrome. Journal of Child Psychology and Psychiatry, 38(7), 813–822. https://doi.org/10.1111/j.1469-7610.1997.tb01599.x
Baron-Cohen, S., Wheelwright, S., Skinner, R., Martin, J., & Clubley, E. (2001). The Autism-Spectrum Quotient (AQ): Evidence from Asperger syndrome/high-functioning autism, males and females, scientists and mathematicians. Journal of Autism and Developmental Disorders, 31(1), 5–17. https://doi.org/10.1023/A:1005653411471
Bernhardt, B. C., Valk, S. L., Silani, G., Bird, G., Frith, U., & Singer, T. (2014). Selective disruption of sociocognitive structural brain networks in autism and alexithymia. Cerebral Cortex, 24(12), 3258–3267. https://doi.org/10.1093/cercor/bht182
Berthoz, S., & Hill, E. L. (2005). The validity of using self-reports to assess emotion regulation abilities in adults with autism spectrum disorder. European Psychiatry, 20(3), 291–298. https://doi.org/10.1016/j.eurpsy.2004.06.01
Berthoz, S., Lalanne, C., Crane, L., & Hill, E. L. (2013). Investigating emotional impairments in adults with autism spectrum disorders and the broader autism phenotype. Psychiatry Research, 208(3), 257–264. https://doi.org/10.1016/j.psychres.2013.05.014
Berthoz, S., Pouga, L., & Wessa, M. (2011). Alexithymia from the social neuroscience perspective. In Decety, J., & Cacioppo, J. T. (Eds.), The Oxford handbook of social neuroscience (pp. 907–934). Oxford Library of Psychology. https://doi.org/10.1093/oxfordhb/9780195342161.013.0060
Bertrams, A., & Schlegel, K. (2020). Speeded reasoning moderates the inverse relationship between autistic traits and emotion recognition. Autism, 24(8), 2304–2309. https://doi.org/10.1177/1362361320937090
Bird, G., & Cook, R. (2013). Mixed emotions: The contribution of alexithymia to the emotional symptoms of autism. Translational Psychiatry, 3(7), e285. https://doi.org/10.1038/tp.2013.61
Black, M. H., Chen, N. T. M., Iyer, K. K., Lipp, O. V., Bölte, S., Falkmer, M., Tan, T., & Girdler, S. (2017). Mechanisms of facial emotion recognition in autism spectrum disorders: Insights from eye tracking and electroencephalography. Neuroscience and Biobehavioral Reviews, 80, 488–515.
Brett, J. D., & Maybery, M. T. (2022). Understanding oneself to understand others: The role of alexithymia and anxiety in the relationships between autistic trait dimensions and empathy. Journal of Autism and Developmental Disorders, 52(5), 1971–1983. https://doi.org/10.1007/s10803-021-05086-6
Brosnan, M., Johnson, H., Grawmeyer, B., Chapman, E., & Benton, L. (2015). Emotion recognition in animated compared to human stimuli in adolescents with autism spectrum disorder. Journal of Autism and Developmental Disorders, 45(6), 1785–1796. https://doi.org/10.1007/s10803-014-2338-9
Buhrmester, M. D., Talaifar, S., & Gosling, S. D. (2018). An evaluation of Amazon’s Mechanical Turk, its rapid rise, and its effective use. Perspectives on Psychological Science, 13(2), 149–154. https://doi.org/10.1177/1745691617706516
Celani, G., Battacchi, M. W., & Arcidiacono, L. (1999). The understanding of the emotional meaning of facial expressions in people with autism. Journal of Autism and Developmental Disorders, 29(1), 57–66. https://doi.org/10.1023/a:1025970600181
Chevallier, C., Grèzes, J., Molesworth, C., Berthoz, S., & Happé, F. (2012). Brief report: Selective social anhedonia in high functioning autism. Journal of Autism and Developmental Disorders, 42(7), 1504–1509. https://doi.org/10.1007/s10803-011-1364-0
Chmielewski, M., & Kucker, S. C. (2020). An MTurk crisis? Shifts in data quality and the impact on study results. Social Psychological and Personality Science, 11(4), 464–473. https://doi.org/10.1177/1948550619875149
Clark, T. F., Winkielman, P., & McIntosh, D. N. (2008). Autism and the extraction of emotion from briefly presented facial expressions: Stumbling at the first step of empathy. Emotion, 8(6), 803–809. https://doi.org/10.1037/a0014124
Collins, L. M., Graham, J. J., & Flaherty, B. P. (1998). An alternative framework for defining mediation. Multivariate Behavioral Research, 33(2), 295–312.
Cook, R., Brewer, R., Shah, P., & Bird, G. (2013). Alexithymia, not autism, predicts poor recognition of emotional facial expressions. Psychological Science, 24(5), 723–732. https://doi.org/10.1177/0956797612463582
Cross, L., Farha, M., & Atherton, G. (2019). The animal in me: Enhancing emotion recognition in adolescents with autism using animal filters. Journal of Autism and Developmental Disorders, 49(11), 4482–4487. https://doi.org/10.1007/s10803-019-04179-7
Cross, L., Piovesan, A., & Atherton, G. (2022). Autistic people outperform neurotypicals in a cartoon version of the Reading the Mind in the Eyes. Autism Research, 15(9), 1603–1608. https://doi.org/10.1002/aur.2782
Cuve, H. C., Castiello, S., Shiferaw, B., Ichijo, E., Catmur, C., & Bird, G. (2021). Alexithymia explains atypical spatiotemporal dynamics of eye gaze in autism. Cognition, 212, 104710. https://doi.org/10.1016/j.cognition.2021.104710
Cuve, H. C., Gao, Y., & Fuse, A. (2018). Is it avoidance or hypoarousal? A systematic review of emotion recognition, eye-tracking, and psychophysiological studies in young adults with autism spectrum conditions. Research in Autism Spectrum Disorders, 55, 1–13. https://doi.org/10.1016/j.rasd.2018.07.002
Cuve, H. C., Murphy, J., Hobson, H., Ichijo, E., Catmur, C., & Bird, G. (2022). Are autistic and alexithymic traits distinct? A factor-analytic and network approach. Journal of Autism and Developmental Disorders, 52(5), 2019–2034. https://doi.org/10.1007/s10803-021-05094-6
Dawson, G., Webb, S. J., & McPartland, J. (2005). Understanding the nature of face processing impairment in autism: Insights from behavioral and electrophysiological studies. Developmental Neuropsychology, 27(3), 403–424. https://doi.org/10.1207/s15326942dn2703_6
Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17(2), 124–129. https://doi.org/10.1037/h0030377
Forby, L., Anderson, N. C., Cheng, J. T., Foulsham, T., Karstadt, B., Dawson, J., Pazhoohi, F., & Kingstone, A. (2023). Reading the room: Autistic traits, gaze behaviour, and the ability to infer social relationships. PLoS ONE, 18(3), e0282310. https://doi.org/10.1371/journal.pone.0282310
Gaigg, S. B. (2012). The interplay between emotion and cognition in autism spectrum disorder: Implications for developmental theory. Frontiers in Integrative Neuroscience, 6, 113. https://doi.org/10.3389/fnint.2012.00113
Gaigg, S. B., Cornell, A. S., & Bird, G. (2018). The psychophysiological mechanisms of alexithymia in autism spectrum disorder. Autism, 22(2), 227–231. https://doi.org/10.1177/1362361316667062
Golan, O., Baron-Cohen, S., & Hill, J. (2006). The Cambridge Mindreading (CAM) Face-Voice Battery: Testing complex emotion recognition in adults with and without Asperger syndrome. Journal of Autism and Developmental Disorders, 36(2), 169–183. https://doi.org/10.1007/s10803-005-0057-y
Gollwitzer, A., Martel, C., McPartland, J. C., & Bargh, J. A. (2019). Autism spectrum traits predict higher social psychological skill. Proceedings of the National Academy of Sciences of the United States of America, 116(39), 19245–19247. https://doi.org/10.1073/pnas.1911460116
Grynberg, D., Chang, B., Corneille, O., Maurage, P., Vermeulen, N., Berthoz, S., & Luminet, O. (2012). Alexithymia and the processing of emotional facial expressions (EFEs): Systematic review, unanswered questions and further perspectives. PLoS ONE, 7(8), e42429. https://doi.org/10.1371/journal.pone.0042429
Guillon, Q., Rogé, B., Afzali, M. H., Baduel, S., Kruck, J., & Hadjikhani, N. (2016). Intact perception but abnormal orientation towards face-like objects in young children with ASD. Scientific Reports, 6, 22119. https://doi.org/10.1038/srep22119
Harms, M. B., Martin, A., & Wallace, G. L. (2010). Facial emotion recognition in autism spectrum disorders: A review of behavioral and neuroimaging studies. Neuropsychology Review, 20(3), 290–322. https://doi.org/10.1007/s11065-010-9138-6
Heaton, P., Reichenbacher, L., Sauter, D., Allen, R., Scott, S., & Hill, E. (2012). Measuring the effects of alexithymia on perception of emotional vocalizations in autistic spectrum disorder and typical development. Psychological Medicine, 42(11), 2453–2459. https://doi.org/10.1017/S0033291712000621
Hill, E., Berthoz, S., & Frith, U. (2004). Brief report: Cognitive processing of own emotions in individuals with autistic spectrum disorder and in their relatives. Journal of Autism and Developmental Disorders, 34(2), 229–235.
Hobson, R. P., Ouston, J., & Lee, A. (1989). Naming emotion in faces and voices: Abilities and disabilities in autism and mental retardation. British Journal of Developmental Psychology, 7, 237–250. https://doi.org/10.1111/j.2044-835X.1989.tb00803.x
Ingersoll, B. (2010). Broader autism phenotype and nonverbal sensitivity: Evidence for an association in the general population. Journal of Autism and Developmental Disorders, 40(5), 590–598. https://doi.org/10.1007/s10803-009-0907-0
Izard, C. E. (1994). Innate and universal facial expressions: Evidence from developmental and cross-cultural research. Psychological Bulletin, 115(2), 288–299. https://doi.org/10.1037/0033-2909.115.2.288
Izard, C. E., Kagan, J., & Zajonc, R. B. (1984). Emotions, cognition, and behavior. Cambridge University Press.
Jia, R., Steelman, Z. R., & Jia, H. H. (2019). Psychometric assessments of three self-report autism scales (AQ, RBQ-2A, and SQ) for general adult populations. Journal of Autism and Developmental Disorders, 49(5), 1949–1965. https://doi.org/10.1007/s10803-019-03880-x
Jones, C. R., Pickles, A., Falcaro, M., Marsden, A. J., Happé, F., Scott, S. K., Sauter, D., Tregay, J., Phillips, R. J., Baird, G., Simonoff, E., & Charman, T. (2011). A multimodal approach to emotion recognition ability in autism spectrum disorders. Journal of Child Psychology and Psychiatry, 52(3), 275–285. https://doi.org/10.1111/j.1469-7610.2010.02328.x
Kinnaird, E., Stewart, C., & Tchanturia, K. (2019). Investigating alexithymia in autism: A systematic review and meta-analysis. European Psychiatry, 55, 80–89. https://doi.org/10.1016/j.eurpsy.2018.09.004
Klin, A., Jones, W., Schultz, R., Volkmar, F., & Cohen, D. (2002). Visual fixation patterns during viewing of naturalistic social situations as predictors of social competence in individuals with autism. Archives of General Psychiatry, 59(9), 809–816. https://doi.org/10.1001/archpsyc.59.9.809
Kooiman, C. G., Spinhoven, P., & Trijsburg, R. W. (2002). The assessment of alexithymia: A critical review of the literature and a psychometric study of the Toronto Alexithymia Scale-20. Journal of Psychosomatic Research, 53(6), 1083–1090. https://doi.org/10.1016/S0022-3999(02)00348-3
Kuo, M. H., Orsmond, G. I., Coster, W. J., & Cohn, E. S. (2014). Media use among adolescents with autism spectrum disorder. Autism, 18(8), 914–923. https://doi.org/10.1177/1362361313497832
Linden, W., Wen, F., & Paulhus, D. L. (1995). Measuring alexithymia: Reliability, validity, and prevalence. In Butcher, J. N., & Spielberger, C. D. (Eds.), Advances in personality assessment (pp. 51–95). Lawrence Erlbaum Associates.
Liu, K., Chen, J. H., & Chang, K. M. (2019). A study of facial features of American and Japanese cartoon characters. Symmetry, 11(5), 664. https://doi.org/10.3390/sym11050664
Macdonald, H., Rutter, M., Howlin, P., Rios, P., Le Couteur, A., Evered, C., & Folstein, S. (1989). Recognition and expression of emotional cues by autistic and normal adults. Journal of Child Psychology and Psychiatry, 30(6), 865–877. https://doi.org/10.1111/j.1469-7610.1989.tb00288.x
Maenner, M. J., Shaw, K. A., Baio, J., Washington, A., Patrick, M., DiRienzo, M., Christensen, D. L., Wiggins, L. D., Pettygrove, S., Andrews, J. G., Lopez, M., Hudson, A., Baroud, T., Schwenk, Y., White, T., Rosenberg, C. R., Lee, L.-C., Harrington, R. A., Huston, M., & Dietz, P. M. (2020). Prevalence of autism spectrum disorder among children aged 8 years — Autism and developmental disabilities monitoring network, 11 sites, United States, 2016. MMWR Surveillance Summaries, 69(4), 1–12. https://doi.org/10.15585/mmwr.ss6904a1
Mason, D., Ronald, A., Ambler, A., Caspi, A., Houts, R., Poulton, R., Ramrakha, S., Wertz, J., Moffitt, T. E., & Happé, F. (2021). Autistic traits are associated with faster pace of aging: Evidence from the Dunedin study at age 45. Autism Research, 14(8), 1684–1694. https://doi.org/10.1002/aur.2534
Matsumoto, D. (2001). Culture and emotion. In Matsumoto, D. (Ed.), The handbook of culture and psychology (pp. 171–194). Oxford University Press.
Mazurek, M. O., Shattuck, P. T., Wagner, M., & Cooper, B. P. (2012). Prevalence and correlates of screen-based media use among youths with autism spectrum disorders. Journal of Autism and Developmental Disorders, 42(8), 1757–1767. https://doi.org/10.1007/s10803-011-1413-8
McKenzie, K., Murray, A. L., Wilkinson, A., Murray, G. C., Metcalfe, D., O’Donnell, M., & McCarty, K. (2018). The relations between processing style, autistic-like traits, and emotion recognition in individuals with and without autism spectrum disorder. Personality and Individual Differences, 120, 1–6. https://doi.org/10.1016/j.paid.2017.08.007
Milosavljevic, B., Carter Leno, V., Simonoff, E., Baird, G., Pickles, A., Jones, C. R., Erskine, C., Charman, T., & Happé, F. (2016). Alexithymia in adolescents with autism spectrum disorder: Its relationship to internalising difficulties, sensory modulation and social cognition. Journal of Autism and Developmental Disorders, 46(4), 1354–1367. https://doi.org/10.1007/s10803-015-2670-8
Miyahara, M., Bray, A., Tsujii, M., Fujita, C., & Sugiyama, T. (2007). Reaction time of facial affect recognition in Asperger’s disorder for cartoon and real, static and moving faces. Child Psychiatry and Human Development, 38(2), 121–134. https://doi.org/10.1007/s10578-007-0048-7
Moriguchi, Y., Ohnishi, T., Lane, R. D., Maeda, M., Mori, T., Nemoto, K., Matsuda, H., & Komaki, G. (2006). Impaired self-awareness and theory of mind: An fMRI study of mentalizing in alexithymia. NeuroImage, 32(3), 1472–1482. https://doi.org/10.1016/j.neuroimage.2006.04.186
Moss, A. J., Rosenzweig, C., Robinson, J., Jaffe, S. N., & Litman, L. (2023). Is it ethical to use Mechanical Turk for behavioral research? Relevant data from a representative survey of MTurk participants and wages. Behavior Research Methods, 55(8), 4048–4067. https://doi.org/10.3758/s13428-022-02005-0
Napier, S. (2006). The world of anime fandom in America. Mechademia, 1(1), 47–63. https://doi.org/10.1353/mec.0.0072
Nemiah, J. C., Freyberger, H., & Sifneos, P. E. (1976). Alexithymia: A view of the psychosomatic process. In Hill, O. W. (Ed.), Modern trends in psychosomatic medicine (Vol. 3, pp. 430–439). Butterworths.
Newman, A., Bavik, Y. L., Mount, M., & Shao, B. (2021). Data collection via online platforms: Challenges and recommendations for future research. Applied Psychology: An International Review, 70(3), 1380–1402. https://doi.org/10.1111/apps.12302
Oakley, B. F. M., Brewer, R., Bird, G., & Catmur, C. (2016). Theory of mind is not theory of emotion: A cautionary note on the Reading the Mind in the Eyes Test. Journal of Abnormal Psychology, 125(6), 818–823. https://doi.org/10.1037/abn0000182
Ola, L., & Gullon-Scott, F. (2020). Facial emotion recognition in autistic adult females correlates with alexithymia, not autism. Autism, 24(8), 2021–2034. https://doi.org/10.1177/1362361320932727
Olszanowski, M., Pochwatko, G., Kuklinski, K., Scibor-Rylski, M., Lewinski, P., & Ohme, R. K. (2015). Warsaw set of emotional facial expression pictures: A validation study of facial display photographs. Frontiers in Psychology, 5, 1516. https://doi.org/10.3389/fpsyg.2014.01516
Otmazgin, N. (2014). Anime in the US: The entrepreneurial dimensions of globalized culture. Pacific Affairs, 87(1), 53–69. https://doi.org/10.5509/201487153
Paolacci, G., & Chandler, J. (2014). Inside the Turk: Understanding Mechanical Turk as a participant pool. Current Directions in Psychological Science, 23(3), 184–188. https://doi.org/10.1177/0963721414531598
Passarelli, M., Masini, M., Bracco, F., Petrosino, M., & Chiorri, C. (2018). Development and validation of the Facial Expression Recognition Test (FERT). Psychological Assessment, 30(11), 1479–1490. https://doi.org/10.1037/pas0000595
Pazhoohi, F., Forby, L., & Kingstone, A. (2021). Facial masks affect emotion recognition in the general population and individuals with autistic traits. PLoS ONE, 16(9), e0257740. https://doi.org/10.1371/journal.pone.0257740
Pelphrey, K. A., Sasson, N. J., Reznick, J. S., Paul, G., Goldman, B. D., & Piven, J. (2002). Visual scanning of faces in autism. Journal of Autism and Developmental Disorders, 32(4), 249–261. https://doi.org/10.1023/a:1016374617369
Philip, R. C. M., Whalley, H. C., Stanfield, A. C., Sprengelmeyer, R., Santos, I. M., Young, A. W., Atkinson, A. P., Calder, A. J., Johnstone, E. C., Lawrie, S. M., & Hall, J. (2010). Deficits in facial, body movement and vocal emotional processing in autism spectrum disorders. Psychological Medicine, 40(11), 1919–1929. https://doi.org/10.1017/S0033291709992364
Piggot, J., Kwon, H., Mobbs, D., Blasey, C., Lotspeich, L., Menon, V., Bookheimer, S., & Reiss, A. L. (2004). Emotional attribution in high-functioning individuals with autistic spectrum disorder: A functional imaging study. Journal of the American Academy of Child and Adolescent Psychiatry, 43(4), 473–480. https://doi.org/10.1097/00004583-200404000-00014
Poljac, E., Poljac, E., & Wagemans, J. (2013). Reduced accuracy and sensitivity in the perception of emotional facial expressions in individuals with high autism spectrum traits. Autism, 17(6), 668–680. https://doi.org/10.1177/1362361312455703
Poquérusse, J., Pastore, L., Dellantonio, S., & Esposito, G. (2018). Alexithymia and autism spectrum disorder: A complex relationship. Frontiers in Psychology, 9, 1196. https://doi.org/10.3389/fpsyg.2018.01196
Preacher, K. J., & Hayes, A. F. (2004). SPSS and SAS procedures for estimating indirect effects in simple mediation models. Behavior Research Methods, Instruments & Computers, 36(4), 717–731. https://doi.org/10.3758/BF03206553
Quattrocki, E., & Friston, K. (2014). Autism, oxytocin and interoception. Neuroscience and Biobehavioral Reviews, 47, 410–430. https://doi.org/10.1016/j.neubiorev.2014.09.012
Rosset, D. B., Rondan, C., Da Fonseca, D., Santos, A., Assouline, B., & Deruelle, C. (2008). Typical emotion processing for cartoon but not for real faces in children with autistic spectrum disorders. Journal of Autism and Developmental Disorders, 38(5), 919–925. https://doi.org/10.1007/s10803-007-0465-2
Rozema, R. (2015). Manga and the autistic mind. English Journal, 105(1), 60–68.
Rucker, D. D., Preacher, K. J., Tormala, Z. L., & Petty, R. E. (2011). Mediation analysis in social psychology: Current practices and new recommendations. Social and Personality Psychology Compass, 5(6), 359–371. https://doi.org/10.1111/j.1751-9004.2011.00355.x
Rudolph, C. E. S., Lundin, A., Åhs, J. W., Dalman, C., & Kosidou, K. (2018). Brief report: Sexual orientation in individuals with autistic traits: Population based study of 47,000 adults in Stockholm county. Journal of Autism and Developmental Disorders, 48(2), 619–624. https://doi.org/10.1007/s10803-017-3369-9
Rump, K. M., Giovannelli, J. L., Minshew, N. J., & Strauss, M. S. (2009). The development of emotion recognition in individuals with autism. Child Development, 80(5), 1434–1447. https://doi.org/10.1111/j.1467-8624.2009.01343.x
Ruzich, E., Allison, C., Smith, P., Watson, P., Auyeung, B., Ring, H., & Baron-Cohen, S. (2016). Subgrouping siblings of people with autism: Identifying the broader autism phenotype. Autism Research, 9(6), 658–665. https://doi.org/10.1002/aur.1544
Santiesteban, I., Gibbard, C., Drucks, H., Clayton, N., Banissy, M. J., & Bird, G. (2021). Individuals with autism share others’ emotions: Evidence from the Continuous Affective Rating and Empathic Responses (CARER) task. Journal of Autism and Developmental Disorders, 51(2), 391–404. https://doi.org/10.1007/s10803-020-04535-y
Shane, H. C., & Albert, P. D. (2008). Electronic screen media for persons with autism spectrum disorders: Results of a survey. Journal of Autism and Developmental Disorders, 38(8), 1499–1508. https://doi.org/10.1007/s10803-007-0527-5
Shrout, P. E., & Bolger, N. (2002). Mediation in experimental and nonexperimental studies: New procedures and recommendations. Psychological Methods, 7(4), 422–445. https://doi.org/10.1037/1082-989X.7.4.422
Silani, G., Bird, G., Brindley, R., Singer, T., Frith, C., & Frith, U. (2008). Levels of emotional awareness and autism: An fMRI study. Social Neuroscience, 3(2), 97–112. https://doi.org/10.1080/17470910701577020
Silva, C., Da Fonseca, D., Esteves, F., & Deruelle, C. (2015). Motivational approach and avoidance in autism spectrum disorder: A comparison between real photographs and cartoons. Research in Autism Spectrum Disorders, 17, 13–24. https://doi.org/10.1016/j.rasd.2015.05.004
Sizoo, B. B., Horwitz, E. H., Teunisse, J. P., Kan, C. C., Vissers, C. T. W., Forceville, E. J. M., Van Voorst, A. J. P., & Geurts, H. M. (2015). Predictive validity of self-report questionnaires in the assessment of autism spectrum disorders in adults. Autism, 19(7), 842–849. https://doi.org/10.1177/1362361315589869
Song, Y., & Hakoda, Y. (2018). Selective impairment of basic emotion recognition in people with autism: Discrimination thresholds for recognition of facial expressions of varying intensities. Journal of Autism and Developmental Disorders, 48(6), 1886–1894. https://doi.org/10.1007/s10803-017-3428-2
South, M., Ozonoff, S., & McMahon, W. M. (2005). Repetitive behavior profiles in Asperger syndrome and high-functioning autism. Journal of Autism and Developmental Disorders, 35(2), 145–158. https://doi.org/10.1007/s10803-004-1992-8
Stuart, N., Whitehouse, A., Palermo, R., Bothe, E., & Badcock, N. (2023). Eye gaze in autism spectrum disorder: A review of neural evidence for the eye avoidance hypothesis. Journal of Autism and Developmental Disorders, 53(5), 1884–1905. https://doi.org/10.1007/s10803-022-05443-z
Sucksmith, E., Allison, C., Baron-Cohen, S., Chakrabarti, B., & Hoekstra, R. A. (2013). Empathy and emotion recognition in people with autism, first-degree relatives, and controls. Neuropsychologia, 51(1), 98–105. https://doi.org/10.1016/j.neuropsychologia.2012.11.013
Taylor, E. C., Livingston, L. A., Clutterbuck, R. A., & Shah, P. (2020). Psychometric concerns with the 10-item Autism-Spectrum Quotient (AQ10) as a measure of trait autism in the general population. Experimental Results, 1, e3. https://doi.org/10.1017/exp.2019.3
Taylor, G. J., Bagby, R. M., & Parker, J. D. (1992). The Revised Toronto Alexithymia Scale: Some reliability, validity, and normative data. Psychotherapy and Psychosomatics, 57(1-2), 34–41. https://doi.org/10.1159/000288571
Taylor, G. J., Bagby, R. M., & Parker, J. D. (1997). Disorders of affect regulation: Alexithymia in medical and psychiatric illness. Cambridge University Press. https://doi.org/10.1017/CBO9780511526831
Tottenham, N., Hertzig, M. E., Gillespie-Lynch, K., Gilhooly, T., Millner, A. J., & Casey, B. J. (2014). Elevated amygdala response to faces and gaze aversion in autism spectrum disorder. Social Cognitive and Affective Neuroscience, 9(1), 106–117. https://doi.org/10.1093/scan/nst050
Tracy, J. L., Robins, R. W., Schriber, R. A., & Solomon, M. (2011). Is emotion recognition impaired in individuals with autism spectrum disorders? Journal of Autism and Developmental Disorders, 41(1), 102–109. https://doi.org/10.1007/s10803-010-1030-y
Trevisan, D. A., Bowering, M., & Birmingham, E. (2016). Alexithymia, but not autism spectrum disorder, may be related to the production of emotional facial expressions. Molecular Autism, 7(1), 46. https://doi.org/10.1186/s13229-016-0108-6
Uljarevic, M., & Hamilton, A. (2013). Recognition of emotions in autism: A formal meta-analysis. Journal of Autism and Developmental Disorders, 43(7), 1517–1526. https://doi.org/10.1007/s10803-012-1695-5
Vettori, S., Dzhelyova, M., Van der Donck, S., Jacques, C., Van Wesemael, T., Steyaert, J., Rossion, B., & Boets, B. (2020). Combined frequency-tagging EEG and eye tracking reveal reduced social bias in boys with autism spectrum disorder. Cortex, 125, 135–148. https://doi.org/10.1016/j.cortex.2019.12.013
Wallace, S., Coleman, M., & Bailey, A. (2008). An investigation of basic facial expression recognition in autism spectrum disorders. Cognition and Emotion, 22(7), 13531380. https://doi.org/10.1080/02699930701782153 CrossRefGoogle Scholar
Webb, M. A., & Tangney, J. P. (2024). Too good to be true: Bots and bad data from Mechanical Turk. Perspectives on Psychological Science, 19(6), 887890. https://doi.org/10.1177/17456916221120027 CrossRefGoogle ScholarPubMed
Wilhelm, O., Hildebrandt, A., Manske, K., Schacht, A., & Sommer, W. (2014). Test battery for measuring the perception and recognition of facial expressions of emotion. Frontiers in Psychology, 5, 404. https://doi.org/10.3389/fpsyg.2014.00404
Yeung, M. K. (2022). A systematic review and meta-analysis of facial emotion recognition in autism spectrum disorder: The specificity of deficits and the role of task characteristics. Neuroscience and Biobehavioral Reviews, 133, 104518. https://doi.org/10.1016/j.neubiorev.2021.104518
Zhao, X., Lynch, J. G. Jr., & Chen, Q. (2010). Reconsidering Baron and Kenny: Myths and truths about mediation analysis. Journal of Consumer Research, 37(2), 197–206. https://doi.org/10.1086/651257
Table 1. Means, standard deviations, and correlations among study variables

Figure 1. Correlations of autistic traits with performance on the human (left) and anime (right) facial emotion recognition tests. Note. N = 247. Points represent individual participants. Shaded regions represent 95% confidence intervals. AQ-10 = Short Autism Spectrum Quotient (Allison et al., 2012).

Figure 2. Correlations of alexithymia with performance on the human (left) and anime (right) facial emotion recognition tests. Note. N = 247. Points represent individual participants. Shaded regions represent 95% confidence intervals. TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., 1992).

Table 2. Hierarchical multiple regression results for age, frequency of social interaction, frequency of anime or manga use, autistic traits, and alexithymia predicting human and anime facial emotion recognition scores

Figure 3. Path diagram for the mediation analysis in which autistic traits predicted human facial emotion recognition scores through alexithymia. Note. N = 247. Standardized regression coefficients are depicted for the relationship between autistic traits and performance on the human facial emotion recognition test as mediated by alexithymia. The standardized regression coefficient between autistic traits and human facial emotion recognition scores, controlling for alexithymia, is in parentheses. Brackets indicate 95% confidence intervals for each standardized regression coefficient. AQ-10 = Short Autism Spectrum Quotient (Allison et al., 2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., 1992). *p < .0001.

Figure 4. Path diagram for the mediation analysis in which autistic traits predicted anime facial emotion recognition scores through alexithymia. Note. N = 247. Standardized regression coefficients are depicted for the relationship between autistic traits and performance on the anime facial emotion recognition test as mediated by alexithymia. The standardized regression coefficient between autistic traits and anime facial emotion recognition scores, controlling for alexithymia, is in parentheses. Brackets indicate 95% confidence intervals for each standardized regression coefficient. AQ-10 = Short Autism Spectrum Quotient (Allison et al., 2012); TAS-R = Revised Toronto Alexithymia Scale (Taylor et al., 1992). *p < .0001.

Supplementary material: Standiford and Hsu supplementary material (File, 72.5 KB)