
Children's interactions with virtual assistants: Moving beyond depictions of social agents

Published online by Cambridge University Press:  05 April 2023

Lauren N. Girouard-Hallam
Affiliation:
Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY 40292, USA [email protected] [email protected] http://louisvillekidstudies.org
Judith H. Danovitch
Affiliation:
Department of Psychological and Brain Sciences, University of Louisville, Louisville, KY 40292, USA [email protected] [email protected] http://louisvillekidstudies.org

Abstract

Clark and Fischer argue that people see social robots as depictions of social agents. However, people's interactions with virtual assistants may change their beliefs about social robots. Children and adults with exposure to virtual assistants may view social robots not as depictions of social agents, but as social agents belonging to a unique ontological category.

Type
Open Peer Commentary
Copyright
Copyright © The Author(s), 2023. Published by Cambridge University Press

In their article, Clark and Fischer (C&F) state: “It is one thing to tacitly distinguish the three perspectives on a robot (a matter of cognition) and quite another to answer questions about them (a matter of meta-cognition)” (target article, sect. 4.5, para. 1). In supporting their theory that it may be difficult for some people to think through their own conceptualizations of social agents, C&F reference Kahn et al.'s (2012) study, in which children ages 9–15 were asked questions about a socially contingent robot called Robovie. They argue that the language used in the study may have obscured Robovie's status as a depiction of a social entity and may explain why children struggled to categorize Robovie. Although we agree that prompting children to think about the ontology of social robots poses challenges, we also believe that taking a developmental perspective on social robots may lead to a different interpretation altogether: This generation of children does not view social robots as representations of social beings, but rather, as Kahn et al. (2012) posited, as belonging to a new ontological category. Although C&F state that “It is an open question what children understand about social robots at each age” (target article, sect. 6.4, para. 2), we propose that recent research on children's understanding of virtual assistants provides valuable insight into how children construe social robots.

Nearly half of American parents of children under age 9 indicate that they have at least one virtual assistant in their home (Rideout & Robb, 2020), meaning that these devices are far more likely to be familiar to children than even the most popular social robots. Virtual assistants are interactive and conversational and behave in socially contingent ways. Recent research suggests that children as young as age 4 can effectively interact with virtual assistants (e.g., Lovato & Piper, 2015; Lovato, Piper, & Wartella, 2019; Oranç & Ruggeri, 2021; Xu & Warschauer, 2020) and, by age 7, children view them as reliable information sources (Girouard-Hallam & Danovitch, 2022). Moreover, children ascribe both artifact and non-artifact characteristics to these devices. Children ages 6–10 attribute mental characteristics like intelligence, social characteristics like the capacity for friendship, and some moral standing to a familiar virtual assistant (Girouard-Hallam, Streble, & Danovitch, 2021), but they also hold that virtual assistants cannot breathe and are not alive (Girouard-Hallam & Danovitch, 2022).

Thus, similar to the children in Kahn et al.'s (2012) Robovie study, children do not treat virtual assistants entirely like other humans nor like inanimate objects. Instead, children may view them as belonging to a new ontological category that occupies its own niche between person and artifact (e.g., Kahn, Gary, & Shen, 2013; Kahn & Shen, 2017; Severson & Carlson, 2010). In a study examining children's ontological beliefs about virtual assistants, Festerling and Siraj (2020) found that 6–10-year-old children had clear ontological beliefs about humans and artifacts, but believed that virtual assistants possessed human and artifact features simultaneously. Thus, children view virtual assistants as a unique entity rather than as a mechanical depiction of a non-unique entity, such as a person. Contrary to C&F's arguments that people view social robots as non-real facsimiles of real social agents by engaging with them and then appreciating their qualities (the dual-layer argument; target article, sect. 6.4, para. 2), and that children in particular treat robots “as interactive toys – as props in make-believe social play” (target article, sect. 6.4, para. 1), children appear to believe that virtual assistants are at once animate and inanimate, rather than separating these entities into a real structure and an imaginary depiction.

Children's para-social partnerships with virtual assistants further support the idea that children view virtual assistants as a new ontological category, occupying a unique space between artifact and person. Para-social relationships are emotionally tinged and one-sided, and they commonly occur between children and media characters, such as characters from popular television shows (Richards & Calvert, 2017). Parents report that their young children form para-social relationships with virtual assistants and that these relationships result from children's exposure to these socially contingent devices (Hoffman, Owen, & Calvert, 2021). Thus, it seems that the more time children spend with virtual assistants, which can respond and engage in conversation with them, the more likely they are to believe that virtual assistants are companions that care for them and that should be cared for in turn. Similarly, there is evidence that children treat virtual assistants as trusted social partners, and benefit from pedagogical exchanges with them similar to the ones they have with human partners (Xu et al., 2021). C&F use Fischer's (2016) hypothesis that some people are “players” and some are “non-players” to explain that “not everyone is willing to play along with a robot – or to do so all the time” (target article, sect. 7.2, para. 7). We propose that children who regularly interact with virtual assistants accrue a willingness to engage as “players” with these devices, which by extension changes the way that they view them and might change the way they view social robots as well.

In conclusion, as this generation of children grows up with virtual assistants and similar devices, and as virtual assistants play an increasing role in adults' day-to-day lives, it will be necessary to re-evaluate C&F's stance. Interactions with virtual assistants may reveal a more complex general relationship between humans and robots than C&F claim. It may be that rather than viewing social robots as depictions of social agents, children and adults who have experience with virtual assistants instead view them as semi-social agents. In other words, they may view social robots not as a composite of several parts, but rather as a unique assemblage of human and artifact characteristics. Additional empirical research that takes a developmental approach to examining the conversations and interactions people have with virtual assistants could aid in testing C&F's hypothesis that “people construe social robots not as agents per se, but as depictions of agents” (target article, sect. 1, para. 3). A developmental and ontological perspective on social robots may move the conversation beyond mere depiction to a deeper understanding of the role social robots play in our daily lives and how we view them in turn.

Financial support

This research received no specific grant from any funding agency, commercial, or not-for-profit sectors.

Competing interest

None.

References

Festerling, J., & Siraj, I. (2020). Alexa, what are you? Exploring primary school children's ontological perceptions of digital voice assistants in open interactions. Human Development, 64(1), 26–43. https://doi.org/10.1159/000508499
Fischer, K. (2016). Designing speech for a recipient: The roles of partner modeling, alignment and feedback in so-called ‘simplified registers' (Vol. 270). John Benjamins. https://doi.org/10.1075/pbns.270
Girouard-Hallam, L. N., & Danovitch, J. H. (2022). Children's trust in and learning from voice assistants. Developmental Psychology, 58(4), 646. https://doi.org/10.1037/dev0001318
Girouard-Hallam, L. N., Streble, H. M., & Danovitch, J. H. (2021). Children's mental, social, and moral attributions toward a familiar digital voice assistant. Human Behavior and Emerging Technologies, 3(5), 1118–1131. https://doi.org/10.1002/hbe2.321
Hoffman, A., Owen, D., & Calvert, S. L. (2021). Parent reports of children's parasocial relationships with conversational agents: Trusted voices in children's lives. Human Behavior and Emerging Technologies, 3(4), 1–12. https://doi.org/10.1002/hbe2.271
Kahn, P. H., Gary, H. E., & Shen, S. (2013). Children's social relationships with current and near-future robots. Child Development Perspectives, 7, 32–37. https://doi.org/10.1111/cdep.12011
Kahn, P. H., Kanda, T., Ishiguro, H., Freier, N. G., Severson, R. L., Gill, B. T., … Shen, S. (2012). “Robovie, you'll have to go inside the closet now”: Children's social and moral relationships with a humanoid robot. Developmental Psychology, 48, 303–314. https://doi.org/10.1037/a0027033
Kahn, P. H., & Shen, S. (2017). NOC NOC, who's there? A new ontological category (NOC) for social robots. In N. Budwig, E. Turiel, & P. D. Zelazo (Eds.), New perspectives on human development (pp. 106–122). Cambridge University Press. https://doi.org/10.1017/CBO9781316282755.008
Lovato, S., & Piper, A. M. (2015). “Siri, is this you?” Understanding young children's interactions with voice input systems. In Proceedings of the 14th ACM International Conference on Interaction Design and Children (IDC 2015), Medford, MA, USA (pp. 335–338). ACM.
Lovato, S. B., Piper, A. M., & Wartella, E. A. (2019). “Hey Google, do unicorns exist?”: Conversational agents as a path to answers to children's questions. In Proceedings of the 18th ACM International Conference on Interaction Design and Children (IDC 2019), Boise, ID, USA (pp. 301–313). ACM. https://doi.org/10.1145/3311927.3323150
Oranç, C., & Ruggeri, A. (2021). “Alexa, let me ask you something different”: Children's adaptive information search with voice assistants. Human Behavior and Emerging Technologies, 3(4), 1–11. https://doi.org/10.1002/hbe2.270
Richards, M. N., & Calvert, S. L. (2017). Media characters, parasocial relationships, and the social aspects of children's learning across media platforms. In R. Barr & D. Linebarger (Eds.), Media exposure during infancy and early childhood: The effects of content and context on learning and development (pp. 141–163). Springer International Publishing. https://doi.org/10.1007/978-3-319-45102-2_9
Rideout, V., & Robb, M. (2020). The common sense census: Media use by kids age zero to eight. Common Sense Media. Retrieved from https://www.commonsensemedia.org/sites/default/files/uploads/research/2020_zero_to_eight_census_final_web.pdf
Severson, R. L., & Carlson, S. M. (2010). Behaving as or behaving as if? Children's conceptions of personified robots and the emergence of a new ontological category. Neural Networks, 23(8–9), 1099–1103. https://doi.org/10.1016/j.neunet.2010.08.014
Xu, Y., Wang, D., Collins, P., Lee, H., & Warschauer, M. (2021). Same benefits, different communication patterns: Comparing children's reading with a conversational agent vs. a human partner. Computers & Education, 161, 104059. https://doi.org/10.1016/j.compedu.2020.104059
Xu, Y., & Warschauer, M. (2020). What are you talking to?: Understanding children's perceptions of conversational agents. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, Honolulu, HI, USA (pp. 1–13). Association for Computing Machinery. https://doi.org/10.1145/3313831.3376416