Children & Artificial Intelligence


Introduction & Methods

This literature review examines existing research on artificial intelligence (AI) in media and human relations. Although AI is used across many industries and applications, this review focuses on AI use cases within entertainment and learning applications accessible through mainstream media formats (television, computer, etc.). Peer-reviewed literature and books were identified through database searches (including PubMed, Google Scholar, and the Harvard University Hollis Library database), search-specific alerts for new literature, and literature cited in other relevant materials. This is not a systematic review of all available literature, but a focused review of work selected, at the discretion of the research team and the lead literature review author, as most applicable to the specific focus of this brief.

This review explores the inclusion of AI within media designed for children and how children are affected by this technology. First, it examines the development of parasocial relationships and how those relationships have expanded with the introduction of intelligent media characters and toys. It then turns to digital assistants, a specific physical application of artificial intelligence, and how they are used by children and their families. These devices are extremely common in United States households, and understanding how youth interact with them is critical for understanding a growing segment of human-computer interaction.

It is important to note that, while chatbot artificial intelligence agents like ChatGPT are quickly gaining traction and receiving wide news media coverage, research on how children and adolescents use these applications for entertainment is extremely limited and is therefore not covered in this literature review.


Parasocial Relationships – Young Children, Media, and Toy Characters

Parasocial relationships can be defined as one-way emotional attachments between a viewer or user and a media character or artificially intelligent technology (Horton & Wohl, 1956). While the technology landscape has changed in recent years to include artificially intelligent devices with which children and adolescents may form parasocial relationships, this relationship has already been studied extensively in children and adolescents with respect to television and video game characters.

By surveying parents about their young children’s (6 months to 8 years) favorite characters and the children’s feelings about those characters, researchers were able to identify three factors that mediate the development of a parasocial relationship between a young child and a non-human character: 1) attachment, 2) character personification, and 3) social realism (Bond & Calvert, 2014). Attachment is the sense of security and comfort children feel when presented with the character, much as a child would seek support from a trusted adult. Character personification is the attribution of human qualities to media characters, and social realism is a child’s perception that the media character is real and can exist in “real life”. Later research identified a fourth factor in parasocial relationship development: humanlike needs, or a child’s belief that their favorite character can, for example, get hungry or sleepy (Richards & Calvert, 2016).

More About the Methods: Calvert et al., 2020

The research team created a semi-intelligent character prototype that was used in the three studies discussed in the article. The character of Dora was not artificially intelligent; instead, the studies used a “Wizard of Oz” method in which a researcher provided the socially contingent responses to the participating child. In the studies, 4- to 6-year-old children interacted with Dora during a math game designed to teach the add-1 rule. For the study designs and analysis procedures, the article notes: “In Study 1, children’s learning was evaluated based on their parasocial relationships and parasocial interactions with the character, focusing on how quickly they answered add-1 problems correctly during the game. In Study 2, children’s parasocial relationships for learning were examined by comparing the intelligent character to an intelligent no character control version of the game, while keeping meaningful parasocial interaction prompts constant. In Study 3, children’s parasocial relationship were held constant using the same character in both conditions, but children’s parasocial interactions were manipulated using socially contingent or noncontingent character replies. Add-1 transfer problems were included to measure flexibility in moving from virtual to physical contexts in Studies 2 and 3. A robustness analysis was conducted on latency scores as a function of parasocial relationship and parasocial interaction scores by comparing the performance of children from Study 1 to a combined sample from Studies 2 and 3.”

The same survey research (Bond & Calvert, 2014) also produced a descriptive model of how young children’s parasocial relationships develop. These relationships are supported by parent/caregiver encouragement, playing with toys associated with the characters, parasocial interactions (a single, defined instance of media exposure to the character), and repeated exposure to media featuring the character(s).

Research evaluating educational applications of parasocial relationships has shown that a child’s emotional connection to and familiarity with a character can improve task learning (Howard Gola et al., 2013; Lauricella et al., 2011). As technological capabilities have increased, a new area of interest has emerged in this field of research: contingency and intelligent characters (Brunick et al., 2016).

Artificial intelligence and innovative programming have allowed researchers to explore whether the contingency, or reciprocation of speech, of a character with which a child has a parasocial interaction affects the child’s learning and attention. One research team created an intelligent prototype of the popular character Dora the Explorer. Three studies presented in the resulting publication found that social contingency, and the strength of a child’s feelings of attachment and friendship with the character, mediated performance on the learning task. The level of familiarity with Dora predicted faster response times to her questions across all three studies, and children produced more math talk when Dora was socially contingent rather than noncontingent. However, the authors noted an important caveat: children answered the math problems just as quickly whether an intelligent character or only a socially contingent voiceover was present, highlighting the important role of general social contingency for young children’s learning (Calvert et al., 2020). A limitation of this work is that children were tested on the learned mathematical skill immediately after completing the experimental procedure, so the durability of knowledge acquisition is unknown from these experiments.

Other recent research has found similar positive effects for learning with characters that are artificially intelligent. Converse to Learn, a project led by Ying Xu and Mark Warschauer, has pioneered work using what they refer to as “dialogic characters powered by artificial intelligence”. Familiar characters like Elinor from PBS programming have been modified to become intelligent characters that can interact with and respond to children (Xu et al., 2022a; Xu et al., 2022b; Xu et al., 2023). Strategies that support effective communication with children were embedded into the conversational Elinor character: the researchers ensured Elinor would ask questions, offer feedback on the topic, and provide scaffolding. In addition, Elinor was designed to address Next Generation Science Standards (NGSS) learning goals in her questioning (Xu et al., 2022a).

A usability study found that children responded to 92.8% of Elinor’s questions, and 87.8% of those responses were on-topic. Additionally, both parents and children expressed positive perceptions of the program (Xu et al., 2022a). Further investigations using the Elinor conversational episodes have shown that children who had contingent interaction with Elinor (that is, Elinor responded to the children in real time and with appropriate feedback to their answers) performed better on a science posttest than those who watched an episode of the show without any interaction from Elinor (Xu et al., 2022b) or with a pseudo-interactive Elinor, although the latter finding was only marginally significant and requires further study (Xu et al., 2023). Continuing research from Converse to Learn will seek to expand the conversational character’s comprehension abilities to understand users who may be less fluent in English and who may respond to Elinor’s questions in Spanish (Xu et al., 2023).

Technological advances have also allowed for the development of “smart toys” that incorporate AI and interact with their young users. While these include toys that are clearly robotic in appearance, other toys like stuffed animals and dolls may also be integrated with AI capabilities. Research has shown that children believe their robot playmates have mental states like feelings, could be social companions, and should be treated with respect and fairness (Francis & Mishra, 2009; Kahn et al., 2012; Westlund et al., 2018). Children may also alter their actions in order to maintain a positive reputation around social robots (Okumura et al., 2023). Early work with a smart toy doll that was not fully conversational showed that the doll could influence children’s responses to questions involving moral judgements, like whether it was okay to tease another child, but not their responses about disobeying instructions not to open a box before a certain amount of time had passed (Williams et al., 2018).

Additionally, a child’s age may influence their perceptions of a robot or smart toy. Younger children are more likely to let their play and experience with the toy shape their understanding of its anthropomorphic qualities, like intelligence and social capabilities, rather than holding preconceived ideas about those qualities based on its technological appearance or description (Druga et al., 2018; Kahn et al., 2012). Social robots and smart toys are growing in complexity and availability, so relationships between children and these entities should continue to be studied as the technology evolves (see Kahn et al., 2013).

A developing topic in artificial intelligence research is the idea of the virtual influencer. Research has shown that tweens and adolescents can have parasocial relationships with their favorite influencers on social media (Bond, 2016; Sedmak & Svetina, 2023; Tolbert & Drogos, 2019). A scientific understanding of how parasocial relationships may form with non-human influencers is lacking. Recent work has begun to explore this in young adults (ex. Rossi & Rivetti, 2023; Stein et al., 2022), but further research is needed on older children and adolescents, who are often targets of influencer marketing and are widespread users of social media.


Introduction to Interactions With Digital Assistants

Digital assistants, also commonly referred to as “smart speakers” or “virtual assistants”, are becoming increasingly prevalent in homes. Common Sense Media found that, in 2020, 41% of surveyed homes had a smart speaker, an increase of 32 percentage points from the previous survey in 2017. Additionally, children are using these devices quite frequently: 25% of children aged 0-8 often or sometimes interact with them, most commonly to play music or to “talk/fool around with” them (Rideout & Robb, 2020).

More About the Methods: Lovato et al., 2019

Google Home Mini devices were provided to families who did not previously have a smart speaker and who had a child aged 5-6 years old. The study included two home visits. During the first visit, after consent was obtained, the researcher collected demographic information and the child’s reading level and provided instructions on how to use the device. Families were told that the researchers would have access to a log of their interactions with the device. Use was monitored remotely by the research team during the 2 to 3 weeks of the study (average 14.28 days). At the end of the period, child and parent pairs were interviewed in their home about their experience with the device; children rated questions on a six-point smiley-o-meter scale, including whether or not they thought the device was friendly, smart, alive, trustworthy, etc. The data from the device were transcribed and analyzed to determine usage and the types of interactions with the device.

When young children (5-6 years old) talk to these devices, Lovato et al. (2019) found that they most frequently ask questions about science and technology, culture, or practical matters (like the current weather). These conversational moments are of particular interest for understanding the social dynamics between children and these devices. Research before the widespread adoption of these devices showed that children can often distinguish between living and non-living things, but can still ascribe psychological qualities to non-living objects like robots (Jipson & Gelman, 2007). This has implications for what children believe about digital assistants, if they do indeed perceive these devices in similar ways to robots and smart toys. Recent research has sought to expand this idea to understand the relational qualities of children and virtual assistants.

Lovato et al.’s (2019) study also asked children about their conceptions of the smart speaker they were using. While many children did not give the device a name and simply described it as a speaker, many did attribute human actions to it. When asked how the device knows the answers to their questions, most described it as “looking up” the answer on something like a smartphone or an online search engine. Eleven of the 18 children described the device as alive because it could talk or sounded like a person (Lovato et al., 2019).

In another study, with children aged 6-10 years, 93% of participants said that a digital assistant they were familiar with was smart, and 65% responded that the device could be a friend. However, 83% agreed that it is okay for someone to sell the digital assistant, adding to evidence from other research that children’s understanding of digital assistants is nuanced in relation to various attributions of sentience and autonomy (Girouard-Hallam et al., 2021; for further examples of nuanced interpretations see Lovato et al., 2019 and Xu & Warschauer, 2020).

As with robots and smart toys, there is evidence that age plays a role in a child’s understanding of digital assistants, with younger children ascribing more human-like qualities to the devices or more frequently believing the digital assistant to be human (ex. Andries & Robertson, 2023; Girouard-Hallam et al., 2021; Xu & Warschauer, 2020). In Girouard-Hallam et al. (2021), for example, exact age, but not experience with the internet, predicted whether children would want to spend time with the device when lonely, with younger children espousing that preference more frequently. Garg & Sengupta (2020) note that young children (5-7) were most likely to ascribe person-like qualities to the device and believe it had feelings, thoughts, and intentions, while older children (7-15) often tested the intelligence of these devices but believed them helpful for studying, managing schedules, and entertainment.

When exploring politeness toward digital assistants, preliminary research with adults on “wakewords” (the words that awaken the digital assistant) suggested that words and phrases that syntactically prime users to follow with a full sentence (ex. “Excuse me, Alexa”) may actually be more effective at increasing politeness than a phrase like “Alexa, please . . .” (Wen et al., 2023). Research with children is emerging and mixed; one study shows that many children believe it is not acceptable to be rude to the device (Andries & Robertson, 2023), but other research has shown that children may be less likely to try to repair miscommunication with a voice assistant than in a voice-only human interaction, and may share less information with voice assistants about a task (Gampe et al., 2023). Social emotional learning (SEL) skills are already embedded into some digital assistants; however, a preliminary analysis found these features lacking in the ability to effectively instill SEL skills due to design attributes including minimal or non-existent meaningful contingent responses (Fu et al., 2022). A recently published review acknowledges the concern that impolite exchanges between children and digital assistants may affect their communications with people, but notes that “no scientific evidence can be considered statistically significant to provide a definite answer to the debate since experiments have been conducted on a small number of participants” (Ribino, 2023). Further research is needed to understand this relationship.

There is also recent evidence that children may form parasocial relationships with digital assistants in similar ways to their favorite media characters or smart toys (Girouard-Hallam et al., 2021; Hoffman et al., 2021). While there is some evidence that parents may be concerned about this dynamic (ex. Garg & Sengupta, 2020), more research is needed to understand the relationships between children and digital assistants and parental views of those parasocial connections.


Parenting, Family Dynamics, and Digital Assistants

Just as children and adults may develop personal beliefs about and relationships with digital assistants, recent research is also seeking to understand how digital assistants may affect family dynamics. In addition, as this new technology enters the family sphere, parents’ opinions of digital assistants, and their hopes and fears about how these devices can support their children’s learning, are critical pieces of knowledge for evaluating the adoption and usage of these devices, especially as adoption grows.

Safety, both physical and digital, is a major consideration for parents even before a smart device like a digital assistant is added to the home; after it is integrated into their lives, parents continue to assess the developmental needs and usage patterns that affect how their children interact with home technology and, therefore, may affect safety (Sun et al., 2021). In a study where parents viewed interactions between two children, between a child and a robot, and between a child and a digital assistant, privacy concerns were significantly higher in the digital assistant condition (Szczuka et al., 2021). Older children may also share these security and privacy concerns (Garg & Sengupta, 2020).

When a family does choose to use a digital assistant, its presence can affect familial interactions in myriad ways, both positive and negative. Scaffolding, also called fostering communication, is very common in family interactions with digital assistants: parents and older children, for example, have been shown to support younger children in using digital assistants effectively, or to help the assistant understand something by repeating a family member’s request (Beirl et al., 2019; Beneteau et al., 2020). In terms of family cohesion, while the digital assistant’s presence resulted in moments of shared laughter, it also sometimes exacerbated family conflict, such as when family members with different tastes tried to choose music to play, or when one family member cut off whatever the digital assistant was doing by telling it to stop (Beirl et al., 2019; Beneteau et al., 2020). Parents may also use digital assistants to augment their parenting, incorporating them into family processes that mediate behavior or support autonomy (Beneteau et al., 2020).

More About the Methods: Beirl et al., 2019 

Six families recruited from the South East of the UK were given an Alexa product to use at home over 3 weeks. A researcher visited each family for approximately an hour at the beginning and end of the study to review their experience; these visits were recorded on camera. The Alexa was programmed with three skills to try over the first week, and families were then able to download additional skills for the following two weeks. All of the interactions with Alexa were recorded and transcribed for analysis.

Interviews with parents about their views of digital assistants supporting SEL skills found that while parents positively envisioned ways in which digital assistants could support family norms and encourage politeness, they also expressed concerns about the technology displacing human interactions, disrupting their parental relationship with their child, and teaching values misaligned with their own (Fu et al., 2022). Further research on SEL and digital assistants will be imperative to understand how families would prefer to use digital assistants in a learning context.


Takeaways and Further Questions

  • Children can form parasocial relationships (one-way emotional attachments) with media characters. Early research indicates that children may form similar relationships with AI-enabled digital assistants. Further research is needed to more deeply understand how these relationships can be harnessed for developmental, educational, and wellbeing benefits.
  • Children’s attribution of intelligence and autonomy to AI-enabled toys, robots, and digital assistants is nuanced. Children often attribute intelligence to these devices, but far less often attribute autonomy, such as the ability to vote or the right not to be owned. Further study is needed to better understand these attributions and their implications.
  • While there is some research on how children and adolescents interact with chatbots, the focus tends to be on educational and medical capacities (ex. Thompson & Baranowski, 2019). Given the rapid expansion of public availability of these tools, further research is needed to understand how children and adolescents are interacting with chatbots for entertainment, and the impacts of these interactions on young users’ wellbeing and development.
  • With the rapid adoption of digital assistants and smart speakers in households around the globe, it is imperative to understand how children comprehend and critically examine information offered by these devices.
  • With increasing age, children are less likely to seek out devices when they are lonely and less likely to ascribe humanistic qualities to AI-enabled characters and devices. Some research suggests a shift around seven years old, as children’s ability to distinguish between reality and fantasy increases at that developmental stage, but no clear cutoff has been determined at which we can be reasonably certain of a child’s ability to understand the limitations of an AI character (or the implications of that shift).
  • There is recent evidence that using digital assistants within a family context can both help and hinder familial interactions. While digital assistants may support communication between parents, siblings, and young children learning how to interact with the device, they may also exacerbate or create tension by the nature of communal access. More research is needed to understand the complex needs of families using digital assistants, as well as parental views about educational uses of these devices.

This research brief was written by Kaitlin Tiches, MLIS, Medical Librarian and Knowledge Manager at the Digital Wellness Lab. For more information, please email us.


References

Andries, V., & Robertson, J. (2023). ‘Alexa doesn’t have that many feelings’: Children’s understanding of AI through interactions with smart speakers in their homes. ArXiv. https://doi.org/10.48550/arXiv.2305.05597

Beirl, D., Rogers, Y., & Yuill, N. (2019). Using voice assistant skills in family life. In Proceedings of the 13th International Conference on Computer Supported Collaborative Learning – CSCL 2019 (pp. 96–103).

Beneteau, E., Boone, A., Wu, Y., Kientz, J. A., Yip, J., & Hiniker, A. (2020). Parenting with Alexa: Exploring the Introduction of Smart Speakers on Family Dynamics. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13). https://doi.org/10.1145/3313831.3376344

Bond, B. J. (2016). Following Your “Friend”: Social Media and the Strength of Adolescents’ Parasocial Relationships with Media Personae. Cyberpsychology, Behavior, and Social Networking, 19(11), 656-660. https://doi.org/10.1089/cyber.2016.0355

Bond, B. J., & Calvert, S. L. (2014). A Model and Measure of US Parents’ Perceptions of Young Children’s Parasocial Relationships. Journal of Children and Media, 8(3), 286-304. https://doi.org/10.1080/17482798.2014.890948

Brunick, K. L., Putnam, M. M., McGarry, L. E., Richards, M. N., & Calvert, S. L. (2016). Children’s future parasocial relationships with media characters: the age of intelligent characters. Journal of Children and Media, 10(2), 181-190. https://doi.org/10.1080/17482798.2015.1127839

Calvert, S. L., Putnam, M. M., Aguiar, N. R., Ryan, R. M., Wright, C. A., Liu, Y. H. A., & Barba, E. (2020). Young Children’s Mathematical Learning From Intelligent Characters. Child Dev, 91(5), 1491-1508. https://doi.org/10.1111/cdev.13341

Druga, S., Williams, R., Park, H., & Breazeal, C. (2018). How smart are the smart toys?: Children and parents’ agent interaction and intelligence attribution. Proceedings of the 17th ACM Conference on Interaction Design and Children, 231–240. https://doi.org/10.1145/3202185.3202741

Francis, A., & Mishra, P. (2009). Is AIBO Real? Understanding Children’s Beliefs about and Behavioral Interactions with Anthropomorphic Toys. Journal of interactive learning research, 20(4), 405-422.

Fu, Y., Michelson, R., Lin, Y., Nguyen, L.K., Tayebi, T.J., & Hiniker, A. (2022). Social Emotional Learning with Conversational Agents: Reviewing Current Designs and Probing Parents’ Ideas for Future Ones. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 6(2), 1-23. https://doi.org/10.1145/3534622

Gampe, A., Zahner-Ritter, K., Müller, J. J., & Schmid, S. R. (2023). How children speak with their voice assistant Sila depends on what they think about her. Computers in human behavior, 143, 107693. https://doi.org/10.1016/j.chb.2023.107693

Garg, R., & Sengupta, S. (2020). “He Is Just Like Me”: A Study of the Long-Term Use of Smart Speakers by Parents and Children. Proceedings of the ACM on Interactive, Mobile, Wearable and Ubiquitous Technologies, 4(1), 1-24. https://doi.org/10.1145/3381002

Girouard‐Hallam, L. N., Streble, H. M., & Danovitch, J. H. (2021). Children’s mental, social, and moral attributions toward a familiar digital voice assistant. Human Behavior and Emerging Technologies, 3(5), 1118-1131. https://doi.org/10.1002/hbe2.321

Hoffman, A., Owen, D., & Calvert, S. (2021). Parent reports of children’s parasocial relationships with conversational agents: Trusted voices in children’s lives. Human Behavior and Emerging Technologies, 3(4), 808-817. https://doi.org/10.1002/hbe2.271

Horton, D., & Wohl, R. R. (1956). Mass Communication and Para-Social Interaction. Psychiatry, 19(3), 215-229. https://doi.org/10.1080/00332747.1956.11023049

Howard Gola, A. A., Richards, M. N., Lauricella, A. R., & Calvert, S. L. (2013). Building Meaningful Parasocial Relationships Between Toddlers and Media Characters to Teach Early Mathematical Skills. Media Psychology, 16(4), 390-411. https://doi.org/10.1080/15213269.2013.783774

Kahn, P.H., Gary, H.E, & Shen, S. (2013). Children’s Social Relationships With Current and Near-Future Robots. Child Development Perspectives, 7(1), 32-37. https://doi.org/10.1111/cdep.12011

Kahn, P.H., Kanda, T., Ishiguro, H., Freier, N.G., Severson, R.L., Gill, B.T., Ruckert, J.H., & Shen, S. (2012). “Robovie, You’ll Have to Go into the Closet Now”: Children’s Social and Moral Relationships With a Humanoid Robot. Developmental psychology, 48(2), 303-314. https://doi.org/10.1037/a0027033

Lauricella, A. R., Gola, A. A. H., & Calvert, S. L. (2011). Toddlers’ Learning From Socially Meaningful Video Characters. Media Psychology, 14(2), 216-232. https://doi.org/10.1080/15213269.2011.573465

Lovato, S., Piper, A., & Wartella, E. (2019). Hey Google, Do Unicorns Exist? Proceedings of the 18th ACM International Conference on Interaction Design and Children, 301–313. https://doi.org/10.1145/3311927.3323150

Okumura, Y., Hattori, T., Fujita, S., & Kobayashi, T. (2023). A robot is watching me!: Five-year-old children care about their reputation after interaction with a social robot. Child Development, 94, 865– 873. https://doi.org/10.1111/cdev.13903

Ribino, P. (2023). The role of politeness in human–machine interactions: a systematic literature review and future perspectives. The Artificial Intelligence Review. https://doi.org/10.1007/s10462-023-10540-1

Richards, M. N., & Calvert, S. L. (2016). Parent versus child report of young children’s parasocial relationships in the United States. Journal of Children and Media, 10(4), 462-480. https://doi.org/10.1080/17482798.2016.1157502

Rideout, V., & Robb, M. B. (2020). The Common Sense census: Media use by kids age zero to eight, 2020. San Francisco, CA: Common Sense Media.

Rossi, C., & Rivetti, F. (2023). Virtual Influencer Marketing: Is It Effective in Engaging Younger Generations? European Conference on Social Media, 10(1), 231–240. https://doi.org/10.34190/ecsm.10.1.1061

Sedmak, A., & Svetina, M. (2023). Components of adolescents’ attraction with YouTubers. Current Psychology. https://doi.org/10.1007/s12144-023-04784-x

Stein, J.-P., Breves, P. L., & Anders, N. (2022). Parasocial interactions with real and virtual influencers: The role of perceived similarity and human-likeness. New Media & Society. https://doi.org/10.1177/14614448221102900

Sun, K., Zou, Y., Radesky, J., Brooks, C., & Schaub, F. (2021). Child Safety in the Smart Home: Parents’ Perceptions, Needs, and Mitigation Strategies. Proceedings of the ACM on Human-Computer Interaction, 5(CSCW2), 1-41. https://doi.org/10.1145/3479858

Szczuka, J.M., Guzelbey, H.S., & Kramer, N.C. (2021). Someone or Something to Play With? An empirical study on how parents evaluate the social appropriateness of interactions between children and differently embodied artificial interaction partners. In IVA ‘21: Proceedings of the 21st ACM International Conference on Intelligent Virtual Agents (pp. 191-194). https://doi.org/10.1145/3472306.3478349

Thompson, D., & Baranowski, T. (2019). Chatbots as extenders of pediatric obesity intervention: an invited commentary on “Feasibility of Pediatric Obesity & Pre-Diabetes Treatment Support through Tess, the AI Behavioral Coaching Chatbot”. Translational Behavioral Medicine, 9 (3), 448–450. https://doi.org/10.1093/tbm/ibz065

Tolbert, A. N., & Drogos, K. L. (2019). Tweens’ Wishful Identification and Parasocial Relationships With YouTubers. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.02781

Wen, R., Hanson, A., Han, Z., & Williams, T. (2023). Fresh Start: Encouraging Politeness in Wakeword-Driven Human-Robot Interaction. In Proceedings of the 2023 ACM/IEEE International Conference on Human-Robot Interaction (pp. 112-121). https://doi.org/10.1145/3568162.3576959

Westlund, J., Park, H., Williams, R., & Breazeal, C. (2018). Measuring young children’s long-term relationships with social robots. In Proceedings of the 17th ACM Conference on Interaction Design and Children (pp. 207–218). https://doi.org/10.1145/3202185.3202732

Williams, R., Machado, C., Druga, S., Breazeal, C., & Maes, P. (2018). “My doll says it’s ok”: a study of children’s conformity to a talking doll. In Proceedings of the 17th ACM Conference on Interaction Design and Children (pp. 625–631). https://doi.org/10.1145/3202185.3210788

Xu, Y., Levine, J., Vigil, V., Ritchie, D., Zhang, S., Thomas, T., Barrera, C., Meza, M., Bustamante, A. S., & Warschauer, M. (2023). Interaction With a Television Character Powered by Artificial Intelligence Promotes Children’s Science Learning. American Educational Research Association Annual Meeting 2023.

Xu, Y., Vigil, V., Bustamante, A. S., & Warschauer, M. (2022). Contingent interaction with a television character promotes children’s science learning and engagement. Journal of Applied Developmental Psychology, 81. https://doi.org/10.1016/j.appdev.2022.101439

Xu, Y., Vigil, V., Bustamante, A. S., & Warschauer, M. (2022). “Elinor’s Talking to Me!”: Integrating Conversational AI into Children’s Narrative Science Programming. In Proceedings of the 2022 CHI Conference on Human Factors in Computing Systems. https://doi.org/10.1145/3491102.3502050

Xu, Y., & Warschauer, M. (2020). “What Are You Talking To?”: Understanding Children’s Perceptions of Conversational Agents. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-13). https://doi.org/10.1145/3313831.3376416