Introduction & Methods
This literature review examines the existing research on artificial intelligence (AI) in media and human relations. Although AI is applied across many contexts and industries, the focus of this literature review is on AI use cases within entertainment and learning applications accessible through mainstream media formats (television, computer, etc.). Peer-reviewed literature and books were evaluated for this review; they were found through database searches (including PubMed, Google Scholar, and the Harvard University Hollis Library database), search-specific alerts to new literature, and literature cited in other relevant materials. This is not a systematic review that evaluates all available literature, but a focused review containing work that the research team and the lead literature review author judged most applicable to the specific focus of this brief review.
This review explores the inclusion of AI within media designed for children and how children are affected by this technology. First, this review will explore the development of parasocial relationships and how those relationships have expanded with the introduction of intelligent media characters and toys. Then this work will explore digital assistants, a specific physical application of artificial intelligence, and how they are used by children and their families. These devices are extremely common in United States households, and understanding how youth interact with them is of critical importance for understanding a growing segment of human-computer interaction.
It is important to note that, while chatbot artificial intelligence agents like ChatGPT are quickly gaining traction and are becoming widely covered by news media outlets, research regarding how children and adolescents use these applications for entertainment is extremely limited, and therefore not covered in this literature review.
Parasocial Relationships – Young Children, Media, and Toy Characters
Parasocial relationships can be defined as one-way emotional attachments between a viewer or user and a media character or artificially intelligent technology (Horton & Wohl, 1956). While the technology landscape has changed in recent years to include artificially intelligent devices with which children and adolescents may form parasocial relationships, this relationship has already been studied extensively in children and adolescents with respect to television and video game characters.
By surveying parents about their young children’s (6 months to 8 years) favorite characters and the children’s feelings about those characters, researchers were able to elucidate three factors that mediate the development of a parasocial relationship between a young child and a non-human character: 1) attachment; 2) character personification; and 3) social realism (Bond & Calvert, 2014). Attachment is the sense of security and comfort children feel when presented with the character, much like the support a child would seek from a trusted adult. Character personification is the attribution of human qualities to media characters, and social realism is a child’s perception that the media character is real and can exist in “real life”. Later research identified a fourth factor for parasocial relationship development: humanlike needs, or a child’s belief that their favorite character can, for example, get hungry or sleepy (Richards & Calvert, 2016).
More About the Methods: Calvert et al., 2020
The research team created a semi-intelligent character prototype that was used in three studies discussed in the article. The character of Dora was not artificially intelligent; instead, the studies used a “Wizard of Oz” method, in which a researcher provided the socially contingent responses to the participating child. In the studies, 4- to 6-year-old children interacted with Dora during a math game that sought to provide educational content about the +1 rule. For the studies and analysis procedure, the article notes: “In Study 1, children’s learning was evaluated based on their parasocial relationships and parasocial interactions with the character, focusing on how quickly they answered add-1 problems correctly during the game. In Study 2, children’s parasocial relationships for learning were examined by comparing the intelligent character to an intelligent no character control version of the game, while keeping meaningful parasocial interaction prompts constant. In Study 3, children’s parasocial relationship were held constant using the same character in both conditions, but children’s parasocial interactions were manipulated using socially contingent or noncontingent character replies. Add-1 transfer problems were included to measure flexibility in moving from virtual to physical contexts in Studies 2 and 3. A robustness analysis was conducted on latency scores as a function of parasocial relationship and parasocial interaction scores by comparing the performance of children from Study 1 to a combined sample from Studies 2 and 3.”
The Bond and Calvert (2014) research also resulted in a descriptive model of how young children’s parasocial relationships develop. These relationships are supported by parent/caregiver encouragement; playing with toys associated with the characters; parasocial interactions (a single defined instance of media exposure to the character); and repeated exposure to the media featuring the character(s) (Bond & Calvert, 2014).
Research evaluating educational applications of parasocial relationships has shown that a child’s emotional connection to and familiarity with a character can improve task learning (Howard Gola et al., 2013; Lauricella et al., 2011). As technological capabilities have increased, a new interest in this field of research has emerged: contingency and intelligent characters (Brunick et al., 2015).
Artificial intelligence and innovative programming have allowed researchers to explore whether the contingency, or reciprocation of speech, of a character with which a child has a parasocial interaction affects the child’s learning and attention. One such project created an intelligent prototype of the popular character Dora the Explorer. Three studies presented in the publication found that social contingency, and the strength of a child’s feelings of attachment and friendship with the character, mediated performance on the learning task. The level of familiarity with Dora predicted faster response times to her questions across all three studies, and children produced more math talk when Dora was socially contingent rather than noncontingent. However, the authors noted an important caveat of the findings: children answered the math problems just as quickly when the game featured only a socially contingent voiceover rather than the intelligent character, highlighting the important role of general social contingency for young children’s learning (Calvert et al., 2020). A limitation of this work is that children were tested on the learned mathematical skill immediately after completing the experimental procedure, so the durability of knowledge acquisition is unknown from these experiments.
Other recent research has found similar positive effects for learning with characters who are artificially intelligent. Converse to Learn, a project led by Ying Xu and Mark Warschauer, has pioneered work using what they refer to as “dialogic characters powered by artificial intelligence”. Familiar characters like Elinor from PBS’s programming have been modified to become intelligent characters that can interact with and respond to children (Xu et al., 2022a; Xu et al., 2022b; Xu et al., 2023). Communication strategies that support effective communication with children were embedded into the conversational Elinor character. The researchers ensured Elinor would ask questions, offer feedback on the topic, and provide scaffolding. In addition, Elinor was designed to address the Next Generation Science Standards (NGSS) science learning goals in her questioning (Xu et al., 2022a).
A usability study found that children responded to 92.8% of Elinor’s questions, and 87.8% of those responses were on-topic. Additionally, both parents and children expressed positive perceptions of the program (Xu et al., 2022a). Further investigations using the Elinor conversational episodes have shown that children who had contingent interaction with Elinor (that is, Elinor responded to the children in real time and with appropriate feedback to their response to her question) performed better on a science posttest than those who watched an episode of the show without any interaction from Elinor (Xu et al., 2022b) or with a pseudo-interactive Elinor, although the latter finding was only marginally significant, requiring further study (Xu et al., 2023). Continuing research from Converse to Learn will seek to expand the conversational character’s comprehension abilities so that it can understand users who may be less fluent in English and who may respond to Elinor’s questions in Spanish (Xu et al., 2023).
Technological advances have also allowed for the development of “smart toys” which incorporate AI and interact with their young users. While this may include toys which are clearly robotic in appearance, other toys like stuffed animals and dolls may also be integrated with AI capabilities. Research has shown that children believe that their robot playmates have mental states like feelings, can be social companions, and should be treated with respect and fairness (Francis & Mishra, 2009; Kahn et al., 2012; Westlund et al., 2018). Children may also alter their actions in order to maintain a positive reputation around social robots (Okumura et al., 2023). Although not fully conversational, early work with a smart toy doll showed that the doll could influence children’s responses to questions regarding moral judgements like whether it was okay to tease another child, but not regarding whether to disobey instructions to not open a box before a certain amount of time passed (Williams et al., 2018).
Additionally, a child’s age may influence their perceptions about a robot or smart toy. Younger children are more likely to let their play and experience with the toy shape their understanding of its anthropomorphic qualities, like intelligence and social capabilities, rather than holding preconceived ideas about those qualities based on its technological appearance or description (Druga et al., 2018; Kahn et al., 2012). Social robots and smart toys are growing in complexity and availability, so relationships between children and these entities should continue to be studied as the technology evolves (see Kahn et al., 2013).
A developing topic in artificial intelligence research is the idea of a virtual influencer. Research has shown that tweens and adolescents can have parasocial relationships with their favorite influencers on social media (Bond, 2016; Sedmak & Svetina 2023; Tolbert & Drogos, 2019). A scientific understanding of how parasocial relationships may form with these non-human influencers is lacking, with recent work beginning to explore this in young adults (ex. Rossi & Rivetti, 2023; Stein et al., 2022), but further research is needed to understand more about older children and adolescents, who are often targets of influencer marketing and are widespread users of social media.
Introduction to Interactions With Digital Assistants
Digital assistants, also commonly referred to as “smart speakers” or “virtual assistants,” are becoming increasingly prevalent in homes. Common Sense Media found that, in 2020, 41% of surveyed homes had a smart speaker, an increase of 32 percentage points since the previous survey in 2017. Additionally, children use these devices quite frequently: 25% of children aged 0-8 often or sometimes interact with them, most commonly to play music or “talk/fool around with” them (Rideout & Robb, 2021).
More About the Methods: Lovato et al., 2019
Google Home Mini devices were provided to families (who did not previously have a smart speaker) with children aged 5-6 years. The study included two home visits. During the first visit, after consent was obtained, the researcher collected demographic information and the child’s reading level and provided information about how to use the device. Families were told that the researchers would have access to a log of their interactions with the device. Use was monitored remotely by the research team during the 2 to 3 weeks of the study (average 14.28 days). At the end of the period, child and parent pairs were interviewed in their home about their experience with the device and asked questions that the child rated on a six-point smiley-o-meter scale; questions included whether or not the child thought the device was friendly, smart, alive, trustworthy, etc. The data from the device were transcribed and analyzed to determine usage and types of interactions with the device.
When young children (between 5 and 6 years old) talk to these devices, Lovato et al. (2019) found that they most frequently ask questions relating to science and technology, culture, or practical matters (like the current weather). These conversational moments are of particular interest for understanding the social dynamics between children and these devices. Research before the widespread adoption of these devices showed that children can often distinguish between living and non-living things, but may still ascribe psychological qualities to non-living objects like robots (Jipson & Gelman, 2007). This has implications for what children believe about digital assistants, if they do indeed perceive these devices in ways similar to robots and smart toys. Recent research has sought to expand this idea to understand the relational qualities of children and virtual assistants.
Lovato et al.’s (2019) study also asked children about their conceptions of the smart speaker they were using. While many children did not give the speaker a name and described it simply as a speaker, many did attribute human actions to it. When asked how the device knows the answers to their questions, most described the speaker as “looking up” the answer on something like a smartphone or an online search engine. Eleven of the 18 children described the device as alive because it could talk or sounded like a person (Lovato et al., 2019).
In another study of children aged 6-10 years old, 93% of participants said that a digital assistant they were familiar with was smart, and 65% responded that the device can be a friend. However, 83% agreed that it is okay for someone to sell the digital assistant, furthering evidence from other research that children’s understanding of digital assistants is nuanced in relation to various attributions of sentience and autonomy (Girouard-Hallam et al., 2021; for further examples of nuanced interpretations see Lovato et al., 2019 and Xu and Warschauer, 2020).
In another similarity to how children interpret robot or smart toy attributes, there is evidence that age plays a role in a child’s understanding of digital assistants, with younger children ascribing more human-like qualities to the digital assistant or more frequently believing it to be human (ex. Andries & Robertson, 2023; Girouard-Hallam et al., 2021; Xu & Warschauer, 2020). In Girouard-Hallam et al. (2021), for example, exact age, but not experience with the internet, predicted whether children would want to spend time with the device when lonely, with younger children espousing that preference more frequently. Garg & Sengupta (2020) note in their research that young children (5-7) were most likely to ascribe person-like qualities to these devices and believe they had feelings, thoughts, and intentions, while older children (7-15) often tested the intelligence of the devices but believed them helpful for studying, managing schedules, and entertainment.
When exploring politeness toward digital assistants, preliminary research with adults on digital assistant “wakewords,” which awaken the digital assistant, suggested that words and phrases that syntactically prime users to follow with a full sentence (ex. “Excuse me, Alexa”) may actually be more effective at increasing politeness than a phrase like “Alexa, please . . .” (Wen et al., 2023). Research with children is emerging and mixed; one study shows that many children believe it is not acceptable to be rude to the device (Andries & Robertson, 2023), but other research has shown that children may be less likely to try to repair miscommunication with a voice assistant than in a voice-only human interaction, and may share less information with voice assistants about a task (Gampe et al., 2023). Social emotional learning (SEL) skills are already embedded into some digital assistants. However, preliminary analyses find them lacking in the ability to effectively instill SEL skills due to design attributes including minimal or non-existent meaningful contingent responses (Fu et al., 2022). A recently published review acknowledges the concern that impolite exchanges between children and digital assistants may affect their communications with people, but notes “no scientific evidence can be considered statistically significant to provide a definite answer to the debate since experiments have been conducted on a small number of participants” (Ribino, 2023). Further research is needed to understand this relationship.
There is also recent evidence that children may form parasocial relationships with digital assistants in similar ways to their favorite media character or smart toy (Girouard-Hallam et al., 2021; Hoffman et al., 2021). While there is some evidence that parents may be concerned about this dynamic (ex. Garg & Sengupta, 2020), more research will be needed to understand the relationships between children and digital assistants and parental views of those parasocial connections.
Parenting, Family Dynamics, and Digital Assistants
Just as children and adults may develop personal beliefs about and relationships with digital assistants, recent research is also seeking to understand how digital assistants may affect family dynamics. In addition, as this new technology enters the family sphere, parents’ opinions of digital assistants and their hopes and fears about how these devices can support their child(ren)’s learning are critical pieces of knowledge for evaluating the adoption and usage of these devices, especially as adoption grows.
Safety, both physical and digital, is a major consideration of parents even before a smart device like a digital assistant is added to the home, and parents continue to assess developmental needs and usage patterns that affect how their children interact with home technology, and therefore may affect safety, even after it is integrated into their lives (Sun et al., 2021). In a study where parents viewed interactions between two children, a child and a robot, and a child and a digital assistant, privacy concerns were significantly higher in the digital assistant condition (Szczuka et al., 2021). Older children may also share these security and privacy concerns (Garg & Sengupta, 2020).
When a family does choose to use a digital assistant, its presence can affect familial interactions in myriad ways, both positive and negative. Scaffolding, also called fostering communication, is very common in family interactions with digital assistants. Parents and older children, for example, have been shown to support younger children in using digital assistants effectively, or to help a digital assistant understand something by repeating a family member’s request (Beirl et al., 2019; Beneteau et al., 2020). In terms of family cohesion, while the digital assistant’s presence resulted in moments of shared laughter, it also sometimes exacerbated family conflict, such as when family members with different tastes tried to choose music to play, or when one family member cut off whatever the digital assistant was doing by telling it to stop (Beirl et al., 2019; Beneteau et al., 2020). Parents may also use digital assistants to augment their parenting and incorporate them into family processes that mediate behavior or support autonomy (Beneteau et al., 2020).
More About the Methods: Beirl et al., 2019
Six families recruited from the South East UK were given an Alexa product to use at home over 3 weeks. A researcher visited each family for approximately an hour at the beginning and end of the study to review their experience; these visits were recorded on camera. The Alexa was programmed with three skills to try over the first week, and then families were able to download additional skills for the following two weeks. All of the interactions with Alexa were recorded and transcribed for analysis.
Parent interviews about their views of digital assistants supporting SEL skills found that while parents positively envisioned ways in which digital assistants could support family norms and encourage politeness, they also expressed concerns about the technology displacing human interactions, disrupting their parental relationship with their child, and teaching values misaligned with their own (Fu et al., 2022). Further research on SEL and digital assistants will be imperative to understand how families would prefer to use digital assistants in a learning context.
Takeaways and Further Questions
- Children can form parasocial relationships (one-way emotional attachments) with media characters. Early research indicates that children may form similar relationships with AI-enabled digital assistants. Further research is needed to more deeply understand how these relationships can be leveraged for developmental, educational, and wellbeing benefits.
- Children’s attribution of intelligence and autonomy to AI-enabled toys, robots, and digital assistants is nuanced. Intelligence is often attributed to these devices but autonomy to do things like vote or not be owned is attributed far less often. Further study is needed to better understand these attributions and their implications.
- While there is some research on how children and adolescents interact with chatbots, the focus tends to be on educational and medical capacities (ex. Thompson & Baranowski, 2019). Given the rapid expansion of public availability of these tools, further research is needed to understand how children and adolescents are interacting with chatbots for entertainment, and the impacts of these interactions on young users’ wellbeing and development.
- With the rapid adoption of digital assistants and smart speakers in households around the globe, it is imperative to understand how children comprehend and critically examine information offered by these devices.
- With increasing age, children are less likely to seek out devices when they are lonely and less likely to ascribe humanistic qualities to AI-enabled characters and devices. There is a suggestion that a shift happens around seven years old, owing to children’s increasing ability to distinguish between reality and fantasy at that developmental stage, but no clear “cut off” has yet been determined at which we can be reasonably certain of a child’s ability to understand the limitations of an AI character (and the implications of that shift).
- There is recent evidence that using digital assistants within a family context can both help and hinder familial interactions. While digital assistants may support communication between parents, siblings, and young children learning how to interact with the device, they may also exacerbate or create tension by the nature of communal access. More research is needed to understand the complex needs of families using digital assistants, as well as to discover parental views on educational uses of digital assistants.