Valuable Insights Gained at the Family Online Safety Institute Conference

The Lab's Dr. David Bickham on a FOSI 2023 panel

Earlier this week, our team participated in the Family Online Safety Institute (FOSI) Annual Conference in Washington D.C., which focused on new frontiers in online safety. At this conference, FOSI unveiled their latest research report on Generative AI, and nearly all of the panels and breakout sessions were focused on various facets of online safety in the age of Artificial Intelligence (AI).

As Arjun Venkataswamy of Amazon shared during a morning plenary on AI and the future of learning, these tools can be viewed as content collaborators and process supports, accelerating the early stages of writing so that humans can move more quickly to the critical thinking work it involves.

To stay on trend, and to test this theory, we decided to write the first draft of this post in ChatGPT!

Youth Insights

During a session titled Listen Up: What Young People Want from Platforms, Policymakers, & Parents, we heard directly from young adults who described their experiences online and offered recommendations for how to better support the youngest members of the digital ecosystem. Some of their suggestions included:

  • Prioritize youth voices and opinions when making policy and platform design decisions. Consult users of all ages at key points in development processes.
  • Recognize that not all youth are the same; they face diverse challenges and have different needs when they engage with technology. It’s important to listen to what young people have to say and to seek input from a truly diverse set of young stakeholders.
  • Pursue ongoing conversations about the digital world, even when everything seems to be going well. Viraj Doshi from Snap noted in an earlier session that Snap’s data show that teens whose parents or primary caregivers have open conversations with them about their digital lives have more trusting relationships and better digital wellness outcomes overall. Matthew Johnson from Media Smarts called this ongoing conversation the “most important thing you can do to keep kids healthy and safe online”.
  • Ensure those conversations don’t happen only when things have gone awry. One young panelist noted that “[my parents] only talk to me about the internet when I’m in trouble.” NAMLE’s Megan Fromm referenced this concept in a later session, highlighting the importance of removing shame from conversations about online engagement so they don’t feel accusatory and instead encourage open, helpful discussion about important safety and wellness topics.
  • Young people are aware of and troubled by unrealistic beauty standards, lack of diverse representation, and the effects of algorithms on the potentially harmful messaging they receive. Youth panelist Abby Schmid of LGBT Tech reinforced the importance of young people being able to curate their online experiences through Settings and in-platform interfaces, based on what they know is best for them. Our recently published Recommendations for Industry offers simple design shifts that tech developers can implement to allow users flexible ways to individualize their experience.

Security & Privacy

Several sessions focused on helping young people stay safe and safeguard their privacy within AI-enabled experiences. Tom He of Mattel reminded us that we are in the “first phase of AI” and there is much still to learn, not only about potential misuse but also opportunities.

Conversations highlighted the complexity of balancing privacy with parental monitoring. It is a nuanced and delicate issue, and designing accessible user supports for emerging technologies can be exceptionally challenging.

  • Andrew Hasbun of Uber noted that the company addresses these issues through a lens of consent: if everyone is on the same page (in Uber’s case, the teen rider, the parent, and the driver), fewer issues are likely to occur.
  • In Snap’s case, as Viraj Doshi shared, the company works to balance teen agency and parental monitoring. Teens can see their parent/guardian’s view within an ecosystem designed to spark conversations between teens and their parents or primary caregivers about what they are doing and who they are engaging with on Snap. Learn more about this issue in our Research Brief, Safety and Surveillance Software Practices as a Parent in the Digital World.
  • We also appreciated Xbox’s Kim Kunes reminding the audience that plenty of internal research goes into how to keep users engaged, and that the same research can and should be leveraged to keep people safe while they’re on the platform.

Finally, Anup Kaneri of Smith Micro highlighted the importance of ensuring that educational tools for learning about security, privacy, and emerging tech actually reach young people, their families, and their educators. We’ll share recommendations on how safety tools and community guidelines can best be built into platform design in our upcoming series for Industry, The Short Answer.

Mental Health & Wellness

The thread that tied all sessions together was a focus on young people’s long-term wellness within a digitally saturated, AI-enabled world. Mental health outcomes are based on a complex set of variables. As Mitch Prinstein of the American Psychological Association noted, “I don’t think that we have the research to demonstrate that social media or tech is the singular cause of the youth mental health crisis.” Even if it’s not the root cause, it is a key element of young people’s lives. We can’t address their mental, social, and emotional health without addressing their online experiences.

Many speakers, including youth panelists, agreed that it’s not age but competency that determines a child’s readiness for a given online experience. Children develop and learn new skills and approaches at different rates and in different ways. As Dr. Bickham notes, we have an obligation to prepare our kids for AI and for all technologies. Some suggestions panelists offered for doing so include:

  • Model the behaviors you want to see (one of our 5Ms!). As adults, we need to learn to set aside our own technology, moderate our own media habits, and talk to the children in our care about what we’re doing with our tech and why.
  • Support the development of healthy tech behaviors from a young age. Richard Culatta of the International Society for Technology in Education (ISTE) noted that, if you don’t provide learning opportunities throughout the developmental span by talking to kids about tech and media as they grow up, you are ceding the opportunity to teach key lifelong skills and behaviors to their peers. Caregivers can begin conversations on digital literacy by encouraging youth to be curious about why these tools exist and allowing them the autonomy to learn more.
  • Teach young people to reflect on their media habits. One recommendation, from Megan Fromm of NAMLE, is to take three days and, each time you or your child reaches for your phone or watches a TV show, do a quick emotional check: How are you feeling when you reach for the media experience, and how are you feeling afterward? Such self-reflection can help you understand what purpose your tech and media habits are serving, and enable you to either lean into those purposes or find more productive solutions in the future.

Overall, a key takeaway for our team was the emphasis on digital literacy and fun, innovative learning formats as we enter a new AI-augmented age of technology. We were excited by the potential of leveraging the reach and accessibility of popular platforms to educate young people for the digital present and future. These themes will be integrated into our 2024 research agenda and beyond, to ensure we are learning and applying the most up-to-date information about how kids are using and are affected by tech and media.

Verdict: Inputting our notes into ChatGPT saved a substantial amount of time in laying out an outline, proposing key themes, and deciding on introductory statements. We still had to organize the information in a way that achieved our goal of sharing what we learned, and our unique point of view, with readers. True to Arjun’s prediction, AI accelerated our movement to the critical thinking part of writing!