It Will Take a Village to Ensure an Authentic Future for Generation Beta
By: Susan-Caitlyn Seavey
Introduction
One of the many glaring issues that future generations will face is the decline in frequency of in-person human interactions. Today’s technology, especially artificial intelligence (AI), offers unparalleled tools that can be used for the betterment and progression of humanity. For example, new customer service bots, called “conversational agents,” are responding to customer inquiries with efficient, personalized, and human-like responses, “reshaping how we engage with [ ] companies, [and] creating a world where efficiency meets empathy–or at least an impressively convincing facsimile of it.”[1] AI software also provides efficiency for individuals through multitasking functions, auto-generated answers to questions, and draft responses to texts and emails, saving the user valuable time. However, this technology can also create unrealistic standards and attractive environments that isolate individuals from their reality. Around the globe, AI technology is becoming more normalized and ubiquitous, with software like Copilot in the workplace and AI robots as companions, friends, and romantic partners at home. This rapid development is “particularly concerning given its novelness, the speed and autonomy at which the technology can operate, and the frequent opacity even to developers of AI systems about how inputs and outputs may be used or exposed.”[2] We face the challenge of balancing the efficiency and progress this technology offers against the risk of being fully consumed by it, a cost that would fall hardest on the youngest members of society.
This powerful technology should be used to embrace reality and to continue striving for a better world, one that actually exists off a screen. Jennifer Marsnik summarized this challenge well by contemplating how society can “maintain authenticity, human intelligence and personal connection in a landscape increasingly dominated by algorithms, data and automation.”[3] Young minds are the most susceptible to the unrealistic standards and depictions AI can create. Considering the difficulty even adults can have in determining whether a visual is real or AI-generated, the young generations, with their still-developing minds, will grow up in a landscape of not always knowing what is authentic and what is not. If society fails to provide safeguards and implement protections around children and their use of our ever-progressing technology, future generations could end up stuck in a perpetual cycle of unrealistic expectations and disappointment in the real world, prompting more isolation and leading to the degradation of communities. Preserving authentic relationships and interactions with the real world will require a village: Congress must support new and developing legislation for online safety for children; companies should adopt management frameworks and clearinghouse functions to ensure transparency and accountability in consumer engagement; and parents, teachers, and community leaders must work together to encourage social-emotional learning and in-person interactions among children and teens at home, at school, and in their communities.
The Coronavirus as a Model
The Coronavirus (COVID) pandemic gave the population a unique opportunity to experience a largely online, integrated world. The pandemic launched schools and workplaces into new methods of remote online learning and work at an intense level. While these years showed us that we could keep operating fully remotely, they also highlighted the downsides of isolation and revealed the negative effects on people who endure a lack of in-person engagement for long periods of time. We know that mental health suffers and that social interaction is a necessity for individual well-being and for community health. According to an article in The Atlantic, “face-to-face socializing among [American] teenagers fell by more than 45 percent” between 2003 and 2022.[4] The pandemic is responsible for some of this decline, but the erosion of in-person interaction predates the pandemic. “For many young people, the lack of face-to-face interaction that was imposed by the pandemic intensified their mental health struggles and diminished their social skills at an age when they were still developing.”[5] Tarani Chandola, a professor of medical sociology, and postdoctoral research fellow Patrick Rouxel[6] saw COVID as a natural experiment that forced diminished in-person contact and interactions and produced an increase in mental health problems, especially in young adults.[7] Chandola and Rouxel’s research drew on life-course theories and research done in the United Kingdom by the Centre for Longitudinal Studies through surveys of four national cohort studies during the COVID pandemic.[8] The results showed that the “reduction in mental health associated with reduced in-person social contact during the pandemic was not offset by online or telephone modes of social contact.”[9] Their study found “consistent evidence of a small but statistically significant effect of less frequent social contact on anxiety-depression,” and specifically, “young adults who increased their online social media frequency during the pandemic experienced a deterioration in mental health.”[10]
Technology impacts the social development and well-being of children and teens in many ways. Through a variety of interpersonal experiences and interactions, “children develop a whole host of skills they’ll use for the rest of their lives–capacities such as behavioral and emotional regulation, feelings of connection and caring for others, self-confidence, and the ability to understand how their actions and behaviors affect others.”[11] Social psychologist Jonathan Haidt “notes that a ‘child’s brain is already 90 percent of its adult size by about age six, [and the subsequent] ten to fifteen years are about learning norms and mastering skills—physical, analytical, creative, and social’.”[12] Several “experts believe the centrality of screens in the lives of young people is playing a significant part in the crisis of anxiety, depression, and loneliness among adolescents.”[13] Scientific research and the experience of COVID have shown that social interactions and in-person engagement are key to a healthy and smoothly functioning society.
It Will Take a Village to Ensure an Authentic Future
To ensure a sound future, children’s engagement with technology will require the attention and intentions of the adults in their lives, as well as the responsibility of companies and the government. While legislation and policies are immensely important to children’s safe engagement with technology, parents, teachers, and community members at large also have a key role to play in the authentic development of future generations. As Amina Fazlullah and Ariel Johnson note, “AI is transforming how children learn and live[;] and policymakers, industry, and educators owe it to the next generation to set in place a responsible policy that embraces this new technology while at the same time ensuring all children’s well-being, privacy, and safety is respected.”[14] To instill and reinforce healthy social connections and the importance of building authentic relationships, “a collective effort [of] parents, educators, communities, and society as a whole [is required] to address th[ese] issue[s] effectively.”[15]
One high school English teacher reflects on the academic year, concerned with how “dozens of [her] students consistently found it preferable to use Google for a one-sentence assignment submission rather than . . . think[ing] of an original idea . . . [and how] [n]o amount of conferencing, contacting parents, or detracting from grades seemed to deter the behavior entirely; many of the students who were penalized early still made multiple cheating attempts throughout the year.”[16] It is true that technologies like ChatGPT can be useful tools, but this technology has “reached a state that would allow any of these kids to submit work without ever opening a book or thinking at all about what the text means—without ever really learning anything.”[17] Kay posits that if this technology is not regulated, the upcoming generations “will waste away consuming content but never meaningfully doing our own creative work . . . chang[ing] our ability to be satisfied with ourselves and our lives, and los[ing] access to beauty and human connection that we currently have.”[18] In October 2023, a Gallup poll reported that the average American teenager spends 4.8 hours on social media per day.[19] That does not even account for other forms of screen time, like television or schoolwork. Notably, this data is a couple of years old, and the number of hours has likely increased since then. While it is true that “social media platforms have the potential to foster connection . . . especially [ ] for young people who might otherwise be isolated,”[20] there are developmental and mental health risks when children “use screens to replace in-person interactions rather than complement them.”[21]
One approach to solving this issue pairs the creation and enforcement of technology boundaries with incentives for in-person interaction. Adults can encourage face-to-face interactions through involvement in “sports, clubs, community events, or volunteer work.”[22] Community institutions, like chambers of commerce, rotary clubs, and libraries, can collaborate on local events and volunteer opportunities to engage families in person and foster community connections and relationships. Other suggestions include parents and teachers creating phone-free spaces, school policies prohibiting cell phone use during instructional time, encouraging and supporting social-emotional learning (commonly implemented in middle schools), and creating family media plans to help balance screen time.[23] Finally, adults can foster open communication and human interaction by leading by example and personally disengaging from technology themselves.[24] Adults can and should strive to help “adolescents develop a healthy relationship with their devices and to discover the many rewards of engaging with the world and developing in-person social connections.”[25] By “demonstrat[ing] the importance of maintaining meaningful relationships by actively engaging in offline social activities [and] prioritizing face-to-face interactions,”[26] children and teens can learn to model these intentional behaviors, leading to a healthier future.
A Framework and Policy Guidelines
In addition to parents, teachers, and community members, we also need clear laws and policies outlining protections and safeguards for children and teens interacting with technology. There are “serious worries” that AI systems “more capable than humans across all domains” will “escape human control . . . absent meaningful regulatory intervention.”[27] Because code does not “explicitly determine these systems’ behavior, . . . it is impossible to demonstrate compliance with a given regulatory specification.”[28] “[P]olicymakers must recognize that code is no longer law . . . [s]ince its behavior is neither expressed intentionally by designers in software program code nor legible (yet) by examining the program code and its massive array of tuned parameters, [leading to] significant implications for debates about the governance and regulation of generative AI.”[29] Fazlullah and Johnson believe that “the federal government should develop clear prohibitions, enforce them, and serve as a national clearinghouse for AI K-12 educational policy.”[30] They suggest three policy agendas: “build a coordinated framework for AI Safety; champion legislation to support youth privacy and online safety in AI; and ensure every child can benefit from the promise of AI.”[31]
The framework for AI safety should involve developing “safety guidance for developers, including specific prohibitions to limit harmful designs and inappropriate use” and should “direct the National Institute of Standards and Technology (NIST) to serve as the lead organizer for federal efforts on AI safety for children.”[32] The legislature should “support the passage of online safety laws that address harmful design features” and “pass youth privacy laws to explicitly address AI data use issues, including prohibiting developing commercial models from students’ educational information, with strong enforcement mechanisms.”[33] These structures and safeguards should be created in a way that prevents deepening the digital divide by “support[ing] comprehensive digital literacy efforts.”[34] Furthermore, companies should be incentivized to act, and policy should include a private right of action. The policies and frameworks cannot be voluntary, “enabling companies to pick and choose whether or not to follow recommendations.”[35] We have seen that “[w]hen we do not put in place guardrails for tech,” there is an “increased risk of child addiction, depression and self-harm.”[36]
Congress and administrative agencies could also look to other countries and consider what steps they are taking for their younger generations. The European Union passed its AI Act in early 2024, which may serve as a useful model for approaching and regulating AI systems in the United States. Legislators can also consider what individual states are doing; innovative policy proposals aiming to protect and regulate child and teen interactions with technology are moving through Colorado and California, among other states.[37] An AI framework can also work in tandem with existing protective laws like the Children’s Online Privacy Protection Rule (COPPA) and the Family Educational Rights and Privacy Act (FERPA). The FTC approved amendments updating COPPA on January 16, 2025, including new requirements on what “companies must include in their direct notice disclosures to parents prior to collecting from and using their children’s personal information.”[38] Another change requires “separate parental consent for the disclosure of a child’s personal information to a third party for target advertising or other uses . . . unless such disclosure [is] necessary to provide the product or service.”[39]
In conclusion, it will take “great effort to preserve human values when new technologies make it so easy to maximize efficient operations.”[40] These efforts require the work of parents, teachers, community leaders, legislators, regulators, and companies. “AI holds the potential for both unprecedented benefits and unprecedented harms . . . [and] [t]here is an urgent need to build the regulatory paradigm and state capacity to meaningfully govern AI.”[41] It “is essential to take proactive steps and create environment[s] that promote healthy social interactions” amongst these young populations.[42] Society should not let technology and AI create a space where the young can disregard the imperfections and hostilities of the real world in exchange for the pseudo-safe and utopian environments that social media, advertisements, and the entertainment industry can create. This potential dissonance in understanding what is real and authentic versus what is not puts genuine relationships and experiences at risk when they do not live up to AI generated standards, affecting the health and well-being of the individual and rippling out to the communities which rely on relationships and connections to thrive.
References
[1] Craig S. Smith, Efficiency Meets Empathy: How AI is Redefining Customer Service, FORBES (Dec. 16, 2024), https://www.forbes.com/sites/craigsmith/2024/12/16/efficiency-meets-empathy-how-ai-is-redefining-customer-service/ (discussing another company called Sierra, which is building AI agents and a “digital workforce imbued with the personality of your favorite brands.” These AI agents can “process exchanges, update subscriptions, and even schedule deliveries.” Sierra co-founders Taylor and Bavor aim to help companies engage with their customers, and state that “conversational AI represents a new, decade-defining category of software . . . hopefully driv[ing] productivity in ways we haven’t yet imagined.”).
[2] Amina Fazlullah & Ariel F. Johnson, An Agenda for Ensuring Child Safety in the AI Era, FEDERATION OF AMERICAN SCIENTISTS (Jan. 12, 2025), https://fas.org/publication/ensuring-child-safety-ai-era/ (Ariel is the founder of Digital Smarts Law & Policy LLC, where she provides research and advocacy support to nonprofits and organizations to improve their digital spaces, and an adjunct professor of privacy law at Cleveland State University’s College of Law. Amina is the head of Tech Advocacy Policy in Common Sense Media’s DC office and a member of the FCC’s Consumer Advisory Committee.).
[3] Jennifer Marsnik, Authenticity in an AI World, ALM (Feb. 1, 2025), https://www.lawjournalnewsletters.com/2025/02/01/authenticity-in-an-ai-world/?srsltId=AfmBOoql4sEV8qyOqrgu9LY1qYBEYtnN_HaJ_fwdBHXCBHPUzVvquNrw&slreturn=20250209122115.
[4] The Decline of Teenage Social Skills, Effective School Solutions (Apr. 26, 2024), https://effectiveschoolsolutions.com/teenage-social-skills/.
[5] Id.
[6] See Patrick Rouxel & Tarani Chandola, No Substitute for In-Person Interaction: Changing Modes of Social Contact during the Coronavirus Pandemic and Effects on the Mental Health of Adults in the UK, NATIONAL LIBRARY OF MEDICINE; NATIONAL CENTER FOR BIOTECHNOLOGY INFORMATION (May 15, 2023), https://pmc.ncbi.nlm.nih.gov/articles/PMC10189533/ (Authors are both professionals employed at the University of Hong Kong).
[7] Id.
[8] Id.
[9] Id.
[10] Id.
[11] The Decline of Teenage Social Skills, supra note 4.
[12] Id.
[13] Id.
[14] Fazlullah & Johnson, supra note 2.
[15] Citizen Advocates, Combating the Decreased Social Interaction Among Teens: Steps for a Healthier Future, CITIZEN ADVOCATES (Jun. 30, 2023), https://citizenadvocates.net/blog/it-is-essential-to-take-proactive-steps-and-create-an-environment-that-promotes-healthy-social-interactions-among-teenagers/.
[16] Natalie Kay, AI is here, and it’s harming future generations (and the present ones), MEDIUM (Dec. 9, 2022), https://medium.com/predict/ai-is-here-and-its-harming-future-generations-and-the-present-ones-cf40985b9bbf.
[17] Id.
[18] Id.
[19] Jonathan Rothwell, Teens spend an average of 4.8 Hours on Social Media Per Day, GALLUP (Oct. 13, 2023), https://news.gallup.com/poll/512576/teens-spend-average-hours-social-media-per-day.aspx.
[20] The Decline of Teenage Social Skills, supra note 4.
[21] Id.
[22] Citizen Advocates, supra note 15.
[23] See The Decline of Teenage Social Skills, supra note 4.
[24] Id.
[25] Id.
[26] Citizen Advocates, supra note 15.
[27] Brian Judge et al., When code isn’t law: rethinking regulation for artificial intelligence, OXFORD UNIVERSITY PRESS, at 2 (May 29, 2024), https://doi.org/10.1093/polsoc/puae020.
[28] Id.
[29] Id.
[30] Fazlullah & Johnson, supra note 2.
[31] Id.
[32] Id.
[33] Id.
[34] Id.
[35] Id.
[36] Id.
[37] See Shelby Dolen, Colorado Legislature Passes Children’s Data Privacy Bill, HUSCH BLACKWELL (May 16, 2024), https://www.bytebacklaw.com/2024/05/colorado-legislature-passes-childrens-data-privacy-bill/; and Jeewon K. Serrato et al., California’s AI Laws are Here–Is Your Business Ready?, PILLSBURY WINTHROP SHAW PITTMAN LLP (Feb. 7, 2025), https://www.pillsburylaw.com/en/news-and-insights/california-ai-laws.html.
[38] FTC Updates to the COPPA Rule Impose New Compliance Obligations for Online Services That Collect Data from Children, GIBSON DUNN (Jan. 28, 2025), https://www.gibsondunn.com/ftc-updates-to-coppa-rule-impose-new-compliance-obligations-for-online-services-that-collect-data-from-children/ (client alert summarizing the complexities of the amended COPPA Rule and the need for companies to realign their compliance within tight timeframes as child safety remains a central concern for the FTC).
[39] Id.
[40] Tim Wu, Will Artificial Intelligence Eat the Law? The Rise of Hybrid Social-Ordering Systems, 119 COLUM. L. REV. 2001, 2028 (2019), https://scholarship.law.columbia.edu/cgi/viewcontent.cgi?article=3602&context=faculty_scholarship.
[41] Judge et al., supra note 27.
[42] See Citizen Advocates, supra note 15.