
ALGORITHMS, ADDICTION, AND ADOLESCENT MENTAL HEALTH: An Interdisciplinary Study to Inform State-level Policy Action to Protect Youth from the Dangers of Social Media

Published online by Cambridge University Press:  12 February 2024

Nancy Costello*
Affiliation:
Michigan State University, East Lansing, MI, USA
Rebecca Sutton
Affiliation:
Michigan State University, East Lansing, MI, USA
Madeline Jones
Affiliation:
Michigan State University, East Lansing, MI, USA
Mackenzie Almassian
Affiliation:
Michigan State University, East Lansing, MI, USA
Amanda Raffoul
Affiliation:
Harvard Medical School, Cambridge, MA, USA
Oluwadunni Ojumu
Affiliation:
Harvard College, Cambridge, MA, USA
Meg Salvia
Affiliation:
Harvard T.H. Chan School of Public Health, Cambridge, MA, USA
Monique Santoso
Affiliation:
Stanford University, Stanford, CA, USA
Jill R. Kavanaugh
Affiliation:
Harvard T.H. Chan School of Public Health, Cambridge, MA, USA
S. Bryn Austin
Affiliation:
Harvard T.H. Chan School of Public Health, Cambridge, MA, USA
Corresponding author: Nancy Costello; Email: costel29@msu.edu

Abstract

A recent Wall Street Journal investigation revealed that TikTok floods child and adolescent users with videos of rapid weight-loss methods, including tips on consuming fewer than 300 calories a day and videos promoting a “corpse bride diet” that show emaciated girls with protruding bones. The investigators created a dozen automated accounts registered as 13-year-olds and found that TikTok algorithms fed those accounts tens of thousands of weight-loss videos within just a few weeks of joining the platform. Emerging research indicates that these practices extend well beyond TikTok to other social media platforms that engage millions of U.S. youth on a daily basis.

Social media algorithms that push extreme content to vulnerable youth are linked to an increase in mental health problems for adolescents, including poor body image, eating disorders, and suicidality. Policy measures must be taken to curb this harmful practice. The Strategic Training Initiative for the Prevention of Eating Disorders (STRIPED), a research program based at the Harvard T.H. Chan School of Public Health and Boston Children’s Hospital, has assembled a diverse team of scholars, including experts in public health, neuroscience, health economics, and law with specialization in First Amendment law, to study the harmful effects of social media algorithms, identify the economic incentives that drive social media companies to use them, and develop strategies that can be pursued to regulate social media platforms’ use of algorithms. For our study, we have examined a critical mass of public health and neuroscience research demonstrating mental health harms to youth. We have conducted a groundbreaking economic study showing nearly $11 billion in advertising revenue is generated annually by social media platforms through advertisements targeted at users 0 to 17 years old, thus incentivizing platforms to continue their harmful practices. We have also examined legal strategies to address the regulation of social media platforms by conducting reviews of federal and state legal precedent and consulting with stakeholders in business regulation, technology, and federal and state government.

While nationally the issue is being scrutinized by Congress and the Federal Trade Commission, quicker and more effective legal strategies that would survive constitutional scrutiny may be implemented by states, such as the Age Appropriate Design Code Act recently adopted in California, which sets standards that online services likely to be accessed by children must follow. Another avenue for regulation may be through states mandating that social media platforms submit to algorithm risk audits conducted by independent third parties and publicly disclose the results. Furthermore, Section 230 of the federal Communications Decency Act, which has long shielded social media platforms from liability for wrongful acts, may be circumvented if it is proven that social media companies share advertising revenues with content providers posting illegal or harmful content.

Our research team’s public health and economic findings, combined with our legal analysis and resulting recommendations, provide innovative and viable policy actions that state lawmakers and attorneys general can take to protect youth from the harms of dangerous social media algorithms.

Type: Articles
Copyright: © 2024 The Author(s)


References

1 Complaint at para. 166, Spence v. Meta Platforms, Inc. (N.D. Cal. 2022) (No. 22CV03294), 2022 WL 2101825, at *135 [hereinafter Spence Complaint].

2 Id. at para. 171.

3 Alexis downloaded Instagram to an electronic tablet first and then later, in 2014, to her cell phone. Id. at para. 171, 189(g).

4 Id. at para. 171.

5 Id. at para. 192–195.

6 Id. at para. 187.

7 Id. at para. 153.

8 Id. at para. 170.

9 Id. at para. 204, 207.

10 Id. at para. 204–205.

11 Id. at para. 204–206.

12 Id. at para. 216.

13 Id. at para. 36-41.

14 Tawnell D. Hobbs et al., ‘The Corpse Bride Diet’: How TikTok Inundates Teens with Eating-Disorder Videos, Wᴀʟʟ Sᴛ. J. (Dec. 17, 2021, 10:45 AM), https://www.wsj.com/articles/how-tiktok-inundates-teens-with-eating-disorder-videos-11639754848 [https://perma.cc/9RAH-NXB5].

15 Id.

16 Press Release, TikTok bombards teens with self harm and eating disorder content within minutes of joining the platform, Cᴇɴᴛᴇʀ ғᴏʀ Cᴏᴜɴᴛᴇʀɪɴɢ Dɪɢɪᴛᴀʟ Hᴀᴛᴇ (Dec. 15, 2022), https://counterhate.com/blog/tiktok-bombards-teens-with-self-harm-and-eating-disorder-content-within-minutes-of-joining-the-platform/ [https://perma.cc/9RNP-PGYN0].

17 Id.

18 Id.

19 Id.

20 Spence Complaint at para. 33-41.

21 Amanda Raffoul et al., Estimated Social Media Platform Revenue from U.S. Children [in preparation] (2023).

22 Communications Decency Act, 47 U.S.C. §230 (2018).

23 Esteban Ortiz-Ospina, The Rise of Social Media, Oᴜʀ Wᴏʀʟᴅ ɪɴ Dᴀᴛᴀ (Sept. 18, 2019), https://ourworldindata.org/rise-of-social-media [https://perma.cc/5PS5-9J29].

24 Mᴀʀʏᴠɪʟʟᴇ Uɴɪᴠᴇʀsɪᴛʏ, The Evolution of Social Media: How Did It Begin, and Where Could It Go Next?, https://online.maryville.edu/blog/evolution-social-media/#:~:text=In%201987%2C%20the%20direct%20precursor,social%20media%20platform%20was%20launched [https://perma.cc/HQ9N-B6J9] (last visited July 8, 2022).

25 Ortiz-Ospina, supra note 23.

26 Id.

27 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Research Center (Aug. 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022 [https://perma.cc/243K-8VY6].

28 Complaint at 11, Rodriguez v. Meta Platforms, Inc., No. 3:22-cv-00401 (N.D. Cal. Jan. 20, 2022), 2022 WL 190807, at *1 [hereinafter Rodriguez Complaint].

29 Id. at 61.

30 Id. at 54.

31 Id. at 60.

32 Id. at 7.

33 Complying with COPPA: Frequently Asked Questions, Fed. Trade Comm’n (July 2020), https://www.ftc.gov/business-guidance/resources/complying-coppa-frequently-asked-questions [https://perma.cc/2UER-MSPU].

34 15 U.S.C. § 6502(a)(1).

35 15 U.S.C. §6501(8).

36 15 U.S.C. § 6502(a)(1).

37 Per the social media platforms’ terms of use, Facebook and Instagram, both under the Meta umbrella, require users to be at least thirteen years old to sign up for an account. Terms of Service, Facebook, https://www.facebook.com/terms.php [https://perma.cc/B9F5-J4PU] (last visited Apr. 17, 2023); Terms of Use, Instagram, https://help.instagram.com/581066165581870 [https://perma.cc/6DX3-BZMD] (last visited Apr. 17, 2023). Snapchat and Twitter likewise require that a user be at least thirteen years old to sign up for an account. Snap Inc. Terms of Service, Snap Inc. (Nov. 15, 2021), https://snap.com/en-US/terms [https://perma.cc/6NNQ-6VUD]; Terms of Service, Twitter (June 10, 2022), https://twitter.com/en/tos [https://perma.cc/H54J-UG5P]. TikTok requires a new user to pass through an age gate to guide that user into the right TikTok experience. TikTok for Younger Users, TikTok (Dec. 13, 2019), https://newsroom.tiktok.com/en-us/tiktok-for-younger-users [https://perma.cc/XHM3-WWKE]. In March 2023, TikTok, in collaboration with the Digital Wellness Lab at Boston Children’s Hospital, introduced a 60-minute daily time limit for United States users between the ages of thirteen and seventeen. If the screen limit is reached, teen users are prompted to enter a passcode to extend their screen time on the app; however, this screen limit feature can be disabled entirely or continuously extended. Users under thirteen years old in the United States are placed into the TikTok for Younger Users experience, which has additional privacy and safety protections designed specifically for this audience. For younger users, a 60-minute daily screen limit also applies but requires a parent or guardian to enter a passcode to enable an additional 30 minutes of watch time; again, this additional screen time can be continuously extended. See Cormac Keenan, New Features for Teens and Families on TikTok, TikTok (Mar. 1, 2023), https://newsroom.tiktok.com/en-us/new-features-for-teens-and-families-on-tiktok-us [https://perma.cc/69VT-KXJM]. Other countries, like China, impose stricter time restrictions on teen users.

38 See, e.g., Joseph Marks, App Makers Are Scooping Up Kids’ Data With Few Real Checks, Wᴀsʜɪɴɢᴛᴏɴ Pᴏsᴛ (June 9, 2022, 8:12 AM), https://www.washingtonpost.com/politics/2022/06/09/app-makers-are-scooping-up-kids-data-with-few-real-checks/ [https://perma.cc/Y4AF-TVRA]; Geoffrey A. Fowler, Your Kids’ Apps Are Spying on Them, Wᴀsʜɪɴɢᴛᴏɴ Pᴏsᴛ (June 9, 2022, 8:00 AM), https://www.washingtonpost.com/technology/2022/06/09/apps-kids-privacy// [https://perma.cc/983Q-WVVA].

39 See Jackie Snow, Why Age Verification Is So Difficult for Websites, Wall Street Journal (Feb. 27, 2022, 8:00 AM), https://www.wsj.com/articles/why-age-verification-is-difficult-for-websites-11645829728 [https://perma.cc/RE9R-AZM8].

40 See Kaitlin Woolley & Marissa A. Sharif, The Psychology of Your Scrolling Addiction, Harv. Bus. Rev. (Jan. 31, 2022), https://hbr.org/2022/01/the-psychology-of-your-scrolling-addiction [https://perma.cc/6GLB-PZTM].

41 For example, on a webpage with a news article about running, behavioral advertisements would be based on the user’s web-history. Perhaps the user has been frequently reading articles about how to lose weight and now receives a targeted ad on the article about running, detailing how many miles a day they need to run to lose a certain amount of weight. Under COPPA, this kind of targeted advertising is not allowed for children under thirteen without verifiable parental consent.

42 See Jonathan Mayer, Do Not Track Is No Threat To Ad-Supported Businesses, The Center for Internet and Society: Blog (Jan. 20, 2011, 2:12 AM), https://cyberlaw.stanford.edu/blog/2011/01/do-not-track-no-threat-ad-supported-businesses [https://perma.cc/3P3Z-5LAX].

43 For example, in a news article about running, a contextual advertisement could be an ad for running shoes.

44 Marks, supra note 38.

45 Mobile Apps: Google vs. Apple COPPA Scorecards (Children’s Privacy), Pɪxᴀʟᴀᴛᴇ 1, 1 (2022), https://www.pixalate.com/hubfs/Reports_and_Documents/Mobile%20Reports/2022/App%20Reports/Active%20Apps/Child-Directed%20Apps/Q1%202022%20-%20Apple%20vs.%20Google%20COPPA%20Scorecard%20Report%20-%20Pixalate.pdf [hereinafter Pɪxᴀʟᴀᴛᴇ].

46 “Pixalate used automated processing derived from a combination of signals (which at times is coupled with human intervention) to determine if an app is likely to be child-directed, including the app’s category, sub-category, content rating, and contextual signals (specifically, child-related keywords in app’s title or the app’s description).” Id.

47 Id. at 3.

48 15 U.S.C. § 6501(8).

49 “Pixalate calculates estimated programmatic ad spend through statistical models that incorporate programmatic monthly active users (MAU), the average session duration per user, the average CPM for the category of a given app, and ad density.” Pɪxᴀʟᴀᴛᴇ, supra note 45, at 16.
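The CPM-based estimate described in note 49 (monthly active users, average session duration, category CPM, and ad density) can be sketched in a few lines of arithmetic. This is only an illustration of the general approach, not Pixalate’s actual model; all function names, parameters, and figures below are hypothetical.

```python
# Illustrative sketch of a CPM-based programmatic ad-spend estimate.
# Every name and number here is hypothetical, not Pixalate's model.

def estimated_monthly_ad_spend(mau, avg_session_minutes, sessions_per_month,
                               ads_per_minute, avg_cpm_usd):
    """Estimate monthly programmatic ad spend for one app.

    mau: programmatic monthly active users
    avg_session_minutes: average session duration per user (minutes)
    sessions_per_month: average sessions per user per month
    ads_per_minute: ad density (ad impressions shown per minute of use)
    avg_cpm_usd: average cost per 1,000 impressions for the app's category
    """
    # Total impressions = users x sessions x minutes per session x ads per minute
    impressions = mau * sessions_per_month * avg_session_minutes * ads_per_minute
    # CPM prices 1,000 impressions, so divide by 1,000 before applying the rate
    return impressions / 1000 * avg_cpm_usd

# Hypothetical app: 2M users, 10-minute sessions, 20 sessions/month,
# 1 ad per minute, $3.50 CPM -> $1.4M in estimated monthly ad spend.
spend = estimated_monthly_ad_spend(2_000_000, 10, 20, 1, 3.50)
```

Summing such per-app estimates across child-directed apps is, roughly, how a platform-level revenue figure like the one in the authors’ economic study could be assembled.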

51 See, e.g., Ashley Johnson, AI Could Make Age Verification More Accurate and Less Intrusive, Info. Tech. & Innovation Found. (Apr. 5, 2023), https://itif.org/publications/2023/04/05/ai-could-make-age-verification-more-accurate-and-less-invasive/ [https://perma.cc/W2WA-D5U5].

52 Press Release, Fed. Trade Comm’n, FTC Report Warns About Using Artificial Intelligence to Combat Online Problems (June 16, 2022), https://www.ftc.gov/news-events/news/press-releases/2022/06/ftc-report-warns-about-using-artificial-intelligence-combat-online-problems?utm_source=govdelivery [https://perma.cc/NMD8-GV92].

53 Id.

54 Id.

55 15 U.S.C. § 6501(1).

56 Sen. Richard Blumenthal (D-CT) and Sen. Marsha Blackburn (R-TN) introduced The Kids Online Safety Act (KOSA), which would have given parents and users under seventeen the ability to opt out of algorithmic recommendations, prevent third parties from viewing a minor’s data, and limit the time kids spend on a platform.

57 [ADD INFRA CITE]

58 See Hilary Andersson, Social Media Apps are ‘Deliberately’ Addictive to Users, BBC (July 4, 2018), https://www.bbc.com/news/technology-44640959 [https://perma.cc/6SYB-QTHL].

59 See Christian Montag et al., Addictive Features of Social Media/Messenger Platforms and Freemium Games Against the Background of Psychological and Economic Theories, 16 Int’l J. Env’t Rsch. & Pub. Health 1, 4–6 (2019); Marco Zenone et al., The Paradoxical Relationship Between Health Promotion and the Social Media Industry, Health Promotion Prac. 1, 1–2 (2021); Thomas Mildner & Gian-Luca Savino, Ethical User Interfaces: Exploring the Effects of Dark Patterns on Facebook, CHI Conf. on Hum. Factors Computing Sys. Extended Abstracts 1, 2 (2021), https://dl.acm.org/doi/pdf/10.1145/3411763.3451659.

60 Betul Keles et al., A Systematic Review: The Influence of Social Media on Depression, Anxiety and Psychological Distress in Adolescents, 25 Int’l J. Adolescence & Youth 79, 84–86 (2020); Amy Orben, Teenagers, Screens and Social Media: A Narrative Review of Reviews and Key Studies, 55 Soc. Psychiatry and Psychiatric Epidemiology 407, 408–11 (2020); Candice Odgers & Michaeline Jensen, Annual Research Review: Adolescent Mental Health in the Digital Age: Facts, Fears, and Future Directions, 61 J. Child Psych. & Psychiatry 336, 337–41 (2020); Laura Vandenbosch et al., Social Media and Body Image: Recent Trends and Future Directions, 45 Current Op. Psych. 1, 2–3 (2022); Elizabeth Ivie et al., A Meta-Analysis of the Association Between Adolescent Social Media Use and Depressive Symptoms, 275 J. Affective Disorders 165, 168–71 (2020); Alyssa N. Saiphoo & Zahra Vahedi, A Meta-Analytic Review of the Relationship Between Social Media Use and Body Image Disturbance, 101 Computs. Hum. Behav. 259, 264–67 (2019); Jenna Course-Choi & Linda Hammond, Social Media Use and Adolescent Well-Being: A Narrative Review of Longitudinal Studies, 24 Cyberpsychology, Behav., & Soc. Networking 223, 227–32 (2021); Samantha Tang et al., The Relationship Between Screen Time and Mental Health in Young People: A Systematic Review of Longitudinal Studies, 86 Clinical Psych. Rev. 1, 9 (2021); Sophia Choukas-Bradley et al., The Perfect Storm: A Developmental-Sociocultural Framework for the Role of Social Media in Adolescent Girls’ Body Image Concerns and Mental Health, 25 Clinical Child & Fam. Psych. Rev. 681, 685–91 (2022); Ilaria Cataldo et al., Social Media Usage and Development of Psychiatric Disorders in Childhood and Adolescence: A Review, 11 Frontiers Psychiatry eCollection: 1, 4–8 (2021); Patti M. Valkenburg et al., Social Media Use and Its Impact on Adolescent Mental Health: An Umbrella Review of the Evidence, 44 Curr. Opin. Psychol. 58, 59–60 (2022); Bohee So & Ki Han Kwon, The Impact of Thin-Ideal Internalization, Appearance Comparison, Social Media Use on Body Image and Eating Disorders: A Literature Review, 20 J. Evid.-Based Soc. Work 55, 58–62 (2023).

61 Keles, supra note 60, at 88.

62 Id. at 80-81, 88.

63 Id. at 88.

64 See, e.g., Course-Choi & Hammond, supra note 60.

65 Sᴀʀᴀʜ Gʀᴏɢᴀɴ, Bᴏᴅʏ Iᴍᴀɢᴇ: Uɴᴅᴇʀsᴛᴀɴᴅɪɴɢ Bᴏᴅʏ Dɪssᴀᴛɪsғᴀᴄᴛɪᴏɴ ɪɴ Mᴇɴ, Wᴏᴍᴇɴ, ᴀɴᴅ Cʜɪʟᴅʀᴇɴ 4 (2nd ed. 2008).

66 Francesca Ryding & Daria Kuss, The Use of Social Networking Sites, Body Image Dissatisfaction, and Body Dysmorphic Disorder: A Systematic Review of Psychological Research, 9 Psych. Popular Media 412, 430 (2020).

67 Janet Treasure et al., Eating Disorders, 395 Lancet 899, 899 (2020).

68 Zachary J. Ward et al., Estimation of Eating Disorders Prevalence by Age and Associations with Mortality in a Simulated Nationally Representative US Cohort, 2 JAMA Network Open 1, 1 & 7 (2019).

69 See, e.g., Ryding & Kuss, supra note 66.

70 Christie N. Scollon, Research Designs, in R. Biswas-Diener & E. Diener, Noba Textbook Series: Psychology (2023), available at https://nobaproject.com/modules/research-designs.

71 Jolanda Veldhuis et al., Negotiated Media Effects. Peer Feedback Modifies Effects of Media’s Thin-Body Ideal on Adolescent Girls, 73 Appetite 172, 176–78 (2014).

72 Renee Engeln et al., Compared to Facebook, Instagram Use Causes More Appearance Comparison and Lower Body Satisfaction in College Women, 34 Body Image 38, 41 (2020).

73 Id. at 41.

74 Id. at 41-42.

75 Ciera Kirkpatrick & Sungkyoung Lee, Effects of Instagram Body Portrayals on Attention, State Body Dissatisfaction, and Appearance Management Behavioral Intention, Health Commc’n 1, 5–6 (2021).

76 State body dissatisfaction refers to a state of being or how someone feels in a particular moment, as opposed to trait body dissatisfaction, which is a more consistent and stable component of one’s personality. Thomas F. Cash et al., Beyond Body Image as a Trait: The Development and Validation of the Body Image States Scale, 10 Eating Disorders 103, 103–04 (2002).

77 Kirkpatrick & Lee, supra note 75.

78 Id.

79 The study used previously validated instruments including the Patient Health Questionnaire-8 (PHQ-8) to measure depressive symptoms and the General Anxiety Disorder Scale-7 (GAD-7) to measure anxiety symptoms.

80 Jeffrey Lambert et al., Taking a One-Week Break from Social Media Improves Well-Being, Depression, and Anxiety: A Randomized Controlled Trial, 25 Cyberpsychology, Behav. & Soc. Networking 287, 290–91 (2022).

81 Vandenbosch, supra note 60, at 186.

82 The researchers assessed frequency of general comparisons, social comparisons, and appearance comparisons using nine survey questions with response options along a five-point Likert scale (e.g., answering “1 = strongly disagree” to “5 = strongly agree” in response to statements such as “I often compare myself with others on social media” and “I often think that others are having a better life than me”).

83 Hannah K. Jarman et al., Direct and Indirect Relationships Between Social Media Use and Body Satisfaction: A Prospective Study Among Adolescent Boys and Girls, New Media & Soc’y 1, 11–12 (2021).

84 Sarah M. Coyne et al., Suicide Risk in Emerging Adulthood: Associations with Screen Time over 10 Years, 50 J. Youth & Adolescence 2324, 2326-27 (2021).

85 Id. at 2334.

86 Id. at 2328.

87 The study team used the General Health Questionnaire (GHQ12) to measure mental health. It is a twelve-item scale where a score of three or higher signifies psychological distress. Russell Viner et al., Roles of Cyberbullying, Sleep, and Physical Activity in Mediating the Effects of Social Media Use on Mental Health and Wellbeing Among Young People in England: A Secondary Analysis of Longitudinal Data, 3 Lancet Child & Adolescent Health 685, 685 (2019).

88 Id. at 691.

89 Anne J. Maheux et al., Longitudinal Associations Between Appearance-related Social Media Consciousness and Adolescents’ Depressive Symptoms, 94 J. Adolescence 264, 266 (2022).

90 Federica Pedalino & Anne-Linda Camerini, Instagram Use and Body Dissatisfaction: The Mediating Role of Upward Social Comparison with Peers and Influencers Among Young Females, 19 Int’l J. Env’t Rsch. Pub. Health 1, 7 (2022).

91 Id. at 7.

92 J. Kevin Thompson & Eric Stice, Thin-Ideal Internalization: Mounting Evidence for a New Risk Factor for Body-Image Disturbance and Eating Pathology, 10 Current Directions in Psych. Sci. 181, 181 (2001).

93 Veldhuis, supra note 71 at 173, 176–79; Gemma López-Guimerà et al., Influence of Mass Media on Body Image and Eating Disordered Attitudes and Behaviors in Females. A Review of Effects and Processes, 13 Media Psych. 387, 401–02 (2010).

94 Veldhuis, supra note 71. In the Veldhuis 2014 study, shame was assessed with the questions: “(1) I feel ashamed of myself when I haven’t made an effort to look my best; (2) I feel like I must be a bad person when I don’t look as good as I could; (3) I would be ashamed for people to know what I really weigh; (4) when I’m not exercising enough, I question whether I am a good person; and (5) when I’m not the size I think I should be, I feel ashamed.” Sara M. Lindberg et al., A Measure of Objectified Body Consciousness for Preadolescent and Adolescent Youth, 30 Psych. Women Q. 65, 69 (2006).

95 Ciara Mahon & David Hevey, Processing Body Image on Social Media: Gender Differences in Adolescent Boys’ and Girls’ Agency and Active Coping, 12 Frontiers Psych. eCollection 626763, 8 (2021); Marika Skowronski et al., Links Between Exposure to Sexualized Instagram Images and Body Image Concerns in Girls and Boys, 34 J. Media Psych. eCollection 55, 59 (2022); Illyssa Salomon & Christia Spears Brown, The Selfie Generation: Examining the Relationship Between Social Media Use and Early Adolescent Body Image, 39 J. Early Adolescence 539, 548–52 (2022).

96 Social Media and Youth Mental Health: The US Surgeon General’s Advisory, (May 2023) https://www.hhs.gov/sites/default/files/sg-youth-mental-health-social-media-advisory.pdf [hereinafter US Surgeon General’s Advisory].

97 Leah H. Somerville, The Teenage Brain: Sensitivity to Social Evaluation, 22 Cᴜʀʀᴇɴᴛ Dɪʀᴇᴄᴛɪᴏɴs Psʏᴄʜ. Sᴄɪ. 121, 122 (2013).

98 US Surgeon General’s Advisory, supra note 96.

99 Id.

100 B.J. Casey et al., The Adolescent Brain, 1124 Aɴɴᴀʟs N.Y. Aᴄᴀᴅ. Sᴄɪs. 111, 116 (2008).

101 Id. at 117; Somerville, supra note 97, at 122.

102 Paige Ethridge et al., Neural Responses to Social and Monetary Reward in Early Adolescence and Emerging Adulthood, 54 Psʏᴄʜᴏᴘʜʏsɪᴏʟᴏɢʏ 1786, 1792–93 (2017).

103 Lauren E. Sherman et al., The Power of the Like in Adolescence: Effects of Peer Influence on Neural and Behavioral Responses to Social Media, 27 Psʏᴄʜ. Sᴄɪ. 1027, 1031 (2016); see also US Surgeon General’s Advisory, supra note 96.

104 Mara van der Meulen et al., Brain Activation Upon Ideal-body Media Exposure and Peer Feedback in Late Adolescent Girls, 17 Cᴏɢɴɪᴛɪᴠᴇ, Aғғᴇᴄᴛɪᴠᴇ, & Bᴇʜᴀᴠ. Nᴇᴜʀᴏsᴄɪ. 712, 720 (2017).

105 See Brooke Auxier & Monica Anderson, Social Media Use in 2021, Pew Research Center (Apr. 7, 2021), https://www.pewresearch.org/internet/2021/04/07/social-media-use-in-2021/; Orben, supra note 60, at 411.

106 Yolanda Reid Chassiakos et al., Children and Adolescents and Digital Media, 138(5) Am. Acad. Pediatrics (Nov. 1, 2016); Marisa Meyer et al., Advertising in Young Children’s Apps: A Content Analysis, 40(1) J. Dev.’l & Behav. Pediatrics 32, 38 (2019).

107 Caitlin R. Costello et al., Adolescents and Social Media: Privacy, Brain Development, and the Law, 44(3) J. Am. Acad. Psychiatry & L. 313, 313 (2016).

108 Auxier & Anderson, supra note 105; Victoria Rideout et al., The Common Sense Census: Media Use by Tweens and Teens, 2021, Common Sense Media (Mar. 9, 2022), https://www.commonsensemedia.org/research/the-common-sense-census-media-use-by-tweens-and-teens-2021 [https://perma.cc/588T-XFCE].

109 Amanda Raffoul et al., Social Media Platforms Generate Billions of Dollars in Revenue from U.S. Youth: Findings from a Simulated Revenue Model, 18 PLoS ONE 12 (2023), https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0295337 [https://perma.cc/VLL3-DLZK].

110 See, e.g., Jenkins v. Georgia, 418 U.S. 153 (1974) (discussing the high standards needed for sexually explicit content to reach the level of obscenity unprotected by the First Amendment).

111 Victoria L. Killion, The First Amendment: Categories of Speech, Congressional Rsch. Serv., IF11072, https://crsreports.congress.gov/product/pdf/IF/IF11072.

112 An example of a constitutional content-neutral law would be a law disallowing anyone to use a bullhorn to say anything in a public square after 8 p.m. because it could disrupt sleep and the quiet evening solitude for those nearby. The law does not restrict speech based on its content, rather it restricts any speech based on the disruption it could cause to those trying to enjoy quiet and peaceful late evening hours. In contrast, a law that would restrict someone from using a bullhorn in the town square to announce the strengths of a political candidate running for town council but allow someone to use a bullhorn to announce an upcoming performance of a play at a local theater would be a content-based law and would be unconstitutional. Id.

113 Some states have attempted blanket crackdowns on social media platforms and have faced immediate pushback. For instance, Montana Governor Greg Gianforte signed a bill on May 17, 2023, prohibiting individuals from using or downloading TikTok in the state of Montana. Any entity, defined as an app store or TikTok, faces a $10,000 penalty each time a user downloads, accesses, or is able to access TikTok. An additional $10,000 penalty is added for each day the violation continues. The law does not impose fines on individual TikTok users. The ban will be void if TikTok is acquired by a company that is not incorporated in a country “designated as foreign adversary.” See S.B. 419, Gen. Sess. (Mont. 2023). It is unclear how Montana would enforce the law. While many members of Congress expressed their wariness regarding TikTok and the mental health of teen users at the March 2023 Congressional hearings, Montana’s law is directed at privacy and security concerns involving the Chinese Communist Party. Id. This ban is currently the most extreme prohibition of the app in the United States and faced immediate legal challenges regarding its feasibility and constitutionality. TikTok filed suit just days after the Montana law was adopted, alleging that the ban is “extreme” and violates the First Amendment, as well as other federal laws. The social media company claims that concerns that the Chinese government could access the data of U.S. TikTok users — which are a key motivation behind the ban — are “unfounded.” See Clare Duffy, TikTok Sues Montana Over New Law Banning the App, CNN Business (May 23, 2023, 5:31 AM), https://www.cnn.com/2023/05/22/tech/tiktok-montana-lawsuit/index.html [https://perma.cc/VLB9-W2D8]. Further, NetChoice, a tech trade group that includes Google, Meta, and TikTok, sued the state of Arkansas in June 2023, claiming the state’s newly passed Social Media Safety Act is unconstitutional. NetChoice asserted that the law treads on First Amendment free speech rights by making users hand over private data to access social networks. It also asserts the Act hurts privacy and safety by making internet companies rely on a third-party service to store and track kids’ data. See John Fingas, Tech Firms Sue Arkansas Over Social Media Age Verification Law, Engadget (June 30, 2023), https://www.engadget.com/tech-firms-sue-arkansas-over-social-media-age-verification-law-180002953.html [https://perma.cc/5DE5-AUWH].

114 Alexander S. Gillis, Definition: Algorithm, Tᴇᴄʜ Tᴀʀɢᴇᴛ (May 2022), https://www.techtarget.com/whatis/definition/algorithm [https://perma.cc/R6YA-4CLE] (“An algorithm is a procedure used for solving a problem or performing a computation. Algorithms act as an exact list of instructions that conduct specified actions step-by-step in either hardware- or software-based routines. Algorithms are widely used throughout all areas of IT. They are the building blocks for programming, and they allow things like computers, smartphones, and websites to function and make decisions. In mathematics and computer science, an algorithm usually refers to a small procedure that solves a recurrent problem. Algorithms are also used as specifications for performing data processing and play a major role in automated systems. An algorithm could be used for sorting sets of numbers or for more complicated tasks, like recommending user content on social media. Algorithms typically start with initial input and instructions that describe a specific computation. When the computation is executed, the process produces an output … There are several types of algorithms, all designed to perform different tasks, including a search engine algorithm, encryption algorithm, greedy algorithm, recursive algorithm, backtracking algorithm, divide-and-conquer algorithm, dynamic programming algorithm, brute-force algorithm, sorting algorithm, hashing algorithm, randomized algorithm, etc.”).

115 Veronica Balbuzanova, First Amendment Considerations in the Federal Regulation of Social Media Networks’ Algorithmic Speech, Part I, Am. Bar Ass’n (Jan. 29, 2021), https://www.americanbar.org/groups/litigation/committees/privacy-data-security/articles/2021/first-amendment-social-media-algorithmic-speech-part-1/ [https://perma.cc/Q8W5-6B5L].

116 Id. See, e.g., Universal City Studios, Inc. v. Corley, 273 F.3d 429, 449 (2d Cir. 2001) (holding that “computer code, and computer programs constructed from code can merit First Amendment protection”); Johnson Controls v. Phoenix Control Sys., 886 F.2d 1173, 1175 (9th Cir. 1989) (holding that “[s]ource and object code, the literal components of a program, are consistently held protected by a copyright on the program… Whether the non-literal components of a program, including the structure, sequence and organization and user interface, are protected depends on whether on the particular facts of each case, the component in question qualifies as an expression of an idea, or an idea itself”); Green v. United States DOJ, 392 F. Supp. 3d 68 (D.D.C. 2019); Bernstein v. U.S. Dep’t of State, 922 F. Supp. 1426, 1436 (N.D. Cal. 1996) (holding that “copyright law also supports the ‘expressiveness’ of computer programs”).

117 Balbuzanova, supra note 115. See, e.g., e-ventures Worldwide LLC v. Google, Inc., 188 F. Supp. 3d 1265 (M.D. Fla. 2016); Zhang v. Baidu.Com, Inc., 10 F. Supp. 3d 433 (S.D.N.Y. 2014); Langdon v. Google, Inc., 474 F. Supp. 2d 622 (D. Del. 2007); Kinderstart.Com, LLC v. Google, Inc., No. CO6-2057KF(RS), 2007 U.S. Dist. LEXIS 22637 (N.D. Cal. Mar. 16, 2007); Search King, Inc. v. Google Tech., Inc., No. CIV-02-1457-M, 2003 U.S. Dist. LEXIS 27193 (W.D. Okla. May 27, 2003).

118 Balbuzanova, supra note 115.

119 Id.

120 Veronica Balbuzanova, First Amendment Considerations in the Federal Regulation of Social Media Networks’ Algorithmic Speech, Part II, Am. Bar Ass’n (Feb. 8, 2021), https://www.americanbar.org/groups/litigation/committees/privacy-data-security/articles/2021/first-amendment-social-media-algorithmic-speech-part-11/ [https://perma.cc/2PD7-PS8B].

121 Id.

122 Ashley Johnson & Daniel Castro, Overview of Section 230: What It Is, Why it Was Created, and What It Has Achieved, Info. Tech. & Innovation Found. (Feb. 22, 2021), https://itif.org/publications/2021/02/22/overview-section-230-what-it-why-it-was-created-and-what-it-has-achieved/ [https://perma.cc/N3X4-Z3TH].

123 Id.; see also Doe v. MySpace, 528 F.3d 413 (5th Cir. 2008).

124 Michael D. Smith & Marshall Van Alstyne, It’s Time to Update Section 230, Harv. Bus. Rev. (Aug. 12, 2021), https://hbr.org/2021/08/its-time-to-update-section-230 [https://perma.cc/YX7E-K95N].

125 Johnson & Castro, supra note 122. Numerous bills have been proposed aiming to amend Section 230 of the CDA, including the Biased Algorithm Deterrence Act of 2019; Protecting Americans from Dangerous Algorithms Act; Justice Against Malicious Algorithms Act of 2021; Federal Big Tech Tort Act; Safeguarding Against Fraud, Exploitation, Threats, Extremism, and Consumer Harms Act (SAFE TECH Act); Stop Shielding Culpable Platforms Act; and Social Media NUDGE Act. H.R. 492, 116th Cong. (2019-2020); H.R. 2154, 117th Cong. (2021-2022); H.R. 5596, 117th Cong. (2021-2022); H.R. 3421, 117th Cong. (2021-2022); H.R. 2000, 117th Cong. (2021-2022); S. 3608, 117th Cong. (2021-2022).

126 Id.

127 See Robert Barnes & Cat Zakrzewski, Supreme Court Rules for Google, Twitter on Terror-Related Content, Wash. Post (May 18, 2023, 11:04 AM), https://www.washingtonpost.com/politics/2023/05/18/gonzalez-v-google-twitter-section-230-supreme-court/ [https://perma.cc/53Q3-XMF4] (“Tech companies and their surrogates celebrated the ruling, which followed extensive lobbying and advocacy campaigns to defend Section 230 in Washington. Changes to the law, they said, could open a floodgate of litigation that would quash innovation and have wide-ranging effects on the technology that underlies almost every interaction people have online, from innocuous song suggestions on Spotify to prompts to watch videos about conspiracy theories on YouTube.”).

128 See Gonzalez v. Google LLC, 2 F.4th 871, 883 (9th Cir. 2021).

129 “The fact that these algorithms matched some ISIS content with some users thus does not convert defendants’ passive assistance into active abetting.” Twitter, Inc. v. Taamneh, 143 S. Ct. 1206, 1209 (2023).

130 See id. at 1226 (explaining that recommendation algorithms do not go “beyond passive aid and constitute active, substantial assistance”). Plaintiffs also alleged that the Defendants’ knowledge of ISIS content and failure to screen the publication of such content rose to the level of “aiding and abetting” ISIS; however, the Court disagreed. See id. at 1222–24.

131 See Gonzalez v. Google LLC, 143 S. Ct. 1191 (2023).

132 See Patrick Garrity et al., American Student Nohemi Gonzalez Identified As Victim in Paris Massacre, NBC News (Nov. 14, 2015, 1:50 PM), https://www.nbcnews.com/storyline/paris-terror-attacks/american-student-nohemi-gonzalez-idd-victim-paris-massacre-n463566 [https://perma.cc/525N-NJA7].

133 See Gonzalez, 2 F.4th at 882.

134 See id. at 880. Plaintiffs further alleged that defendants should be directly liable for committing acts of international terrorism, and for conspiring with, and aiding and abetting ISIS’s acts of international terrorism because the platform’s algorithm directed ISIS videos to users and recommended ISIS content to users. Id. at 881. Ultimately, the Ninth Circuit held in favor of the defendants because the plaintiffs “did not plausibly allege” that Google, Twitter, and Facebook’s actions qualified as an act of international terrorism and conspiracy or aiding and abetting. Id. at 913.

135 See In re Apple Inc. Litig., 2022 U.S. Dist. LEXIS 159613, at *1, *20 (N.D. Cal. 2022).

136 Id. at 72–73.

137 See id. at 74–76.

138 Id. at 72.

139 Id. at 77.

140 See id. at 76–78.

141 See id.

142 See Gonzalez, 2 F.4th at 909–913.

143 Id. at 907.

144 Twitter, 143 S. Ct. at 1209.

145 Id.

146 The court held the Gonzalez plaintiffs’ revenue-sharing allegations were not directed to the publication of third-party information. The revenue sharing did not depend on the particular content ISIS placed on YouTube; the theory was solely directed to Google’s unlawful payments of money to ISIS. Therefore, the alleged violation could be remedied without changing any of the content posted by YouTube’s users. The allegations of revenue sharing do not seek to hold Google liable for any content provided by a third party. See Gonzalez, 2 F.4th at 913. The Supreme Court did not reject this reasoning, suggesting that a revenue-sharing liability claim may be viable in the future. See Twitter, 143 S. Ct. at 1209–1210.

147 Additionally, the plaintiffs in the Gonzalez case “did not seek review of the Ninth Circuit’s holdings regarding their revenue-sharing claims,” so the Supreme Court did not address this issue in its opinion of Gonzalez. See Gonzalez, 143 S. Ct. at 1192.

148 With the exception of demographic and behavioral advertising targeted at minors under the age of thirteen, which violates COPPA, the broad use of algorithms to feed content to minors, including content that can result in harm, is not illegal.

149 This article does not explore the legal remedies that provide a less examined solution to preventing harm inflicted on minors by social media platforms, but they are mentioned here. One remedy is taxation. The Maryland Digital Advertising Gross Revenues Tax is the nation’s first tax on the revenue from digital advertisements that are sold by social media platforms displayed inside the state of Maryland. The tax went into effect on January 1, 2022, and was projected to generate $250 million in its first year of implementation. This tax has been challenged on constitutional grounds in federal and state court. Similar tax legislation has been introduced in five other states. See David McCabe, Maryland Approves Country’s First Tax on Big Tech’s Ad Revenue, New York Times (Feb. 12, 2021), https://www.nytimes.com/2021/02/12/technology/maryland-digital-ads-tax.html [https://perma.cc/THE8-WKVA].

150 Another potential legal remedy is the withholding of government contracts from platforms that fail to uphold and implement standards to keep their platforms safe for child and teen users. The San Francisco Green Building Code is an example. Under the Code, developers of buildings that do not comply with the standards ensuring that buildings are healthy and sustainable places to live and work will not be afforded government contracts. Ord. 3-20, File No. 190974 (2020). This same practice could be adopted to withhold government contracts from social media companies that fail to provide a safe platform to teens and children.

151 15 U.S.C. § 45(a)(1).

152 15 U.S.C. § 45(n).

153 Chairman James C. Miller III, FTC Policy Statement on Deception, Fed. Trade Comm’n 1, 1–2 (Oct. 14, 1983), https://www.ftc.gov/system/files/documents/public_statements/410531/831014deceptionstmt.pdf [hereinafter FTC Policy Statement on Deception].

154 What the FTC Does, Fed. Trade Comm’n, https://www.ftc.gov/news-events/media-resources/what-ftc-does [https://perma.cc/4JNA-B6ES] (last visited Apr. 17, 2023).

155 Federal Trade Commission v. LeadClick Media, LLC, 838 F.3d 158, 168 (2d Cir. 2016) (quoting F.T.C. v. Verity Intern., Ltd., 443 F.3d 48, 63 (2d Cir. 2006)).

156 See generally Ryan Strasser et al., State AGs Lead the Way in False Advertising Enforcement, Troutman Pepper Law Firm (Feb. 2, 2022), https://www.troutman.com/insights/state-ags-lead-the-way-in-false-advertising-enforcement.html [https://perma.cc/BKZ6-5R3T].

157 Id.

158 Id. Today, each of the fifty states and the District of Columbia has some form of a consumer protection law, often referred to as the state’s “Unfair and Deceptive Acts and Practices Act” (UDAP) or “Consumer Protection Act” (CPA). Generally, these state consumer protection laws prohibit deceptive practices in consumer transactions, and although the substance of the statutes varies widely from state to state, many also prohibit unfair or unconscionable practices. State UDAPs and CPAs are primarily civil statutes, but some also create criminal penalties for severe violations. Id.

159 Matthew Lewis, The Role of the Attorney General in Reforming Social Media for Children, N.Y. J. of Legis. & Pub. Pol’y (Oct. 10, 2022), https://nyujlpp.org/quorum/lewis-how-state-attorneys-general-can/ [https://perma.cc/MTG8-MM8N]. The multi-state investigation includes attorneys general offices in Massachusetts, California, Florida, Kentucky, Nebraska, New Jersey, Tennessee, and Vermont. Id. Attorneys general in forty-two states filed legal actions against Meta in October 2023 alleging it violated consumer protection laws by unfairly ensnaring children and deceiving users about the safety of its platforms. See Cecilia Kang & Natasha Singer, Meta Accused by States of Using Features to Lure Children to Instagram and Facebook, New York Times (Oct. 23, 2023), https://www.nytimes.com/2023/10/24/technology/states-lawsuit-children-instagram-facebook.html [https://perma.cc/3GH9-YXQQ].

160 Lewis, supra note 159. Congress has introduced legislation aimed at curbing alleged harms inflicted on youth by social media platforms, but it has met considerable opposition. Legal analysts suggest it will be more effective for state attorneys general to pursue claims against social media companies to alleviate harms. Id.

161 Id. “State attorneys general are equipped in three ways to serve the public interest and address the harms of social media against children: (A) investigation and litigation against social media platforms; (B) advocating for policy reform in their state legislatures, Congress, and directly to platforms; and (C) educating the public. Attorneys general have broad power to subpoena documents and compel testimony by social media company executives to force disclosure on all information related to the operation of algorithms and their effect on adolescent users from social media platforms.” Id.

162 Members of the Strategic Training Initiative for the Prevention of Eating Disorders (STRIPED), who are the authors of this article, met with attorneys general offices in more than 12 states in 2022, including several involved in the multi-state investigation, to discuss the harmful effects of social media algorithms on youth, identify the economic incentives that drive social media companies to use them, and look at possible legal strategies to regulate social media platforms’ use of algorithms.

163 15 U.S.C. § 45(n).

164 FTC Policy Statement on Deception, supra note 153. A material misrepresentation or practice is defined as a misrepresentation or practice “which is likely to affect a consumer’s choice of or conduct regarding a product.” Id.

165 Allison Zakon, Optimized for Addiction: Extending Product Liability Concepts to Defectively Designed Social Media Algorithms and Overcoming the Communications Decency Act, 2020 Wis. L. Rev. 1107, 1118–19 (2020).

166 Id. at 1119–21.

167 Lemmon v. Snap, 995 F.3d 1085, 1087 (9th Cir. 2021).

168 Id. at 1093.

169 Id. at 1087.

170 Jason Ysais, Meta Platforms, Inc. and Snap, Inc. Face Wrongful Death Lawsuit for Causing the Suicide of 11-year-old Selena Rodriguez, Social Media Victims Law Center (Jan. 20, 2021), https://socialmediavictims.org/press-releases/rodriguez-vs-meta-platforms-snap-lawsuit [https://perma.cc/78TN-8QX9].

171 Id.

172 Zakon, supra note 165, at 1118–19.

173 The U.S. Surgeon General’s Advisory in 2023 suggested using a multifaceted approach, including a products liability strategy, to curb the harms caused to young people by social media. “The U.S. has often adopted a safety-first approach to mitigate the risk of harm to consumers. According to this principle, a basic threshold for safety must be met, and until safety is demonstrated with rigorous evidence and independent evaluation, protections are put in place to minimize the risk of harm from products, services, or goods. For example, the Consumer Product Safety Commission requires toy manufacturers to undergo third-party testing and be certified through a Children’s Product Certificate as compliant with the federal toy safety standard for toys intended for use by children…. Given the mounting evidence for the risk of harm to some children and adolescents from social media use, a safety-first approach should be applied in the context of social media products.” US Surgeon General’s Advisory, supra note 73.

175 Id.

176 The U.S. Surgeon General recommends that social media companies “[c]onduct and facilitate transparent and independent assessments of the impact of social media products and services on children and adolescents [and] assume responsibility for the impact of products on different subgroups and ages of children and adolescents, regardless of the intent behind them.” The Surgeon General also recommends that results of independent assessments “be transparent” and that social media companies share assessment findings and underlying data with independent researchers and the public. The Surgeon General urges that platform design and algorithms should prioritize health and safety as the first principle, seek to maximize the potential benefits, and avoid design features that attempt to maximize time, attention, and engagement. Issued periodically, a US Surgeon General’s Advisory is a public statement that calls the American people’s attention to an urgent public health issue and provides recommendations for how it should be addressed. Advisories are reserved for significant public health challenges that require the nation’s immediate awareness and action. US Surgeon General’s Advisory, supra note 73.

177 Complaint at 23, Seattle School District No. 1 v. Meta Platforms, Inc., No. 2:23-cv-00032 (W.D. Wash.).

178 Id. at 1.

179 Id. at 87.

180 Id. at 73.

181 Seattle School District No. 1, Pub. Sch. Rev., https://www.publicschoolreview.com/washington/seattle-school-district-no-1/5307710-school-district [https://perma.cc/94NQ-5LRR] (last accessed Apr. 19, 2023).

182 Complaint at 74, Seattle School District No. 1 v. Meta Platforms, Inc., No. 2:23-cv-00032 (W.D. Wash.).

183 Isabel Lochman, Dexter schools sue social media giants, citing child mental health crisis, Bridge Michigan (Apr. 14, 2023), https://www.bridgemi.com/talent-education/dexter-schools-sue-social-media-giants-citing-child-mental-health-crisis [https://perma.cc/Q7E4-TZWF].

184 Id.

185 Complaint at 85, Seattle School District No. 1 v. Meta Platforms, Inc., No. 2:23-cv-00032 (W.D. Wash.).

186 RCW 7.48.120.

187 RCW 7.48.130.

188 Public nuisance, Britannica, https://www.britannica.com/topic/nuisance [https://perma.cc/6Y8X-5XXV] (last accessed Apr. 19, 2023).

189 Gene Johnson, Schools sue social media companies for targeting children, KARE 11 (Jan. 11, 2023, 4:46 AM), https://www.kare11.com/article/news/nation-world/schools-sue-social-media-companies/507-3fcdc58b-deaa-4f84-8594-57d1cda667b0 [https://perma.cc/VM8E-WV4S].

190 Tort Law: The Rules of Public Nuisance, Law Shelf, https://lawshelf.com/shortvideoscontentview/tort-law-the-rules-of-public-nuisance [https://perma.cc/EVD3-NJD9] (last accessed Mar. 1, 2023).

192 Id.

193 Nuisance, JRank, https://law.jrank.org/pages/8871/Nuisance-Remedies.html [https://perma.cc/4LXR-C6P3] (last accessed Apr. 19, 2023).

194 Id.

196 Ty Roush, Juul To Pay $1.2 Billion To Settle Youth-Vaping Lawsuits, Forbes (Dec. 9, 2022, 1:28 PM), https://www.forbes.com/sites/tylerroush/2022/12/09/juul-to-pay-12-billion-to-settle-youth-vaping-lawsuits/?sh=6d8cad09345c [https://perma.cc/W529-JUYR].

197 Angelica LaVito, Lawmaker accuses Juul of illegally advertising vaping as a way to quit smoking, CNBC (Sept. 5, 2019, 3:23 PM), https://www.cnbc.com/2019/09/05/juul-accused-of-illegally-advertising-vaping-as-a-way-to-quit-smoking.html [https://perma.cc/8ZGH-57PM].

198 Roush, supra note 196.

199 Reuters, Juul agrees to pay $1.2 bln in youth-vaping settlement - Bloomberg News, Reuters (Dec. 9, 2022, 11:47 AM), https://www.reuters.com/legal/juul-agrees-pay-12-bln-youth-vaping-settlement-bloomberg-news-2022-12-09/ [https://perma.cc/WCQ2-Q9NM].

200 Reuters, Juul to pay $462 million to California, New York, and other states over claims it marketed vapes to minors, NBC News (Apr. 12, 2023, 1:09 PM), https://www.nbcnews.com/health/health-news/juul-to-pay-462-million-claims-marketed-vapes-minors-rcna79375 [https://perma.cc/FY9Y-XK4X].

201 See Ananya Bhattacharya, Minnesota is trying to prove Juul got teens addicted on vaping in a first-of-its-kind trial, Quartz (Mar. 24, 2023), https://qz.com/minnesota-jull-altria-trial-public-nuisance-vaping-1850260880; Juul’s Trial in Minnesota Ends With a Settlement, CS News (Apr. 18, 2023), https://www.csnews.com/juuls-trial-minnesota-ends-settlement [https://perma.cc/8847-EKZQ].

202 Julian Shen-Berro, As Seattle schools sue social media companies, legal experts split on potential impact, Chalkbeat (Jan. 17, 2023, 6:00 AM), https://www.chalkbeat.org/2023/1/17/23554378/seattle-schools-lawsuit-social-media-meta-instagram-tiktok-youtube-google-mental-health [https://perma.cc/3NUE-JTNG].

203 Samantha Gross, The California Age-Appropriate Design Code Act Places New Obligations on Companies Collecting Information About Children Online, JD Supra (Feb. 24, 2023), https://www.jdsupra.com/legalnews/the-california-age-appropriate-design-1510066/ [https://perma.cc/35FZ-U3VQ].

204 See generally, Meg Crowley, California’s New Age-Appropriate Design Code Act: Violation of Free Speech?, Hastings Commcns & Ent. J. (Jan. 26, 2023), https://www.hastingscomment.org/online-content/californias-new-age-appropriate-design-code-violation-of-free-speechnbsp [https://perma.cc/BA7C-HXCK].

205 The California Age Appropriate Design Code Act, AB 2273, State Assemb. 2021-2022 Sess. (Ca. 2022) (1).

206 Id. at §1798.99.31(a)(5).

207 Id. at §1798.99.31(a)(6).

208 Id. at §1798.99.31(b)(1).

209 Id. at §1798.99.31(b)(5).

210 Id. at §1798.99.31(b)(7).

211 Id. at §1798.99.32(a). The California Data Protection Working Group will be assembled by April 1, 2023, and will sanction regulations under the Age Appropriate Design Code by April 1, 2024. Members of the working group will be appointed by the California Privacy Protection Agency (CPPA). The working group will be “Californians with expertise in the areas of privacy, physical health, mental health, and well-being, technology, and children’s rights.” See, e.g., Chloe Altieri & Kewa Jiang, California Age-Appropriate Design Code Aims to Address Growing Concern About Children’s Online Privacy and Safety, Future of Privacy Forum (June 28, 2022), https://fpf.org/blog/california-age-appropriate-design-code-aims-to-address-growing-concern-about-childrens-online-privacy-and-safety/ [https://perma.cc/PA2F-N9KH].

212 Megan Brown et al., California Age-Appropriate Design Code Act to Impose Significant New Requirements on Businesses Providing Online Services, Products, or Features, JD Supra (last updated Sept. 19, 2022), https://www.jdsupra.com/legalnews/california-age-appropriate-design-code-8105166/ [https://perma.cc/MS6B-VS8R].

213 Id. The California Code has been challenged by tech companies in court so implementation of the law may be forestalled.

214 Katie Terrell Hanna, COPPA (Children’s Online Privacy Protection Act), Tech Target (Mar. 2022), https://www.techtarget.com/searchcio/definition/COPPA-Childrens-Online-Privacy-Protection-Act [https://perma.cc/ZJ3B-AP5B].

215 Musadiq Bidar, California lawmakers push ahead with sweeping children’s online privacy bill, CBS News (May 12, 2022, 1:50 PM), https://www.cbsnews.com/news/online-privacy-california-age-appropriate-design-code-teens-children/ [https://perma.cc/L828-XQA2].

216 Natasha Lomas, UK now expects compliance with children’s privacy design code, TechCrunch (Sept. 1, 2021, 7:01 PM), https://techcrunch.com/2021/09/01/uk-now-expects-compliance-with-its-child-privacy-design-code/ [https://perma.cc/8R5N-UKLV].

217 Id.; Alex Hern, Social media giants increase global child safety after UK regulations introduced, The Guardian (Sept. 5, 2021, 10:14 AM), https://www.theguardian.com/media/2021/sep/05/social-media-giants-increase-global-child-safety-after-uk-regulations-introduced [https://perma.cc/7KVJ-FC84]; Google Announcement Shows Impact of Children’s Code, 5Rights Found. (Aug. 10, 2021), https://5rightsfoundation.com/in-action/google-announcement-shows-impact-of-childrens-code.html [https://perma.cc/XX28-UZPD].

218 The California Age Appropriate Design Code Act §1798.99.30(a)(1)(A) (2022), https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=202120220AB2273 [https://perma.cc/2U4Y-W472].

219 Id. at §1798.99.30(b)(2).

220 Id. at §1798.99.30(b)(4).

221 Id. at §1798.99.31(c). The Design Code appoints the California Attorney General’s Office as the main enforcer of the state law. Similarly, the Attorney General may bring actions against businesses for unfair or deceptive practices, mirroring the claims the FTC can bring under Section 5 of the FTC Act. See Strasser et al., supra note 156.

222 Ca. AB 2273 §1798.99.30(b)(2).

223 Balbuzanova, supra note 110; Johnson & Castro, supra note 122.

224 What is the Age-Appropriate Design Code - and How is it Changing the Internet?, Parent Zone (July 27, 2022), https://parentzone.org.uk/article/what-is-the-age-appropriate-design-code [https://perma.cc/7W82-3U5U].

225 Paul Harper & Catherine Micallef, How Old Do You Have to be to Have Facebook and Instagram Account? Social Media Age Restrictions Explained, The U.S. Sun (June 8, 2022, 12:33 PM), https://www.the-sun.com/tech/289567/age-restrictions-facebook-snapchat-twitter-instagram/ [https://perma.cc/PY3Z-LE4M].

226 Id.

227 Ariel Fox Johnson, 13 Going on 30: An Exploration of Expanding COPPA’s Privacy Protections to Everyone, 44 Seton Hall Legis. J. 419, 448–49 (2020). Congress is currently considering legislation that would amend COPPA to strengthen protections related to the online collection, use, and disclosure of personal information of minors under age 17. Co-authored by U.S. Senators Edward Markey (D-MA) and Bill Cassidy (R-LA), COPPA 2.0 would: (1) expand protections to teens age 13–16 by requiring their opt-in consent before data collection; (2) ban targeted advertising to all covered minors; (3) close a loophole in COPPA that allows sites and apps to turn a blind eye to young people using their services and evade compliance; (4) create an “eraser button” for parents and kids requiring companies to delete personal information of minors; (5) establish a “Digital Marketing Bill of Rights” that minimizes the amount of data collected and used on minors; and (6) enhance enforcement by establishing a Youth Privacy and Marketing Division at the Federal Trade Commission. Children and Teens Online Privacy Protection Act: Legislation to Strengthen Privacy Protections for Minors, Common Sense, https://www.commonsensemedia.org/sites/default/files/featured-content/files/coppa-2.0-one-pager-2023.pdf (last accessed on Aug. 2, 2023).

228 Introducing New Ways to Verify Age on Instagram, Instagram (June 23, 2022), https://about.instagram.com/blog/announcements/new-ways-to-verify-age-on-instagram [https://perma.cc/YHQ8-P63M].

229 Id.

230 KOSA was previously introduced by Senators Blumenthal and Blackburn in February 2022. See Blackburn, Blumenthal Introduce Bipartisan Kids Online Safety Act, Marsha Blackburn (May 2, 2023), https://www.blackburn.senate.gov/2023/5/blackburn-blumenthal-introduce-bipartisan-kids-online-safety-act [https://perma.cc/U6AJ-CKVN]. Despite a unanimous 28–0 vote by the Commerce Committee, the bill failed to continue in the legislative process. See id.

231 The latest version of KOSA has thirty-nine bipartisan co-sponsors and has endorsements from Common Sense Media, American Psychological Association, American Academy of Pediatrics, American Compass, Eating Disorders Coalition, Fairplay, Mental Health America, and Digital Progress Institute. See id.

232 S.B. 1409, Gen. Sess. (2023-2024).

233 See id. (requiring platforms to allow minor users the ability to access safeguards to “limit features that increase, sustain, or extend use of the covered platform by the minor, such as automatic playing of media, rewards for time spent on the platform, notifications, and other features that result in compulsive usage of the covered platform by the minor”).

234 See id.

235 See id.

236 See id.

237 LGBTQ advocates, who viewed KOSA’s language as too restrictive, voiced concern that such limitations would ultimately harm marginalized young people’s ability to learn about important information that they otherwise could not gain access to. See Lauren Feiner, Kids Online Safety Act may harm minors, civil society groups warn lawmakers, CNBC News (Nov. 28, 2022, 12:01 AM), https://www.cnbc.com/2023/05/02/updated-kids-online-safety-act-aims-to-fix-unintended-consequences.html [https://perma.cc/AZP4-XQJF].

238 The earlier version of the bill did not include these safeguards.

239 The ACLU, which was opposed to the earlier version of the bill, expressed its continued opposition to KOSA 2.0, stating that “[KOSA] would ironically expose the very children it seeks to protect to increased harm and increased surveillance.” Lauren Feiner, Lawmakers update Kids Online Safety Act to address potential harms, but fail to appease some activists, industry groups, CNBC News (May 2, 2023, 1:12 PM), https://www.cnbc.com/2023/05/02/updated-kids-online-safety-act-aims-to-fix-unintended-consequences.html [https://perma.cc/LDY7-MSPV] (quoting ACLU Senior Policy Counsel Cody Venzke). Additionally, NetChoice, a lobbying group for multinational technology companies including Google, Meta, TikTok and Amazon, has continued to express concern regarding “how this bill would work in practice …” as it “still requires an age verification mechanism and data collection on Americans of all ages.” See id. NetChoice has also sued California challenging its Age-Appropriate Design Code Act.

240 When this article was published in 2023, KOSA and COPPA 2.0 had been approved by the U.S. Senate Committee on Commerce, Science, and Transportation. The bills could be moved to a Senate floor vote later in 2023.

241 Larissa Sapone, Moving Fast and Breaking Things: An Analysis of Social Media’s Revolutionary Effects on Culture and its Impending Regulation, 59 Duq. L. Rev. 362, 367–69 (2021).

242 Addiction, Social Media Victims Law Center, https://socialmediavictims.org/social-media-addiction/ [https://perma.cc/A6EF-SWUS] (last accessed July 27, 2022).

243 Megan Cerullo, Mom sues Meta and Snap over her daughter’s suicide, CBS News (Jan. 21, 2022), https://www.cbsnews.com/news/meta-instagram-snap-mom-sues-after-daughter-suicide/ [https://perma.cc/BES8-3F2U]; Rodriguez Complaint, supra note 29.

244 The California Social Media Duty to Protect Children Act, AB 2408, State Assemb. 2021-2022 Sess. (Ca. 2022).

245 Id. (emphasis omitted).

246 Evan Symon, Bill to Punish Social Media Companies for Addictive Features for Minor Users Killed in Senate, California Globe (Aug. 12, 2022), https://californiaglobe.com/articles/bill-to-punish-social-media-companies-for-addictive-features-for-minor-users-killed-in-senate/ [https://perma.cc/7RCU-9PJB].

247 Id.

248 In March 2023, Utah passed two new laws to protect minors from perceived harms caused by social media. One law, entitled the Social Media Regulation Amendments, requires social media platforms to verify the ages of account holders and enforces a digital curfew, from 10:30 p.m. to 6:30 a.m., for teen users. Verification procedures will be determined by the Utah Division of Consumer Protection and may not be limited to government-issued identification cards. Parental consent is also required for teen users to have a social media account, and parents or guardians are granted full access to their teen’s account. See S.B. 152, Gen. Sess. (Utah 2023). Arkansas enacted a similar law, the Social Media Safety Act, in April 2023, which also requires age verification and parental consent. This law will be enforced by the Arkansas Attorney General’s Office and prohibits teen users from having a social media account without the express permission of a parent or guardian. To verify the ages of users, the Social Media Safety Act requires social media companies to use a third-party vendor to “perform reasonable age verification,” which includes checking a user’s government-issued identification card or other “commercially reasonable age verification method[s].” See S.B. 396, Gen. Sess. (Arkansas 2023). While well-intentioned, these laws face criticism for invading teens’ privacy and freedom of speech rights. Social media companies have yet to announce any plans to challenge these new laws, but it is anticipated the laws will face future legal battles.

249 See H.B. 311, Gen. Sess. (Utah 2023). In this context, a “young person” refers to a minor: anyone younger than eighteen years old.

250 Id.

251 Id.

252 Id. Similarly, a law passed recently in Arkansas, entitled the Social Media Safety Act, creates a private right of action for teen users to sue social media companies for any damages incurred by their access to social media platforms without the consent of their parent or guardian. The social media companies face a penalty of $2,500 per violation, in addition to other fees and damages ordered by a court. See S.B. 396, Gen. Sess. (Arkansas 2023).

253 See H.B. 311, Gen. Sess. (Utah 2023).

254 See id. (explaining that for users under the age of 16, “there shall be a rebuttable presumption that the harm actually occurred and that the harm was caused as a consequence of using or having an account”).

255 Id.

256 Id. The Utah law states that “[a] social media company shall not be subject to a civil penalty for violating this section if the social media company, as an affirmative defense, demonstrates that the social media company: (i) instituted and maintained a program of at least quarterly audits of the social media company’s practices, designs, and features to detect practices, designs, or features that have the potential to cause or contribute to the addiction of a minor user; and (ii) corrected, within 30 days of the completion of an audit described in Subsection (3)(b)(i), any practice, design, or feature discovered by the audit to present more than a de minimus risk of violating this section.”

257 See Bryan Scott, Utah faces new lawsuit over social media restrictions for minors, The Salt Lake Tribune (Dec. 18, 2023, 9:43 PM), https://www.sltrib.com/news/politics/2023/12/18/utah-faces-new-lawsuit-over-social/ [https://perma.cc/Q8AJ-TLZ9] (In the suit, NetChoice claims Utah’s regulations unconstitutionally restrict the ability of minors and adults to access content that otherwise would be legal).

258 NetChoice alleges that the California Age-Appropriate Design Code Act (CAADCA) infringes on users’ privacy rights and the First Amendment. It also argues the CAADCA violates the Commerce Clause and is preempted by COPPA and Section 230. See Mot. for Prelim. Inj. at 1-7, NetChoice v. Bonta, No. 5:22-cv-08861-BLF (N.D. Cal. 2022).

259 Edward O. Wilson, Debate at the Harvard Museum of Natural History, Cambridge, Mass., (Sept. 9, 2009) https://www.oxfordreference.com/display/10.1093/acref/9780191826719.001.0001/q-oro-ed4-00016553;jsessionid=0CDC082C53C019ACD4F203281506A378 [https://perma.cc/55PX-LCAV].

260 Office of the Surgeon General, Protecting Youth Mental Health: The U.S. Surgeon General’s Advisory 9 (2021).

261 Jean Twenge et al., Increases in Depressive Symptoms, Suicide-Related Outcomes, and Suicide Rates Among U.S. Adolescents After 2010 and Links to Increased New Media Screen Time, 6 Clinical Psych. Sci. 3, 8–9 (2018).

262 Emily Vogels et al., Teens, Social Media and Technology 2022, Pew Rsch. Center (Aug. 2022), https://www.pewresearch.org/internet/2022/08/10/teens-social-media-and-technology-2022/ [https://perma.cc/H7VK-TPXP].

263 The European Union has legally mandated algorithm risk audits under the Digital Services Act passed in July 2022. It will take effect no later than January 1, 2024. The DSA imposes obligations on very large online platforms, with users in the European Union, to manage systemic risks through various means, including independent audits. It requires platforms to conduct internal annual risk assessments and implement reasonable, proportionate, and effective mitigation measures. The independent audits the DSA requires will result in publicly disclosed reports. Regulation on a Single Market for Digital Services and amending Directive 2000/31/EC (Digital Servs. Act), Oct. 19, 2022, Eur. Parl. Doc. 2022/2065.

264 N.Y.C. Admin. Code § 20-871.

265 Id. 871(a)(1), (b)(2).

266 Id. 871(b)(1)-(2).

267 This method, known as the selection rate, describes one way the impact ratio for a particular demographic category can be measured. Alternatively, the impact ratio can be measured using a scoring rate, which applies when an automated employment decision tool scores applicants. The scoring rate is the rate at which individuals in a demographic category receive a score above the sample’s median score. Under this model, the bias audit measures the disparate impact that the use of algorithms has on a specific demographic category by comparing the scoring rate of applicants in that category to the scoring rate of the demographic category with the highest rate. City N.Y. Rules, tit. 6, § 5-300 (2023).
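The scoring-rate comparison described in this note can be sketched in a few lines of Python. This is an illustrative example only; the function name and the sample scores are hypothetical and are not drawn from the NYC rule or any actual audit.

```python
# Illustrative sketch of the scoring-rate impact ratio described in note 267.
# All names and data are hypothetical, not taken from the NYC rule itself.
from statistics import median

def impact_ratios(scores_by_group):
    """scores_by_group: dict mapping demographic category -> list of scores."""
    # Pool every score across categories to find the sample's median.
    all_scores = [s for scores in scores_by_group.values() for s in scores]
    cutoff = median(all_scores)
    # Scoring rate: share of each category scoring above the sample median.
    rates = {g: sum(s > cutoff for s in scores) / len(scores)
             for g, scores in scores_by_group.items()}
    # Impact ratio: each category's rate relative to the highest-rate category.
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}

ratios = impact_ratios({
    "group_a": [55, 70, 82, 90],  # hypothetical applicant scores
    "group_b": [40, 52, 61, 88],
})
# group_a scores above the pooled median (65.5) at a 0.75 rate, group_b at
# 0.25, so the impact ratios are 1.0 and roughly 0.33, respectively.
```

Under the rule’s logic, a low impact ratio for a category would flag a potential disparate impact warranting scrutiny in the published audit.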

268 Id.

269 Id. § 5-301.

270 The United States Attorney’s Office for the Southern District of New York, United States Attorney Resolves Groundbreaking Suit Against Meta Platforms, Inc., Formerly Known As Facebook, To Address Discriminatory Advertising For Housing, Dep’t of Just. (June 21, 2022), https://www.justice.gov/usao-sdny/pr/united-states-attorney-resolves-groundbreaking-suit-against-meta-platforms-inc-formerly [https://perma.cc/DG5N-569C].

271 Id.

272 Id. This lawsuit was based on an investigation and charge of discrimination by the Department of Housing and Urban Development, which found that all three aspects of Facebook’s ad delivery system delivered housing ads based on FHA-protected characteristics. The complaint for the case against Meta challenged three key aspects of Meta’s ad targeting and delivery system. First, the complaint alleged that “Meta enabled and encouraged advertisers to target their housing ads by relying on race, color, religion, sex, disability, familial status, and national origin to decide which Facebook users [would] be eligible, and ineligible, to receive housing ads.” Second, the complaint alleged that Meta created an ad targeting tool—the Special Ad Audience—which used an algorithm “to find Facebook users who share[d] similarities with groups of individuals selected by an advertiser using several options provided by Facebook.” In doing this, Meta “allowed its algorithm to consider FHA-protected characteristics—including race, religion, and sex—in finding Facebook users who ‘look like’ the advertiser’s source audience.” Third, the complaint alleged that Meta’s ad delivery system used algorithms that relied, in part, “on FHA-protected characteristics—such as race, national origin, and sex—to help determine which subset of an advertiser’s targeted audience [would] actually receive a housing ad.” In total, the complaint alleged that Meta “used these three aspects of its advertising system to target and deliver housing-related ads to some Facebook users while excluding other users based on FHA-protected characteristics.” Id.

273 Id.

274 Id.

275 Id.

276 Id. at 7.

277 Id.

278 Interview with Jacob Appel, Chief Strategist, O’Neil Risk Consulting & Algorithmic Auditing (June 1, 2022) (on file with Strategic Training Initiative for the Prevention of Eating Disorders legal team) [hereinafter Interview with Jacob Appel].

279 Id.

280 Id.

281 Id.

282 Id.

283 Id.

284 Id.

285 Id.

286 Emma Roth, Meta’s new ad system addresses allegations that it enabled housing discrimination, The Verge (Jan. 9, 2023, 6:03 PM), https://www.theverge.com/2023/1/9/23547191/meta-equitable-ads-system-settlement [https://perma.cc/Z4G7-JP89].

287 The United States Attorney’s Office for the Southern District of New York, United States Attorney Implements Groundbreaking Settlement With Meta Platforms, Inc., Formerly Known As Facebook, To Address Discrimination In The Delivery Of Housing Ads, Dep’t of Just. (Jan. 9, 2023), https://www.justice.gov/usao-sdny/pr/united-states-attorney-implements-groundbreaking-settlement-meta-platforms-inc-formerly [https://perma.cc/LJR8-UGZE] [hereinafter Jan. US Attorney’s Office].

288 Settlement Agreement at 6, United States v. Meta Platforms, Inc., No. 1:22-cv-05187 (S.D.N.Y. June 21, 2022).

289 Interview with Jacob Appel, supra note 278.

290 Id.

291 Settlement Agreement at 9, United States v. Meta Platforms, Inc., No. 1:22-cv-05187 (S.D.N.Y. June 21, 2022).

292 Jan. US Attorney’s Office, supra note 287.

293 Id.

294 Interview with Jacob Appel, supra note 278.

295 See Suku Sukunesan, Examining the Pro-Eating Disorders Community on Twitter Via the Hashtag #proana: Statistical Modeling Approach, 8 JMIR Mental Health 1, 2 (2021).

296 See, e.g., Jennifer A. Harriger, The dangers of the rabbit hole: Reflections on social media as a portal into a distorted world of edited bodies and eating disorder risk and the role of algorithms, 41 Body Image 292 (2022).

297 There are sources that raise this specific concern. E.g., Sapna Maheshwari, Young TikTok Users Quickly Encounter Problematic Posts, Researchers Say, N.Y. Times (Dec. 14, 2022), https://www.nytimes.com/2022/12/14/business/tiktok-safety-teens-eating-disorders-self-harm.html [https://perma.cc/FU7G-HD4G] (“[TikTok] starts recommending content tied to eating disorders and self-harm to 13-year-olds within 30 minutes of their joining the platform, and sometimes in as little as three minutes … .”).

298 WSJ Staff, Inside TikTok’s Algorithm: A WSJ Video Investigation, Wall St. J. (July 21, 2021, 10:26 AM), https://www.wsj.com/articles/tiktok-algorithm-video-investigation-11626877477 [https://perma.cc/4CFJ-UK2S].

299 See Auditing Algorithms: The Existing Landscape, Role of Regulators and Future Outlook, Digit. Coop. F. (Sept. 23, 2022), https://www.gov.uk/government/publications/findings-from-the-drcf-algorithmic-processing-workstream-spring-2022/auditing-algorithms-the-existing-landscape-role-of-regulators-and-future-outlook#introduction-and-purpose (“Consumers or those affected by algorithmic systems who have a better understanding of these systems can then take informed decisions about how or when they engage with different products and services.”).

300 See id. (“Where the outputs of an algorithmic processing system have impacts on individuals, the system will be subject to regulatory expectations such as ensuring consumers or citizens are treated fairly, not discriminated against, and have their rights to privacy respected.”).

301 Id. (“Auditing can indicate to individuals that they have been harmed … . It can provide them with evidence that they could use to seek redress.”).

302 See Miles Brundage et al., Toward Trustworthy AI Development: Mechanisms for Supporting Verifiable Claims 11 (2020) (“Third party auditors can be given privileged and secured access to … private information, and they can be tasked with assessing whether safety, security, privacy, and fairness-related claims made by the AI developer are accurate.”).

303 See James Kobielus, How We’ll Conduct Algorithmic Audits in the New Economy, InformationWeek (Mar. 4, 2021), https://www.informationweek.com/ai-or-machine-learning/how-we-ll-conduct-algorithmic-audits-in-the-new-economy (arguing that audit scopes should be clearly and comprehensively stated in order to make clear what aspects of audited algorithms may have been excluded and why they were not addressed in a public report (e.g., to protect sensitive corporate intellectual property)).

304 See, e.g., Balbuzanova, supra note 110; Universal City Studios, Inc., 273 F.3d at 429; Johnson Controls, 886 F.2d at 1173; Bernstein, 922 F. Supp. at 1426 (N.D. Cal. 1996); e-ventures Worldwide, LLC, 188 F. Supp. 3d at 1265; Zhang, 10 F. Supp. 3d at 433 (finding that computer codes and search engine outputs are protected speech under the First Amendment).