Industry’s Role in Promoting Kids’ Online Health, Safety, and Privacy: Recommended Practices for Industry
Helping Kids Thrive Online: Health, Safety, & Privacy
There are many practices that developers of online platforms can implement to help protect youth online and enable them to thrive. The structures and functions of online platforms are the result of specific design choices, including, in many cases, choices to collect and use data about people for commercial purposes, maximize how much time people spend online, and target users with commercial and non-commercial content. These trends are concerning for all users but pose a distinct threat to youth.
Below are 10 important recommended practices that online service providers should take to develop platforms with youth well-being in mind.
1. Design age-appropriate experiences for youth users.
2. Make privacy protections for youth the default.
3. Reduce and remove features that encourage excessive or problematic use by youth.
4. Limit “likes” and social comparison features for youth by default.
5. Develop and deploy mechanisms and strategies to counter child sexual exploitation and abuse.
6. Disclose accurate and comprehensive safety-related information about apps.
7. Improve systems to identify and address bias and discrimination that youth experience online.
8. Use data-driven methods to detect and prevent cyberbullying and other forms of online harassment and abuse.
9. Provide age-appropriate parental control tools that are easy to understand and use.
10. Make data accessible for verified, qualified, and independent research.
Each section below provides a brief discussion of specific challenges to youth well-being as well as interventions (i.e., tools, techniques, features, and settings) that providers and developers can employ to mitigate health, safety, and privacy risks to youth while maximizing their beneficial use of online services. Many of these best practices and tools use the language of “minimization,” drawn from the data-protection concept of “data minimization,” or collecting and using no more data than is necessary to perform a specific function. Of note, certain online services already implement—or have started to offer—some or many of the tools mentioned below for users outside of the United States (e.g., in the United Kingdom and Australia), and implement some protections as defaults (e.g., privacy protections in California). For example, some companies started preventing unknown adults from messaging children and have removed video autoplay and nighttime notifications for youth accounts.192 Companies and services providing these protections to young people living elsewhere should offer those options to parents, caregivers, and youth across the United States.
1. DESIGN AGE-APPROPRIATE EXPERIENCES FOR YOUTH USERS
There is a huge diversity in the online platforms and services that young people use, and there is no one-size-fits-all approach to making platforms safer for kids. Thus, it is pivotal that platform operators design their services with kids’ health, safety, and privacy in mind. A platform operator should employ well-known design methodologies of human-centered design (HCD)193 and value-sensitive design (VSD)194 and consider the unique aspects of its technology (hardware and software), its users (including youth), and other relevant stakeholders (e.g., parents, caregivers, and educators) when developing its service and incorporating any interventions.
Cognitive factors (e.g., attention, information processing, reasoning, and decision-making) are particularly relevant to platforms’ efforts to promote youth health, safety, and privacy. Research indicates that youth undergo key developmental milestones that help them acquire certain competencies, more developed comprehension, and agency.195 196 197 For example, health experts recognize that a pivotal part of adolescent development is the need to exercise some autonomy.198 It can be challenging for young people to exercise autonomy in a digital ecosystem designed to influence their behavior. Likewise, technology may shape how some young people achieve key competencies and develop a sense of agency.199 200 201 202
Crucially, HCD principles should be coupled with values-based approaches to design, such as design processes that are rooted in a desire to promote the health, safety, and privacy of youth. Technology developers and providers should use such values-sensitive approaches to design products and services that prevent and mitigate harm while providing youth with the ability to make age-appropriate choices about the material and features to which they are exposed.
Recommended Practices
Creating age-appropriate experiences requires youth input, research, and continuous performance evaluation. Recommended practices include:
- Identify the types of youth who will be using the technology, including potential and unintended (e.g., prohibited) users. Consider the varied experiences of youth users across different demographics, developmental stages, needs, anticipated uses, home environment, and experience level.
- Take youth's different contexts into account. Identify features that may be beneficial or neutral in some contexts but harmful in others in product design and risk mitigation efforts (e.g., features designed to encourage daily use in an education app versus a gambling app).203
- Use methodologies for human-centered design and value-sensitive design to identify product features that support well-being.
- Design based on research and performance assessments (e.g., user experience testing) focused on youth’s experiences.
- Engage youth, parents and caregivers, relevant stakeholders, and experts from the field throughout the entire product development lifecycle—including young people from different backgrounds and age groups.
- Establish research and evaluation partnerships that include youth (e.g., efforts with independent research groups, academia, and government).
- Continuously evaluate harm-prevention efforts to ensure their effectiveness across time, platform, and contextual changes.
2. MAKE PRIVACY PROTECTIONS FOR YOUTH THE DEFAULT
Gathering, using, and sharing data (including images) can pose risks to anyone, and these risks are amplified in their potential to negatively affect youth. Certain commercial practices can harm youth users by profiling them or targeting them with malicious activity. For example, the risks include safety concerns from disclosures of information (whether through intentional sharing or data breaches), as well as risks from third-party access—whether by companies or individuals—to interactions with youth. The Children’s Online Privacy Protection Act (COPPA),228 the Family Educational Rights and Privacy Act (FERPA),229 and laws from states such as California230 provide certain privacy protections for children. Certain platforms and services also operate under business models that do not rely upon the collection of personal data. However, existing laws and industry practices are insufficient to address all harms and ensure platforms support children’s well-being. In response, the Biden-Harris Administration has called upon Congress to enact better privacy protections.231 232 But industry should not wait to implement stronger protections for children. Developers of online platforms and services can provide enhanced privacy protections for minors in a variety of ways.
Recommended Practices
- Strictly limit the collection of minors’ data and personal information.
- Do not condition a minor’s use of the platform on collection of personal information or disclosure of personal information to third parties.
- Collect only the personal information that is reasonably necessary for a minor to participate in the specific offerings of the service (e.g., game, prize, or activity), as is already required in certain circumstances by federal law under COPPA.233 234
- Retain and use personal information of minors only as long as necessary to fulfill the purpose for which it was collected, as is already required in certain circumstances by federal law under COPPA.235
- Establish, and make public, a written data retention policy for minors’ personal information that minimizes the retention period.236 COPPA requirements on data retention and deletion can be a good model even for services outside its scope.
- Minimize or disable platform and service collection of geolocation and biometric information.
- Enable data portability and interoperability tools that allow young users to easily switch to platforms that best fit their needs. Such tools should support the user’s ability to maintain their social graph.
- Make minors’ accounts private by default.
- Automatically implement the strongest available privacy settings (an illustrative configuration sketch follows this list).237 238
- Turn off direct messaging for minors by default but allow teens to opt into this feature.
- Make accounts and personal data easy to delete.239 240 241
- Avoid recommending connections between minors’ accounts and other users’ accounts.242 243 244
- Allow other users to connect with minors only after obtaining consent or specific information from the minor’s account.
- Prevent sharing of minors’ data by default.
- Disable sharing of minors' precise geolocation or biometric data, except when necessary (e.g., with a caregiver). Alternatively, turn off sharing of minors’ location data by default, and permit sharing only with consent.
- Ensure that the visibility of children’s posts is limited by default—that is, content that children post should only be shared with and visible to contacts and friends, unless the child who posted affirmatively chooses otherwise.
- Do not enable targeted advertisements or personalized algorithmic recommendations of content by default for minors.
- Do not enable targeted advertising to minors based on their activities, whether on or off the platform.245 246 247
- Ensure any remaining non-targeted advertising that appears alongside content intended for a young audience is also age appropriate.
- Do not enable by default content recommendation algorithms that are based on personalized profiling of a minor.
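The defaults described in the list above can be bundled into a single policy that is applied whenever an account is identified as belonging to a minor. The following is a minimal sketch in Python; the `MinorPrivacyDefaults` fields and the `apply_minor_defaults` helper are hypothetical and would differ across platforms.

```python
from dataclasses import dataclass

@dataclass
class MinorPrivacyDefaults:
    """Hypothetical bundle of the strongest available privacy settings for a minor's account."""
    private_account: bool = True                 # account and posts not publicly visible
    direct_messages_enabled: bool = False        # off by default; teens may opt in
    share_precise_location: bool = False         # precise geolocation sharing disabled
    share_biometric_data: bool = False           # biometric data sharing disabled
    targeted_ads_enabled: bool = False           # no ads based on on- or off-platform activity
    personalized_recommendations: bool = False   # no content recommendations based on profiling
    post_visibility: str = "contacts_only"       # posts visible only to friends and contacts
    recommend_account_to_others: bool = False    # do not suggest the minor's account as a connection

def apply_minor_defaults(account_settings: dict) -> dict:
    """Overwrite an account's settings with the minor defaults.

    Any later change away from these defaults should be an explicit,
    age-appropriate choice by the teen or by a parent or caregiver.
    """
    account_settings.update(vars(MinorPrivacyDefaults()))
    return account_settings
```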
3. REDUCE AND REMOVE FEATURES THAT ENCOURAGE EXCESSIVE OR PROBLEMATIC USE BY YOUTH
Any digital technology commonly accessed by minors should incorporate intentional design features that promote health and well-being and limit features that maximize time, attention, and engagement.248 Industry can provide minors and their parents or caregivers with tools to choose features based upon their needs and opt out of those that provide no benefit or pose risks. Software designers and product developers can take measures to help ensure that their products reduce the risk of harm, benefit adolescents, and support youth development and well-being.
Recommended Practices
- Avoid unnecessary notifications and engagement-driven nudges.
- During the design process, consider the necessity of notifications and the disruption they cause for youth’s sleep and attention.249 250
- Mute notifications during certain times of the day (e.g., school and sleep hours), or avoid them altogether, except for key safety and medical notifications or communication channels (see the sketch after this list).251 252
- Avoid the use of infinite scroll and autoplay features that load content continuously as a user scrolls.
- Consider populating feeds chronologically by default, or adopting other measures to replace infinite scroll, which emphasizes prolonged engagement.253
- Provide minors with controls that allow them to adjust what is presented to them in their feeds and to use timers to limit total daily usage.
- Disable any features that automatically continue to play content (e.g., videos) by default.254
- Deploy default settings that include auto-shutoff and do-not-disturb features or minimize blue light at certain times of day and night.255
- Provide teens and parents or caregivers of younger children the ability to change minors’ default settings to more protective settings.256 257
- Minimize addictive features of mobile games and other services.
- Minimize the use of random action-based rewards (e.g., loot boxes),258 especially those that require a form of payment.
- Minimize the use of limited time offers and other incentives and disincentives that lure people into playing for longer due to fear of missing out, loss of a reward, or other impacts to a minor’s gameplay.
- Minimize ephemeral content and contingent rewards.
- Set longer availability windows for stories or other ephemeral content to reduce the sense of urgency that keeps users unnecessarily engaged for fear of missing out. For example, rather than expiring content after 24 hours, keep it available until the user views it or until a longer cap (such as 48 or 72 hours) is reached.
- Eliminate rewards (e.g., interaction streaks) that require minors to keep coming back to the app or staying on the app to reach a special status or to get some type of reward.
- Stop use of dark patterns aimed at increasing minors’ time online.
- Remove specific design features that make it hard for minors to exercise options to get out of a content stream or back to the home screen without viewing more content (e.g., hiding the “X” button or making it hard to go back to the prior screen).
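One way to implement the notification practices above (see the item on muting notifications during school and sleep hours) is a delivery filter that suppresses non-essential notifications during quiet windows while always letting safety-critical messages through. The sketch below is a minimal, hypothetical example; the window boundaries and the `is_safety_critical` flag are assumptions a platform would refine with input from youth, caregivers, and experts.

```python
from datetime import time

# Hypothetical quiet-hour windows (local time): school hours and overnight.
QUIET_WINDOWS = [
    (time(8, 0), time(15, 0)),   # school day
    (time(21, 0), time(7, 0)),   # nighttime; window wraps past midnight
]

def in_quiet_window(now: time) -> bool:
    """Return True if the local time falls inside any quiet-hour window."""
    for start, end in QUIET_WINDOWS:
        if start <= end:
            if start <= now < end:
                return True
        elif now >= start or now < end:  # handle windows that wrap past midnight
            return True
    return False

def should_deliver(notification: dict, now: time, is_minor: bool) -> bool:
    """Always deliver safety-critical notifications; hold everything else during quiet hours for minors."""
    if notification.get("is_safety_critical"):
        return True
    if is_minor and in_quiet_window(now):
        return False
    return True
```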
4. LIMIT “LIKES” AND SOCIAL COMPARISON FEATURES FOR YOUTH BY DEFAULT
Interaction mechanisms and features can over-engage minors in areas that are known to be unhealthy, such as social comparison.259 As with other features, in different contexts these can have either beneficial or negative effects on kids.260 Networking and reaction features that encourage minors to increase their number of contacts on a social network, and that quickly convey that something has been viewed and liked, can have a negative effect if youth use them to gauge their own popularity and that of their posts.261 These features can also encourage young people to expand their connections to strangers and nonfriends, potentially increasing their risk of negative or dangerous interactions.262 263
Recommended Practices
- Minimize quantifying "likes" and social comparison features.
- Hide by default the visibility of the number of connections, friends, or followers for a minor’s account or piece of content.264
- Cap or remove by default—as appropriate for age—likes and related emojis, views, dislikes, or other interactions for a minor’s posts and others’ posts that the minor views.265
- Minors should be able to easily disable comments on their own posts. By default, settings should allow only friends and contacts to comment on children’s posts.
5. DEVELOP AND DEPLOY MECHANISMS AND STRATEGIES TO COUNTER CHILD SEXUAL EXPLOITATION AND ABUSE
Online sexual exploitation and abuse can occur on a myriad of platforms, including social media, gaming systems, and chat apps, and young people may be targeted by adult strangers or by someone they know and trust. This exploitation often takes advantage of common features of online platforms, including direct messaging, the ability to share photos, and the ability to identify and connect with groups and networks of other users. Designers of online platforms should evaluate their services through the lens of potential abuse by users who plan to engage in child sexual exploitation and abuse (CSEA); this will help them identify risks present in the design and operation of their services and implement safety-by-design safeguards against CSEA. Designers of online platforms should also evaluate their services from the perspective of young people who may be targeted for exploitation and abuse in order to ensure the availability of resources and interventions that can interrupt pathways to abuse.
Recommended Practices
- Join the National Center for Missing and Exploited Children’s (NCMEC) “Take it Down” initiative that helps children anonymously seek the removal of sexually explicit material of themselves from online platforms.266
- Develop and enforce policies making clear that child sexual exploitation and abuse in any form, including AI-generated images of children, is prohibited.
- Provide detailed and timely reports to the NCMEC’s CyberTipline.267
- Implement measures to detect and respond to grooming language.
- Explore ways to identify and disrupt problematic interactions between adults and minors, or between minors, online.
- Incorporate pop-up messages or other “friction points” to warn children that they might be encountering or becoming involved in a dangerous situation, or to warn would-be offenders that they are about to engage in illegal conduct.
- Respond promptly to user-submitted reports of online child sexual exploitation and abuse.
- Incorporate other technical measures to reduce child sexual exploitation and abuse (see the illustrative sketch after this list).
- Incorporate screenshot and screen-recording prevention features using existing operating system-provided tools.
- Automatically remove hidden data from shared content.
- Do not automatically download or display shared content.
- Build user interface measures that create hurdles (i.e., friction) to limit the easy sharing and re-sharing of sensitive content.
- De-index child sexual abuse material (CSAM), and remove links to websites and services that are known to carry CSAM.
- Display warnings before sensitive content is shared using on-device and privacy-preserving detection methods.
- Provide prompt and thorough responses to legal process from law enforcement.268
- Share information with other online platforms about users who have engaged in child sexual exploitation, in a privacy-protective manner—including through privacy-enhancing technologies—to ensure information is only shared as necessary to identify those specific potential threats to youth safety.269
- Develop and implement mechanisms to detect and disrupt sextortion schemes.
- Invest in education and prevention efforts to help children learn ways to stay safe online and parents and caregivers identify risks or indicators of online sexual exploitation, such as DHS' Know2Protect campaign.270
- Implement best-practice mitigation measures throughout the AI lifecycle (such as red-teaming271 and ensuring that models are not trained on CSAM) to minimize the ability for AI tools to be used to generate child sexual exploitation and abuse.
- Develop and use mechanisms to monitor and stop live-streaming videos showing child sexual exploitation and abuse, while safeguarding privacy.272
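One common technical measure referenced above is matching uploaded or linked content against hash lists of material already confirmed as CSAM (for example, hashes shared through NCMEC or industry coalitions). The sketch below is a minimal illustration using exact cryptographic hashes; real systems typically also rely on perceptual hashing and human review, and the `KNOWN_CSAM_HASHES` set and `block_and_report` workflow shown here are hypothetical placeholders.

```python
import hashlib

# Hypothetical hash list of confirmed CSAM obtained from a trusted clearinghouse.
KNOWN_CSAM_HASHES: set = set()

def block_and_report(digest: str) -> None:
    """Placeholder for the platform's removal, evidence-preservation, and CyberTipline reporting workflow."""
    pass

def screen_upload(data: bytes) -> bool:
    """Return True if the content may proceed; block and report matches to known CSAM."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_CSAM_HASHES:
        block_and_report(digest)
        return False
    return True
```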
6. DISCLOSE ACCURATE AND COMPREHENSIVE SAFETY-RELATED INFORMATION ABOUT APPS
Millions of apps are available for download on app stores,273 typically accompanied by a rating and other information including features, reviews, pictures, and privacy information. Descriptions in app stores could provide valuable information to parents and children about an app’s safety features and the risks it may pose to a child. But safety-related information is often lacking in meaningful detail, if it is there at all, and may therefore be misleading. For example, it would be useful for caregivers and youth to know if an app allows users to be contacted by strangers. Although apps are typically assigned age ratings (e.g., 4+, 12+), those ratings do not always align with parents’ needs or expectations and often do not correspond with stages of children’s social and emotional development. These ratings could provide a false sense of assurance or safety (including about the privacy of information about the user). App developers and app stores should focus on improving the accuracy and consistency of app ratings and labeling and work with experts in child social and emotional development to determine thresholds for appropriate age ratings.
Recommended Practices
- Provide detailed safety information in descriptions of apps in app stores, including, for example:
- Informing parents of the possibility of communication between adults and children on the app.
- Developing and informing parents about age verification tools built into the app or available at the device level.
- Maximizing protections on devices belonging to minors and making use of operating system provider functionality linking minors to family accounts with adjustable permissions.274
- Informing users whether communication on the app between users is monitored.
7. IMPROVE SYSTEMS TO IDENTIFY AND ADDRESS BIAS AND DISCRIMINATION THAT YOUTH EXPERIENCE ONLINE
Young people may experience bias and discrimination online both in their interactions with other users and in the ways in which their data is collected and used by online platforms. Young people with marginalized identities—including those based on race, ethnicity, religion, gender identity, gender expression, sexual orientation, or disability status—may be exposed to hateful and harassing comments from other users that target them based on those identities. They also experience abusive conduct, such as “Internet banging,” swatting, stalking, trolling, or “griefing.”275 276 Developers of online platforms should understand the particular risks of discriminatory interactions faced by young users based on their different demographic backgrounds and develop mitigation and intervention measures. Platform developers should also evaluate their own data collection and use practices for bias to minimize the risk of biased and discriminatory treatment of youth users.
Recommended Practices
- Deploy and improve the use of manual and automated moderation of discriminatory content and activity.
- Train moderators in the different ways in which discrimination is experienced by young people online, including the use of reclaimed words in non-pejorative contexts, and the use of words in languages other than English that convey hate based on cultural context.277
- Evaluate the operation and impact of automated content moderation tools, specifically for bias and discrimination, including across multiple languages.278
- Prioritize moderation systems for features that pose the highest risks of discriminatory conduct against young people (e.g., voice chat in gaming).279
- Establish and meaningfully uphold anti-discrimination community standards and codes of conduct and enforce platform terms of service that prohibit users from engaging in threatening or abusive behaviors towards marginalized communities.
- Evaluate the outputs of automated content-generation systems such as autocomplete and generative AI tools for bias.
8. USE DATA-DRIVEN METHODS TO DETECT AND PREVENT CYBER-BULLYING AND OTHER FORMS OF ONLINE HARASSMENT AND ABUSE
Various data-driven methods and models are used across online spaces and platforms to detect cyberbullying.280 While different methods and models are best suited for different types of content and platforms,281 challenges persist in the consistent identification of cyberbullying across social media spaces.282
Some youth experience bullying more than others. In fact, stigma plays a role in which groups are expressly targeted for bullying (e.g., LGBTQI+ individuals, persons with disabilities, and persons who are overweight/obese) and the type of bullying they face.283 Among U.S. high school students surveyed in a 2021 report, online bullying victimization is higher among females, White and American Indian and Alaska Native (AI/AN) youth, and youth who identify as a sexual minority.284 Additionally, according to ED's Office of Civil Rights, students with disabilities in public schools reported being harassed or bullied at rates higher than their representation in the total school enrollment.285 286 287 The accuracy of cyberbullying detection models can also be affected by cultural differences.288 Including additional contextual information from users’ posts on a service in the training of machine learning classifiers (such as a given user’s history of posting comments with above-average amounts of profanity, or higher usage of pronouns indicating more messages directed at other users) could help improve the accuracy of online bullying detection, as sketched below.289
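As an illustration of the research finding above, a text-only bullying classifier can be augmented with simple user-context features such as a sender's historical profanity rate and rate of second-person pronoun use. The sketch below uses scikit-learn and is only a minimal example under assumed feature definitions and a hypothetical labeled dataset; production systems would need far richer features, multilingual coverage, and regular bias evaluation.

```python
import numpy as np
from scipy.sparse import csr_matrix, hstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

def context_features(sender_histories):
    """Numeric features drawn from each sender's posting history (assumed precomputed upstream)."""
    return np.array([[h["profanity_rate"], h["second_person_pronoun_rate"]]
                     for h in sender_histories])

def train_bullying_classifier(messages, sender_histories, labels):
    """Train a classifier on message text combined with user-context features."""
    vectorizer = TfidfVectorizer(min_df=2)
    text_features = vectorizer.fit_transform(messages)          # sparse TF-IDF matrix
    features = hstack([text_features, csr_matrix(context_features(sender_histories))])
    model = LogisticRegression(max_iter=1000).fit(features, labels)
    return vectorizer, model
```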
It is important that safety efforts take into account that sextortion and the non-consensual sharing of sexual images, or threats to disseminate them, are also perpetrated by peers, including by current or former dating partners.290 This form of abuse is highly gendered, with girls making up the majority of victims, targeted predominantly by boys. Other forms of technology-facilitated abuse, such as cyberstalking, can take place in the context of dating violence, with different considerations and implications for youth safety.
Recommended Practices
- Implement designs that help prevent, minimize, and mitigate bullying and other forms of online harassment and abuse, including, for example:
- Evaluate technical interventions aimed at minimizing online bullying, such as reporting and bystander support tools,291 292 293 for their effects on both youth who are being bullied and those who are bullying others.
- Use diverse, tailored approaches to help protect users from being bullied, prevent users from bullying others, and empower bystanders to stand up to bullying (e.g., by providing tools that allow users to reach out to trusted friends when they are experiencing bullying).294
- Share evidence-based bullying prevention resources with parents/caregivers and youth — e.g., on cyberbullying tactics;295 preventing cyberbullying;296 social media, apps, and sites commonly used by children and teens;297 and cyberbullying and online gaming.298 (For a comprehensive list of resources, visit StopBullying.)
- Direct young people expressing self-harm or suicide-related behaviors to the 988 Suicide and Crisis Lifeline.299
- Promote access to youth-specific resources for image-based sexual abuse and dating violence, such as Love is Respect and NCMEC's Take it Down.300
- Allow muting/blocking of problematic users, even if the behavior does not rise to the level of violating platform policies.
- To reduce the risk of image-based sexual abuse among peers, employ image-blurring technology so that users only view images they consent to receive (see the sketch after this list).
- Develop a classification framework for incorporating cultural differences to identify indicators of online bullying victimization.301
- Ensure equitable access to online safety resources and mechanisms for diverse user audiences (e.g., consider literacy levels, accessibility across different devices, and languages).
- Identify specific groups that have experienced more bullying to develop tailored interventions to address cyberbullying.302 303
- Embed online civility norms across online spaces (e.g., onboarding, policies, design, monitoring, and resources).304
- Deploy and improve the use of manual and automated moderation of bullying content and activity.
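The image-blurring practice noted above can be modeled as a simple gate: an image from a sender outside the recipient's trusted contacts is delivered blurred, and the clear version is shown only after the recipient explicitly consents. The sketch below is a minimal illustration; the data model and the trusted-contacts check are hypothetical stand-ins for a platform's own media pipeline.

```python
from dataclasses import dataclass

@dataclass
class IncomingImage:
    sender_id: str
    image_bytes: bytes
    consented: bool = False   # set True only after the recipient chooses to view the image

def record_consent(image: IncomingImage) -> None:
    """Called when the recipient taps through a warning and agrees to view the image."""
    image.consented = True

def render_for_recipient(image: IncomingImage, trusted_contacts: set) -> tuple:
    """Return the image plus a flag telling the client to display it blurred until consent is given."""
    show_clear = image.consented or image.sender_id in trusted_contacts
    return image.image_bytes, not show_clear
```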
9. PROVIDE AGE-APPROPRIATE PARENTAL CONTROL TOOLS THAT ARE EASY TO UNDERSTAND AND USE
Some parents and guardians would like to exercise more control over their children’s online experiences.305 Well-designed parental control tools can help accomplish this, while also preserving the benefits of Internet use for youth.306 While parental controls are not a panacea for kids’ online safety, industry can do more to create parental controls that work for parents and kids. The relationship between different layers of available parental controls (e.g., on a device versus within an app or on a website) is often not clear to parents, and controls are not equivalent on different platforms and services.307 It is also important to note that some forms of parental controls may be invasive to young people’s privacy and harmful to already vulnerable youth; platform designers should consider carefully what kinds of controls, for what age group, are appropriate for parent accounts and which should be controlled by the user directly.308 A one-size-fits-all approach to parental controls may not be appropriate for many families.309 310 311 More research is necessary to understand when, where, and which parental controls are most effective, but some steps can help parents and guardians today.
Recommended Practices
- Adopt age-appropriate parental control solutions and promote their availability; these may differ by age group (an illustrative configuration sketch follows this list) and could include:
- Supervised accounts with appropriate limitations for certain age groups.312
- Limits on interactive functions such as chat.313 314
- Limiting and blocking contacts.315 316
- Limits on monetary spending.317
- Time limits or scheduled breaks.318
- Labels for manipulated content.319
- Easy account deletion.320
- Limit practices that encourage youth to circumvent parental control features.321
- Limits on children receiving in-app financial transactions from adults.
- Make parental control tools easy to understand and use.
- Make disclosures that adequately inform parents and caregivers about platforms and services322 323 so they understand the risks when they allow their children access to content.324
- Help parents understand the content and features they are enabling children to access if they help children to circumvent age restrictions.
- Consider developing “parental onboarding” resources to help parents and caregivers understand the available features and control tools.
- Make it easy for parents to change/configure monitoring and controls as their children age.
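The control categories listed above can be grouped into age-tiered profiles that a parent or caregiver account can review and adjust as a child ages. The sketch below is a minimal, hypothetical mapping; the age bands and specific limits are illustrative only and should be set with input from child-development experts and families.

```python
from dataclasses import dataclass

@dataclass
class ParentalControls:
    supervised_account: bool
    chat_enabled: bool
    contacts_require_approval: bool
    spending_limit_usd: float
    daily_time_limit_minutes: int
    label_manipulated_content: bool = True

# Hypothetical age bands and limits; a real product would refine these with experts and caregivers.
DEFAULTS_BY_AGE_BAND = {
    "under_13": ParentalControls(True, False, True, 0.0, 60),
    "13_to_15": ParentalControls(True, True, True, 10.0, 120),
    "16_to_17": ParentalControls(False, True, False, 25.0, 180),
}

def defaults_for_age(age: int) -> ParentalControls:
    """Pick the default control profile for a child's age; parents can adjust it over time."""
    if age < 13:
        return DEFAULTS_BY_AGE_BAND["under_13"]
    if age <= 15:
        return DEFAULTS_BY_AGE_BAND["13_to_15"]
    return DEFAULTS_BY_AGE_BAND["16_to_17"]
```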
10. MAKE DATA ACCESSIBLE FOR VERIFIED, QUALIFIED, AND INDEPENDENT RESEARCH
A growing body of research has examined the relationship between children and online platforms using the data that is currently available. However, researchers would benefit significantly from access to detailed platform data, which would expand and enable new areas of critical research and address gaps in our understanding.325 Barriers exist, for example, for academic data science research in the new realm of behavior modification by digital platforms.326 327 Platforms can provide tiered access to detailed online platform data under a framework that considers users’ expectations of privacy.328 Such access must be available in user-friendly formats without significant cost. Data access should protect users’ privacy and respond to researchers’ changing needs through context-specific consideration of the risks that different research efforts may pose. Platforms should be transparent about how users' data may be shared with researchers and aim to use privacy-enhancing technologies when possible and for the most sensitive data.
Online platform data is inherently networked and global. Youth can follow and comment on accounts from around the world, and high-quality data sets will often require information on data subjects from other countries. We must collaborate with partners to respect privacy, data protection, and ethics laws that may vary across jurisdictions. Specifically, the United States and the European Union have a shared commitment to advance data access for researchers—and this commitment has led to shared principles and transatlantic stakeholder discussions as part of the EU-U.S. Trade and Technology Council (TTC).329 Since the launch of the TTC, the EU Digital Services Act (DSA) has gone into effect, requiring providers of Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) to provide increased transparency into the operation of their services. The DSA includes specific mandates for VLOPs/VLOSEs to share data with academic and civil society researchers in a way that is consistent with privacy protections and research ethics. The new law has encouraged platforms to launch new programs and create infrastructures that have the potential to benefit researchers globally. During the 6th Ministerial Meeting of the TTC in April 2024, Working Group 5 released a Status Report: Mechanisms for Researcher Access to Online Platform Data.330 As policymakers aim to understand the impact of the online information environment on youth development, it will be important to follow Europe’s progress and offer guidance to platforms to extend access to the U.S. research community.
Recommended Practices
- Platforms should make the following types of data available to vetted researchers, subject to privacy protections:
- Usage data, such as the number of minor users and their time spent online, including how certain features and designs increase or decrease time spent.
- Aggregate information about individuals’ social network connections.
- Interaction with specific content and features of concern.
- Privacy and account settings chosen by users.
- Moderation-related data, such as user reporting, moderator decisions by type, and rate of takedowns.
- Targeted advertising that may reach children, including what data is used for ad targeting and what steps are taken to keep ads from being targeted to minors.
- Use of algorithmic and process-based recommendation systems, including actual personalized recommender systems where appropriate and the data used in the systems.
- Platforms should also support features that allow for data donation through comprehensive, machine-readable downloads and secure software designed for data collection through a smartphone app or browser.
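Data donation depends on exports that researchers' tools can parse directly. The sketch below shows one hypothetical shape for such an export: a JSON document covering aggregate usage, settings, and moderation fields. The field names are illustrative only and do not reflect any platform's actual download format.

```python
import json
from datetime import date

def build_donation_export(user_summary: dict) -> str:
    """Assemble a machine-readable (JSON) export that a user could donate to vetted researchers."""
    export = {
        "schema_version": "0.1",                    # hypothetical schema identifier
        "generated_on": date.today().isoformat(),
        "usage": {
            "daily_minutes": user_summary.get("daily_minutes", []),
            "sessions_per_day": user_summary.get("sessions_per_day", []),
        },
        "settings": user_summary.get("privacy_settings", {}),
        "moderation": {
            "reports_filed": user_summary.get("reports_filed", 0),
            "content_removed": user_summary.get("content_removed", 0),
        },
    }
    return json.dumps(export, indent=2)
```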
192 For more on this topic, see, for example, Comment from 5 Rights Foundation to NTIA KOHS RFC at 4 (2023).
193 According to International Organization for Standardization (ISO) standard 9241-210:2019, human-centered design is “an approach to interactive systems development that aims to make systems usable and useful by focusing on the users, their needs and requirements, and by applying human factors/ergonomics, and usability knowledge and techniques.” ISO 9241-210:2019 Ergonomics of human-system interaction — Part 210: Human-centered design for interactive systems.
194 B. Friedman, D. Hendry, and A. Borning, "A Survey of Value Sensitive Design Methods," Foundations and Trends® in Human–Computer Interaction 11, no. 2 (2017): 63-125; C. Knobel and G. C. Bowker, "Values in Design," Communications of the ACM 54, no. 7 (2011): 26-28.
195 General comment No. 25 (2021) on children’s rights in relation to the digital environment | OHCHR, paras 19-21, p4; see, also, Digital Childhood – Addressing Childhood Development Milestones in the Digital Environment. Baroness Beeban Kidron, Founder 5Rights, and Dr. Angharad Rudkin, University of Southampton. Dec. 2017 (“In broad terms, childhood development moves from a state of high dependency on carers for security and guidance (infancy to 5 years), towards a move to school that increases independence and self-care (6-11 years), through to adolescence which is a time of increasing autonomy and growing reliance on peers for approval and support (12-18 years) and the final step in the move towards fully independent adult living (18-25). ”).
196 Comment on NTIA KOHS RFC, American Academy of Pediatrics (noting also differences by gender…. ).
197 In the typical sextortion scheme, the offender (i) makes contact with the targeted minor on social media and pretends to be a peer of the targeted minor; (ii) persuades the targeted minor to send sexually explicit images or videos of him- or herself to the offender (or creates such images himself through the use of AI); and then (iii) threatens to widely distribute the sexually explicit images and videos of the targeted minor (for example, to the minor’s parents, coaches, religious leaders, school, etc.) unless the minor sends money or additional imagery to the offender.
198 J. Nesi, E. Telzer, and M. Prinstein, “Adolescent Development in the Digital Media Context,” Psychological Inquiry 31, no. 3 (2020): 230.
199 General comment No. 25 (2021) on children’s rights in relation to the digital environment | OHCHR, paras 19-21, p4.
200 Digital Childhood – Addressing Childhood Development Milestones in the Digital Environment. Baroness Beeban Kidron, Founder 5Rights, and Dr. Angharad Rudkin, University of Southampton. Dec. 2017 (“In broad terms, childhood development moves from a state of high dependency on carers for security and guidance (infancy to 5 years), towards a move to school that increases independence and self-care (6-11 years), through to adolescence which is a time of increasing autonomy and growing reliance on peers for approval and support (12-18 years) and the final step in the move towards fully independent adult living (18-25). ”).
201 Comment on NTIA KOHS RFC, American Academy of Pediatrics, Nov. 30, 2023 (noting also differences by gender…. ).
202 Comment on NTIA KOHS RFC , BBB National Programs at 5-6, Nov. 30, 2023 (noting needs of 17, 13 and 7 year olds differ).
203 Comment from 5 Rights to NTIA KOHS RFC at 13, 2023 (“Some of the most commonly used EdTech products are commercially provided, highly data extractive and use the same persuasive design strategies found on social media, such as gamification and personalization”).
204 Services that are directed to minors may have specific obligations under state law. Services that are directed to children under 13 must obtain verified parental consent before collecting personal information from their users, as required by the Children’s Online Privacy Protection Act.
205 The Australian eSafety Commission, which has studied the topic particularly with regard to limiting minors’ access to pornography, concluded in early 2023 that mandating age assurance methods was not yet possible as “each type of age verification or age assurance technology comes with its own privacy, security, effectiveness and implementation issues.” (Australia) “Roadmap for age verification and complementary measures to prevent and mitigate harms to children from online pornography, eSafety (March 2023) (concluding “that a decision to mandate age assurance is not ready to be taken.”); See, also, Online age verification: balancing privacy and the protection of minors, French CNIL (Sep. 22, 2022).
206 E.g., comment of ITI to NTIA Kids Online Health and Safety (KOHS) Request for Comment (RFC) (these come with tradeoffs, involving more data collection/use or restricted access for adults so age assurance should be risk-based and not mandated for all, citing to UK AADC principles, providing guidance and flexibility for companies).
207 E.g., comment on NTIA KOHS RFC, Microsoft, Nov. 30, 2023 (noting legislation should allow for “riskbased, proportionate application of age assurance mechanisms”).
208 E.g., comment on NTIA KOHS RFC, TechNet, Nov. 30, 2023.
209 E.g., comment on NTIA KOHS RFC, R Street Nov. 30, 2023 (noting age assurance makes sense for pornography and gambling sites, but not for general-use platforms).
210 See, e.g., Measurement of Age Assurance Technologies | DRCF and Home - Age Check Certification Scheme (accscheme.com).
211 See, e.g., Families’ attitudes towards age assurance, Research commissioned by the United Kingdom’s Information Commissioner’s Office and Ofcom (Oct. 11, 2022), at 19, (cited in the Children’s Online…. Rulemaking doc, FTC, N177). In addition, parents and guardians sometimes assist in setting up adult accounts for minors, as a way to exercise their rights to decide where their kids can go online. See, e.g., (UK) Families Attitudes Towards Age Assurance, Digital Regulation Cooperative Forum, 2021 (research commissioned by the ICO and Ofcom), pp6-7.
212 Face Analysis Technology Evaluation (FATE) Age Estimation & Verification, NIST (May 2024).
213 Comment on NTIA KOHS RFC, Match to NTIA KOHS RFC (describing how it uses additional sources).
214 See, e.g., Comment on NTIA KOHS RFC, Access Now, (privacy, surveillance, disproportionate risk to LGBTQ youth).
215 See comment on NTIA KOHS RFC from CDT (privacy concerns).
216 See comment on NTIA KOHS RFC from CCIA (citing CNIL (French) study of age-verification solutions).
217 See comment on NTIA KOHS RFC from Future of Privacy Forum (privacy risks, limits legitimate access to services).
218 See comment on NTIA KOHS RFC from Huddleston (requiring age verification “will result in companies having access to IDs or biometrics of all [I]nternet users”).
219 See comment on NTIA KOHS RFC from NetChoice and Comment from Taxpayers Protection Alliance (saying age and users ID can’t be separated).
220 See comment on NTIA KOHS RFC from Public Knowledge (significant privacy risks, including loss of anonymity).
221 See, e.g., Centre for Information Policy Leadership, Protecting Children’s Data Privacy POLICY PAPER I, International Issues And Compliance Challenges at 31 (Oct. 20, 2022) (filed as a comment to NTIA).
222 International Organization for Standardization/International Electrotechnical Commission, ISO/IEC 27566-1 (under development) Information security, cybersecurity and privacy protection — Age assurance systems — Framework — Part 1: Framework.
223 Digital Trust & Safety Partnership, Age Assurance: Guiding Principles and Best Practices (Sept. 2023).
224 We use “age assurance” to refer to the range of techniques that can be used to estimate or verify the age of an individual online.
225 The Australian eSafety Commission, which has studied the topic particularly with regard to limiting minors’ access to pornography, concluded in early 2023 that mandating age assurance methods was not yet possible as “each type of age verification or age assurance technology comes with its own privacy, security, effectiveness and implementation issues.” (Australia) Roadmap for age verification and complementary measures to prevent and mitigate harms to children from online pornography, eSafety (March 2023) (concluding “that a decision to mandate age assurance is not ready to be taken.”).
226 See Online age verification: balancing privacy and the protection of minors, French CNIL (Sep. 22, 2022).
227 For example, UK Information Commissioner’s Office, Age Assurance Opinion, (Jan. 18, 2014) (3. Age assurance methods). See also Ofcom, “Consultation: Guidance for service providers publishing pornographic content (Dec. 5, 2023), 4.12-13 (Noting it does “not have sufficient evidence as to the effectiveness and potential risks of different age assurance methods to recommend specific metrics for assessing whether or not any given age assurance method or process should be considered highly effective” and that “it would not be appropriate at this time to set a base level or score which service providers must ensure their age assurance method or process meets for each of the criteria.”).
228 COPPA applies to data about children under 13. It includes collection, use, and retention limitations, as well as data security requirements. Use limits prevent children’s data collected with permission for one purpose from ending up as part of some other score, algorithm, profile, or for another commercial use (for instance A/B testing or supposedly “anonymous” or “aggregate” audience models that parents did not consent to). COPPA also includes retention limits to prevent companies, once they collect a child’s data, from keeping it for speculative future uses. The COPPA Rule is currently under review to ensure it meets the needs of modern technological advances. See FTC Proposes Strengthening Children’s Privacy Rule to Further Limit Companies’ Ability to Monetize Children’s Data.
229 The Family Educational Rights and Privacy Act (FERPA) (20 U.S.C. § 1232g; 34 CFR Part 99) is a Federal law that protects the privacy of student education records. FERPA applies here to the extent that the technology is an online educational service provided by a third-party or school district and used as a part of a school activity. For more information about FERPA, please visit the U.S. Department of Education’s Student Privacy Policy Office’s Web site.
230 CA Civ Code § 1798.99.28-40 (2022) (currently under preliminary injunction, NetChoice v. Bonta, Case No. 22-cv-08861-BLF, N.D. Cal., Sep. 18, 2023, on appeal in Bonta v. NetChoice, Case No. 23-2969, 9th Cir., oral arguments scheduled for Jul. 17, 2024).
231 The White House, "State of the Union," (2024).
232 The White House, "Readout of White House Listening Session on Tech Platform Accountability," (Sept. 08, 2022).
233 COPPA NPR, 89 Fed. Reg. at 2059-62.
234 The Children’s Online Privacy Protection Act prohibits collecting more personal information from a child than is reasonably necessary for a child to participate in a game, offering of a prize, or another activity. See 15 U.S.C. § 6502(b)(1)(c); 16 C.F.R. § 312.7; COPPA NPR, 89 Fed. Reg. at 2059-60.
235 The COPPA Rule specifically states that operators should retain personal information collected online from a child for only as long as is reasonably necessary to fulfill the purpose for which the information was collected. See 16 C.F.R. § 312.10.
236 COPPA NPRM, 89 Fed. Reg. at 2062 (proposing these measures).
237 Epic Games’ settlement with the FTC in 2022 requires the company to have strong privacy default settings for children and teen users of products, including Fortnite, ensuring that voice and text communications are turned off by default. See Fed. Trade Comm’n, Fortnite Video Game Maker Epic Games to Pay More Than Half a Billion Dollars over FTC Allegations of Privacy Violations and Unwanted Charges (Dec. 19, 2022).
238 See also Children’s Online Privacy Protection Rule Notice of Proposed Rulemaking, 89 Fed. Reg. 2034 (Jan. 11, 2024) [hereinafter COPPA NPR] (The FTC’s recent proposed changes to the COPPA Rule include a proposal making more rigorous the ban on sharing children’s information with third parties by default and making more explicit within the Rule that operators are prohibited from conditioning access to the service on the parent agreeing to the sharing of information.).
239 See Complaint, United States v. Amazon.com, Inc., 2:23-cv-811 (W.D. Wash 2023), available at (allegation that data deletion requests were not honored due to undisclosed retention of transcripts of children’s voice recordings).
240 See also Complaint, In re: Epic Games, Inc., FTC Docket No. C-4790 (Mar. 14, 2023) at (alleging that operators failed to delete children’s personal information at parents’ requests).
241 Design it For Us Policy Platform, Section V(D).
242 U.S. Surgeon General, "Social Media and Youth Mental Health: The U.S. Surgeon General’s Advisory," 16 (2023) (“Surgeon General’s Advisory”) (“Ensure default settings for children are set to highest safety and privacy standards.”).
243 Center for Digital Democracy, Fairplay, et al., "Petition for Rulemaking to Prohibit the Use on Children of Design Features that Maximize for Engagement" (“Fairplay/CDD Petition”), 67 (Nov. 17, 2022).
244 See also US v. Epic Games, Inc. (E.D.N.C. 2023) (FTC settlement in which Epic Games agreed to set minors’ strong privacy settings for minors using Fortnite at their highest levels by default).
245 Design It For Us recommends eliminating targeted advertising to minors and young adults. See Design It For Us Policy Platform, Section II(A). The FTC has brought numerous cases against companies that collected from children, or caused to be collected from children on their behalf, persistent identifiers and/ or other personal information and used such information to deliver targeted ads without complying with the Children’s Online Privacy Protection Act’s verifiable parental consent requirement.
246 See, e.g., Complaint, US v. OpenX Techs., Inc., 2:21-cv-09693 (C.D. Cal. Dec. 15, 2021); Complaint, US v. Hyperbeard, Inc., 3:20-cv-03683 (N.D. Cal. June 3, 2020); Complaint, FTC and State of NY v. Google, LLC and YouTube, LLC, 1:19-cv-02642 (D.D.C. Sept. 4, 2019).
247 See also Complaint, US v. Edmodo, LLC, 3:23-cv-02495 (N.D. Cal. May 23, 2023) (alleging that ed tech provider violated COPPA by failing to obtain verifiable parental consent for collection and use of persistent identifiers for contextual advertising).
248 J. Pfeifer et. al, "What Science Tells Us About How to Promote Positive Development and Decrease Risk in Online Spaces for Early Adolescents," UCLA.
249 An August 2023 European Union (EU) Parliament Committee Report on addictive design recommends turning off all notifications by default. European Parliament Committee on the Internal Market and Consumer Protection, Report on Addictive Design of Online Services and Consumer Protection in the EU Single Market (“EU Report”) (Aug. 23, 2023). American Federation of Teachers (“AFT”) President Randi Weingarten also suggested at an October 2023 Safe Tech, Safe Kids conference that notifications could be muted while children are in school. Alternatively, lower-priority notifications might be muted by default (e.g., notifications about passive or less direct interactions like reactive emojis to the minor’s post, or notifications unrelated to the minor’s own posts, such as posts by accounts the minor follows, or a notification that the minor has not interacted with a certain user in a while).
250 A. Hern, "TikTok acts on teen safety with ‘bedtime’ block on app alerts," The Guardian (Aug. 12, 2021).
251 An August 2023 European Union (EU) Parliament Committee Report on addictive design recommends turning off all notifications by default. European Parliament Committee on the Internal Market and Consumer Protection, Report on Addictive Design of Online Services and Consumer Protection in the EU Single Market (“EU Report”) (Aug. 23, 2023). American Federation of Teachers (“AFT”) President Randi Weingarten also suggested at an October 2023 Safe Tech, Safe Kids conference that notifications could be muted while children are in school. Alternatively, lower-priority notifications might be muted by default (e.g., notifications about passive or less direct interactions like reactive emojis to the minor’s post, or notifications unrelated to the minor’s own posts, such as posts by accounts the minor follows, or a notification that the minor has not interacted with a certain user in a while).
252 A. Hern, "TikTok acts on teen safety with ‘bedtime’ block on app alerts," The Guardian (Aug. 12, 2021).
253 C. Odgers, N. Allen, J. Pfeifer, R. Dahl, J. Nesi, S. Schueller, J. Williams, and the National Scientific Council on Adolescence, "Engaging, safe, and evidence-based: What science tells us about how to promote positive development and decrease risk in online spaces," Council Report No 2, (2022). The FTC recently proposed amendments that would clarify that an operator’s online notice must indicate the use of children’s personal information to encourage or prompt use of the operator’s website or online service, such as through a push notification (89 Fed. Reg. at 2050) and to exclude from the internal operations and multiple contacts exceptions to COPPA’s VPC requirement the use of persistent identifiers to encourage or prompt use of a website or online service. (89 Fed. Reg. at 2053. 2059, 2074).
254 Yolanda Reid et al., "Media and Young Minds," Pediatrics Vol. 138, iss. 5.
255 "Blue Blocker Glasses as a Countermeasure for Alerting Effects of Evening Light-Emitting Diode Screen Exposure in Male Teenagers," at 114 (explaining that excessive blue light exposure can negatively affect circadian physiology in male youth).
256 An August 2023 European Union (EU) Parliament Committee Report on addictive design recommends turning off all notifications by default. European Parliament Committee on the Internal Market and Consumer Protection, Report on Addictive Design of Online Services and Consumer Protection in the EU Single Market (“EU Report”) (Aug. 23, 2023). American Federation of Teachers (“AFT”) President Randi Weingarten also suggested at an October 2023 Safe Tech, Safe Kids conference that notifications could be muted while children are in school. Alternatively, lower-priority notifications might be muted by default (e.g., notifications about passive or less direct interactions like reactive emojis to the minor’s post, or notifications unrelated to the minor’s own posts, such as posts by accounts the minor follows, or a notification that the minor has not interacted with a certain user in a while).
257 Alex Hern, "TikTok acts on teen safety with ‘bedtime’ block on app alerts," The Guardian (Aug. 12, 2021).
258 FTC Video Game Loot Box Workshop, Staff Perspective (Aug. 2020) at 1, (“Broadly speaking, a loot box is a video game microtransaction in which the consumer purchases a reward containing one or more virtual items of differing value or rarity assigned at random.”).
259 D. Yeager, R. Dahl, and C. Dweck, "Why interventions to influence adolescent behavior often fail but could succeed," Perspectives on Psychological Science : a Journal of the Association for Psychological Science 13, no. 1, (2018): 101-122; see also J. Nesi, E. Telzer and M. Prinstein, “Adolescent Development in the Digital Media Context,” Psychological Inquiry 31, no. 3 (2020): 230.
260 "Facebook Knows Instagram Is Toxic for Teen Girls, Company Documents Show," The Wall Street Journal, accessed September 12, 2022.
261 E.g., comment of APA to NTIA KOHS RFC at 5 ("platforms are more apt to motivate users towards one’s metrics than people themselves, which has led many youths to upload curated or filtered content to portray themselves most favorably. Note that these features of social media, and the resulting behaviors of those who use social media create the exact opposite qualities needed for successful and adaptive relationships (i.e., disingenuous, anonymous, depersonalized). In other words, social media offers the 'empty calories of social interaction,' that appear to help satiate our biological and psychological needs, but do not contain any of the healthy ingredients necessary to reap benefits.").
262 See In Re: Social Media Adolescent Addiction/Personal Injury Products Liability Litigation, at 135.
263 Courts and Tribunals Judiciary, Molly Russell: Prevention of Future Deaths Report, (13 October 2022)
264 Fairplay/CDD Petition, at 66-67
265 Id. Design it For Us generally supports eliminating or hiding by default public displays of engagement and engagement metrics, such as likes, views, etc. Design it For Us, Policy Platform, Section I(A), Appendix (last visited Dec. 11, 2023).
266 Take It Down website, National Center for Missing & Exploited Children.
268 See, e.g., U.S. Dep’t of Justice, 2023 National Strategy for Child Exploitation Prevention & Interdiction, Unique Resource and Enforcement Issues Subject Matter Expert Working Group Report, at 7 (“Law enforcement also has difficulties when companies fail to provide any information in response to a valid search warrant, when tech companies provide the requested information in an unreadable format, when there is a significant delay in receiving information in response to a search warrant, ... .”).
269 For example, see the Technology Coalition’s “Project Lantern”.
270 Know 2 Protect, Department of Homeland Security.
271 In this context, red-teaming refers to the process by which a company (1) tests its generative AI platform to determine whether and how the platform can be used to generate CSEA material, and (2) uses that information to implement remedial measures to prevent further creation of such material.
272 See, e.g., National Strategy for Child Exploitation Prevention and Interdiction, A Report to Congress, Dept. of Justice, p. 21; See also Subject Matter Expert Working Group Report on Livestreaming and Virtual Child Sex Trafficking at 9, (“Livestreaming platforms should pursue, and federal agencies support the development of, new technological services to detect initial captures and redistributions of livestreamed child sexual abuse online”).
273 See, e.g., Apple, App Store developers generated $1.1 trillion in total billings and sales in the App Store ecosystem in 2022 (May 31, 2023) (as of May 31, 2023, Apple’s app store had “nearly 1.8 million” apps available for download).
274 See, e.g., Minnesota Attorney General’s Report on Emerging Technology and Its Effects on Youth Well-Being, at 26 (Feb. 2024).
275 D. Patton, R. Eschmann, and D. Butler, "Internet banging: New trends in social media, gang violence, masculinity and hip hop," Computers in Human Behavior, 29(5), A54-A59.
276 "Hate Is No Game: Hate and Harassment in Online Games," ADL (2022): (explaining that gamers age 10-17 may experience acts of discrimination that traditional chat moderation tools may not catch such as being excluded from joining a game/chat or being “griefed” or “trolled”).
277 M. Popa-Wyatt, "Reclamation: Taking Back Control of Words," (2020) (defining reclamation as “taking back control by targets of words used to attack them” and explaining that a reclaimed speech-act may be empowering rather than harmful as it strives to remove the harmful context of the initial use).
278 N. Duarte et al., "Mixed Messages? The Limits of Automated Social Media Content Analysis," at 19 (explaining how algorithms that struggle to identify socio-ethnic dialects have lower accuracy rates for those communities).
279 R. Kowert, "Moderation Challenges in Digital Gaming Spaces: Prevalence of Offensive Behaviors in Voice Chat," Take This (Aug. 16, 2023): at 6 (defining the types of discriminatory speech moderated by “ToxMod,” a resource that preemptively works to moderate chat so that users do not need to do the moderation themselves).
280 M. Al-Garadi, M. Hussain, N. Khan, G. Murtaza, H. Nweke, I. Ali, G. Mujtaba, H. Chiroma, H. Khattak, and A. Gani, "Predicting Cyberbullying on Social Media in the Big Data Era Using Machine Learning Algorithms: Review of Literature and Open Challenges," IEEE Access, 7 (2019): 70701–70718.
281 M. Hasan, M. Hossain, M. Mukta, A. Akter, M. Ahmed, and S. Islam, "Review on Deep-Learning-Based Cyberbullying Detection," Future Internet, 15 (2023): 179.
282 M. Dadvar, D. Trieschnigg, R. Ordelman, and F. de Jong, "Improving cyberbullying detection with user context," in Advances in Information Retrieval, Berlin, Germany: Springer, 2013, pp. 693–696.
283 National Academies of Sciences, Engineering, and Medicine, "Preventing Bullying Through Science, Policy, and Practice," The National Academies Press.
284 H. Clayton, G. Kilmer, S. DeGue, L. Estefan, V. Le, N. Suarez, B. Lyons, and J. Thornton, "Dating Violence, Sexual Violence, and Bullying Victimization among High School Students – Youth Risk Behavior Survey, United States, 2021," MMWR Suppl. 2023;72(1).
285 U.S. Department of Education, Office for Civil Rights, "Student Discipline and School Climate in U.S. Public Schools," ed.gov, at 16 (last reviewed May 16, 2024).
286 "Student Discipline and School Climate in U.S. Public Schools," Civil Rights Data Collection, Office for Civil Rights, Department of Education (Nov. 2023), at 16.
287 "The Ruderman White Paper Reveals: Students with Disabilities are Almost Twice as Likely to Be Victims of Cyberbullying," Ruderman Family Foundation (rudermanfoundation.org) (June 2019).
288 M. Al-Garadi, M. Hussain, N. Khan, G. Murtaza, H. Nweke, I. Ali, G. Mujtaba, H. Chiroma, H. Khattak, and A. Gani, "Predicting Cyberbullying on Social Media in the Big Data Era Using Machine Learning Algorithms: Review of Literature and Open Challenges," IEEE Access, 7 (2019): 70701–70718.
289 M. Dadvar, D. Trieschnigg, R. Ordelman, and F. de Jong, "Improving cyberbullying detection with user context," in Advances in Information Retrieval, Springer, (2013): 693–696.
290 D. Finkelhor, H. Turner, and D. Colburn, "Which dynamics make online child sexual abuse and cyberstalking more emotionally impactful: Perpetrator identity and images?," Child Abuse & Neglect, Volume 137, (2023): (noting the emotional impact from sexual image misuse by peers, who made up a majority of offenders in this study, was just as great as with adult offenders).
291 M. Hasan, M. Hossain, M. Mukta, A. Akter, M. Ahmed, and S. Islam, "Review on Deep-Learning-Based Cyberbullying Detection," Future Internet, 15 (2023): 179 (recommending deep-learning-based systems for detecting cyberbullying).
292 M. Dadvar, D. Trieschnigg, R. Ordelman, and F. de Jong, "Improving cyberbullying detection with user context," in Advances in Information Retrieval, Springer, (2013), pp. 693–696 (recommending the inclusion of cyberbullying-specific features and terminology in detection tools to improve their capabilities).
293 N.A. Samee, U. Khan, S. Khan, M. Jamjoom, M. Sharif, D. Kim, "Safeguarding Online Spaces: A Powerful Fusion of Federated Learning, Word Embeddings, and Emotional Features for Cyberbullying Detection," IEEE Access, vol. 11, pp. 124524-124541, 2023, doi: 10.1109/ACCESS.2023.3329347 (recommending the use of natural language processing models to address cyberbullying concerns while preserving data privacy).
294 U.S. Department of Health and Human Services, "Bystanders to Bullying," StopBullying.gov, last reviewed October 23, 2018; "Effects of Bullying," StopBullying.gov.
295 U.S. Department of Health and Human Services, "Cyberbullying Tactics," StopBullying.gov, last reviewed May 10, 2018.
296 U.S. Department of Health and Human Services, "Preventing Cyberbullying," StopBullying.gov, last reviewed November 10, 2021.
297 U.S. Department of Health and Human Services, "Social Media, Apps, and Sites Commonly Used by Children and Teens," StopBullying.gov.
298 U.S. Department of Health and Human Services, "Cyberbullying and Online Gaming," StopBullying.gov; "Cyberbullying Tactics," StopBullying.gov.
299 988 Suicide and Crisis Lifeline. Other resources include the Know2Protect hotline and the NCMEC CyberTipline reporting link.
300 Take It Down website, National Center for Missing & Exploited Children.
301 Pichel et al., "Analysis of the relationship between school bullying, cyberbullying, and substance use," Children and Youth Services Review, Vol. 134.
302 Pichel et al., "Analysis of the relationship between school bullying, cyberbullying, and substance use," Children and Youth Services Review, Vol. 134.
303 Kowalski et al., "Racial Differences in Cyberbullying From the Perspective of Victims and Perpetrators," American Journal of Orthopsychiatry (identifying racial and gender differences in who perpetrates and experiences bullying).
304 Digital Wellness Lab at Boston Children’s Hospital, "Creating a Positive Foundation for Greater Civility in the Digital World," Boston Children’s Hospital.
305 See comment on NTIA KOHS RFC from CIPL at 64 (stating service providers are recommended to “promote parental controls that respect the child’s privacy and best interests”), and The James Madison Institute comment at 7.
306 D. Schiano and C. Burg, “Parental Controls: Oxymoron and Design Opportunity,” International Conference on Human-Computer Interaction (2017).
307 Comment on NTIA KOHS RFC from FOSI (“Parents are overwhelmed by the many different types of parental controls, where to find them, how to use them, and what the tools do.”); Comment of ESA (stating that video game platforms and video games have their own parental control tools).
308 L. Clark, “Digital Media and the Generation Gap: Qualitative research on U.S. teens and their parents,” Information, Communication, and Society, 12(3), 388-407; P.R. Rasi, H. Vuojärvi, and H. Ruokamo, “Media Literacy Education for All Ages,” Journal of Media Literacy Education, 11(2), 1-19 (2019); and LGBT Tech comment (noting “parental control requirements that strip youth of any autonomy in their online experiences and therefore isolate or out LGBTQ+ youth”).
309 See comment on NTIA KOHS RFC from American Academy of Pediatrics (stating “the effects of social media on well-being are nested within family relationships and household dynamics, and one size-fits all mandatory parental control over teen media use is not developmentally appropriate for many families.”).
310 See comment on NTIA KOHS RFC from American Academy of Pediatrics.
311 See comment on NTIA KOHS RFC from 5 Rights at 2 (noting parental controls do not adequately address the problem as they are unable to address issues related to the interactivity of digital products and services).
312 See generally Google Families | Empowering kids to safely connect, play, and learn online, and Tips for Parents on Helping Your Teen Stay Safe on Discord.
313 See, e.g., comment on NTIA KOHS RFC from Future of Privacy Forum (FPF) (pointing to some options where kids under age 13 have strict default settings, which can include “the disabling of purchasing, email marketing or push notifications, custom display names, communications with players using voice chat or free text chat, and recommendations based on past activity”).
314 Comment on NTIA KOHS RFC from Entertainment Software Association (“ESA”) at 8 (regarding in-game communications). For other measures, see Tips for Parents on Helping Your Teen Stay Safe on Discord.
315 See, e.g., comment on NTIA KOHS RFC from Future of Privacy Forum (FPF); Comment of American Academy of Pediatrics on NTIA KOHS RFC; and Tips for Parents on Helping Your Teen Stay Safe on Discord.
316 Comment on NTIA KOHS RFC from American Academy of Pediatrics.
317 Comment on NTIA KOHS RFC from Roblox at 11.
318 Comment on NTIA KOHS RFC from #Shepersisted at 3 (“Platform design choices that impact minors specifically, such as ineffective parental control mechanisms, photo filters that alter a user’s physical appearance, a lack of mechanisms for restricting time spent on platforms, not providing labels for filtered content, weak systems for age verification, ineffective reporting systems for predatory accounts, and difficulties surrounding account deletion are examples of fair territory for oversight.”).
319 See Comment on NTIA KOHS RFC from #Shepersisted at 3.
320 See Comment on NTIA KOHS RFC from #Shepersisted at 3.
321 105-2 Hearing: S. 2326, Children’s Online Privacy Protection Act of 1998, comment from then-FTC Chairman Robert Pitofsky (stating “the practice of collecting personal identifying information directly from children without parental consent is clearly troubling, since its [sic] teaches children to reveal their personal information to strangers and circumvents parental controls over their family’s information.”). While the circumstances have changed since COPPA was passed, the sentiment remains the same: where parental controls are used, requests for access to youth data should go through parental control platforms.
322 See comment on NTIA KOHS RFC from Future of Privacy Forum (FPF).
323 See NTIA KOHS RFC ESA comment (stating “Understanding a game’s features and its age appropriateness begins before a caregiver purchases or downloads a game.”). That is, social media and online platforms should learn from video games and provide proactive, easy-to-understand information about their features and age appropriateness.
324 See NTIA KOHS RFC App Association comment at 5 (stating “enabl[e] parental control settings on their children’s devices to make sure they do not have access to inappropriate information and reading privacy policies that the child likely does not understand due to their age.”).
325 C. Vogus, T. Greene, D. Martens, and G. Shmueli, "Improving Researcher Access to Digital Data: A Workshop Report," Center for Democracy & Technology (2021).
326 T. Greene, D. Martens, and G. Shmueli, "Barriers for Academic Data Science Research in the New Realm of Behavior Modification by Digital Platforms," (October 20, 2021).
327 National Academies of Sciences, Engineering, and Medicine, "Social Media and Adolescent Health," The National Academies Press.
328 “EDMO Releases Report on Researcher Access to Platform Data,” EDMO, accessed July 19, 2024.
329 Transparent and accountable online platforms | Shaping Europe’s digital future (europa.eu).
330 Status Report: Mechanisms for Researcher Access to Online Platform Data | Shaping Europe’s digital future (europa.eu).