FACT SHEET: Biden Administration Outlines Industry Guidance and Family Resources to Support Youth Safety Online
Today, the Biden-Harris Administration’s Task Force on Kids Online Health & Safety issued a set of practical strategies to promote the health, safety, and privacy of youth online. The Task Force calls on industry to take 10 critical steps to help young people safely navigate the digital ecosystem.
The Task Force, co-led by the National Telecommunications and Information Administration (NTIA) and the Substance Abuse and Mental Health Services Administration (SAMHSA), found that design choices made by industry cause many of the harms young people experience online. This report urges industry to make design choices that prioritize kids’ well-being.
Design age-appropriate experiences for youth users.
- Consider the varied experiences of youth users across different demographics, developmental stages, needs, anticipated uses, home environments, and experience levels.
- In product design and risk-mitigation efforts, identify features that may be beneficial or neutral in some contexts but harmful in others (e.g., features designed to encourage daily use in an education app versus a gaming or chat app).
- Engage youth, parents and caregivers, relevant stakeholders, and experts throughout the product development lifecycle. This should include young people from different backgrounds and age groups.
Make privacy protections for youth the default.
- Strictly limit the collection of minors’ data and personal information. This includes collecting only the personal information necessary for participation in the service.
- Turn off direct messaging for minors by default, but allow teens to opt into this feature.
- Ensure that minors’ accounts are not publicly discoverable by default.
- Do not enable targeted advertising to minors based on their activities, whether on or off the platform.
- Do not enable by default content recommendation algorithms that are based on personalized profiling of a minor.
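As a purely illustrative sketch, the Python snippet below shows one way a platform might encode these defaults for accounts identified as belonging to minors. The class and field names are hypothetical and are not drawn from any particular platform.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class MinorPrivacyDefaults:
    """Hypothetical privacy defaults applied when a user is identified as a minor."""
    direct_messaging: bool = False        # off by default; teens may opt in
    publicly_discoverable: bool = False   # account not searchable or suggested by default
    targeted_ads: bool = False            # no ad targeting based on on- or off-platform activity
    personalized_feed: bool = False       # no profiling-based recommendations by default
    data_collection: str = "minimum_necessary"  # collect only what the service requires

def settings_for_new_account(is_minor: bool) -> MinorPrivacyDefaults:
    """Return protective defaults for minors; any loosening requires an explicit opt-in."""
    if is_minor:
        return MinorPrivacyDefaults()
    # Adults start from the same object but may enable additional features themselves.
    return MinorPrivacyDefaults(direct_messaging=True, publicly_discoverable=True)
```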
Reduce and remove features that encourage excessive or problematic use by youth.
- Minimize unnecessary notifications, including by muting them during certain times, such as school hours or at night (an illustrative sketch follows this list).
- Populate feeds chronologically by default, or consider other measures to replace infinite scroll, which emphasizes prolonged engagement.
- Minimize addictive features of mobile games, such as loot boxes and other incentives that lure players into playing longer out of fear of losing a reward or otherwise affecting gameplay.
- Set longer time limits for stories or ephemeral content to reduce the sense of urgency that keeps users unnecessarily engaged for fear of missing out. Alternatively, keep such content available until a user has accessed it, within a longer set time period.
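The sketch below illustrates the notification-muting idea above, assuming hypothetical quiet-hour windows for school and nighttime; in practice these windows would be configurable by teens and their parents or caregivers rather than hard-coded.

```python
from datetime import time

# Hypothetical quiet-hour windows (local time); real values would be user-configurable.
SCHOOL_HOURS = (time(8, 0), time(15, 0))
NIGHT_HOURS = (time(21, 0), time(7, 0))

def in_window(now: time, start: time, end: time) -> bool:
    """Return True if `now` falls in the window, handling windows that cross midnight."""
    if start <= end:
        return start <= now <= end
    return now >= start or now <= end

def should_deliver_notification(now: time, is_minor: bool) -> bool:
    """Suppress non-essential notifications for minors during school and night hours."""
    if not is_minor:
        return True
    return not (in_window(now, *SCHOOL_HOURS) or in_window(now, *NIGHT_HOURS))
```

For example, `should_deliver_notification(time(10, 30), is_minor=True)` would return False, holding the notification until quiet hours end.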
Limit “likes” and social comparison features for youth by default.
- Hide by default the number of connections, friends, or followers for a minor’s account or piece of content.
- Cap or remove by default – as appropriate for age – likes and related emojis, views, dislikes, or other interactions for a minor’s posts and others’ posts that the minor views.
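For illustration, the sketch below shows a hypothetical filter that hides engagement counts from a minor’s view by default while allowing an explicit, age-appropriate opt-in; the function and field names are assumptions, not any platform’s actual API.

```python
def visible_engagement_counts(post_metrics: dict, viewer_is_minor: bool,
                              viewer_opted_in: bool = False) -> dict:
    """Hide like/view/dislike counts from minors by default.

    `post_metrics` might look like {"likes": 120, "views": 4000, "dislikes": 3}.
    Counts are returned as None (rendered as hidden in the UI) unless the minor
    has opted back in where age and policy allow it.
    """
    if viewer_is_minor and not viewer_opted_in:
        return {key: None for key in post_metrics}
    return post_metrics
```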
Develop and deploy mechanisms and strategies to counter child sexual exploitation and abuse.
- Join the National Center for Missing and Exploited Children’s (NCMEC) “Take It Down” initiative, which helps children and youth under 18 privately and anonymously seek the removal of intimate images shared without their consent from online platforms (an illustrative sketch follows this list).
- Implement measures to detect and respond to grooming language.
- Explore ways to identify and disrupt problematic online interactions between adults and minors, or between minors.
- Implement best practices and develop guidelines for AI red-teaming and other measures throughout the AI development life cycle to reduce the risk of AI models being used to generate child sexual exploitation and abuse material.
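NCMEC’s Take It Down service provides participating platforms with hash values of reported intimate images so the platforms can detect and remove matches. The sketch below illustrates hash-list matching in the abstract; the function names are hypothetical, and real deployments typically rely on perceptual hashes (which survive re-encoding and resizing) rather than the simple cryptographic hash used here.

```python
import hashlib

def sha256_of_file(path: str) -> str:
    """Compute a hash of an uploaded file (illustration only; production systems
    generally use perceptual hashing so near-duplicates also match)."""
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def should_block_upload(path: str, takedown_hashes: set[str]) -> bool:
    """Block, and queue for human review, any upload matching a reported hash."""
    return sha256_of_file(path) in takedown_hashes
```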
Disclose accurate and comprehensive safety-related information about apps.
- Inform parents of the possibility of communication between adults and children on the app.
- Develop and inform parents about age verification tools built into the app or available at the device level.
- Maximize protections on devices belonging to minors.
Improve systems to address bias and discrimination that youth experience online.
- Deploy and improve the use of manual and automated moderation of discriminatory content and activity.
- Prioritize moderation systems for features that pose the highest risks of discriminatory conduct against young people (e.g., voice chat in gaming).
- Evaluate automated content moderation tools for bias and discrimination, including across multiple languages.
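One way to evaluate an automated moderation tool for bias across languages is to compare its decisions against human review labels language by language, as in the hypothetical sketch below; the record format and field names are assumptions for illustration only.

```python
from collections import defaultdict

def error_rates_by_language(samples: list[dict]) -> dict[str, dict[str, float]]:
    """Compute per-language false-positive and false-negative rates for a moderation model.

    Each sample is assumed to look like:
        {"language": "es", "model_flagged": True, "human_label_violation": False}
    Large gaps between languages signal that the tool needs language-specific review.
    """
    counts = defaultdict(lambda: {"tp": 0, "fp": 0, "tn": 0, "fn": 0})
    for s in samples:
        c = counts[s["language"]]
        if s["human_label_violation"]:
            c["tp" if s["model_flagged"] else "fn"] += 1
        else:
            c["fp" if s["model_flagged"] else "tn"] += 1
    rates = {}
    for lang, c in counts.items():
        negatives, positives = c["fp"] + c["tn"], c["fn"] + c["tp"]
        rates[lang] = {
            "false_positive_rate": c["fp"] / negatives if negatives else 0.0,
            "false_negative_rate": c["fn"] / positives if positives else 0.0,
        }
    return rates
```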
Use data-driven methods to detect and prevent cyberbullying and other forms of online harassment and abuse.
- Allow muting/blocking of problematic users, even if the behavior does not rise to the level of violating platform policies.
- To reduce the risk of image-based sexual abuse among peers, employ image-blurring technology so that users view only images they consent to receive (an illustrative sketch follows this list).
- Identify specific groups that may be more likely to experience cyberbullying in order to develop tailored interventions for the platform.
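As a minimal sketch of the image-blurring approach referenced above, the snippet below delivers incoming images blurred and reveals them only on the recipient’s explicit consent; the data structure and function names are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class IncomingImage:
    sender_id: str
    image_id: str
    blurred: bool = True   # delivered blurred until the recipient consents to view it

def respond_to_image(image: IncomingImage, recipient_consents: bool) -> IncomingImage:
    """Reveal the image only after an explicit, affirmative choice by the recipient.

    Declining keeps the image blurred; a real implementation would also surface
    options to block or report the sender at this point.
    """
    if recipient_consents:
        image.blurred = False
    return image
```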
Provide age-appropriate parental control tools that are easy to understand and use.
- Parental control tools could include supervised accounts, limits on interactive functions such as chat, the ability to view browser and search history, the ability to limit or block contacts, and time limits, among others.
- Make it easy for parents to change or reconfigure monitoring and controls as their children age.
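For illustration only, the sketch below suggests starting parental-control settings that loosen as a child ages; the age bands and default values are hypothetical, and a real implementation would let parents review and adjust each setting.

```python
# Hypothetical age-based defaults a parent could start from and then adjust.
DEFAULT_DAILY_LIMIT_MINUTES = {"under_13": 60, "13_15": 90, "16_17": 120}

def suggested_controls(age: int) -> dict:
    """Suggest starting parental-control settings appropriate to a child's age."""
    if age < 13:
        band = "under_13"
    elif age <= 15:
        band = "13_15"
    else:
        band = "16_17"
    return {
        "supervised_account": age < 16,
        "chat_limited_to_approved_contacts": age < 16,
        "daily_limit_minutes": DEFAULT_DAILY_LIMIT_MINUTES[band],
    }
```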
Make data accessible for verified, qualified, and independent research. The types of data include:
- Usage data, such as the number of minor users and their time spent online, including how certain features and designs increase or decrease time spent.
- Privacy and account settings chosen by users.
- Use of algorithmic and process-based recommendation systems, including, where appropriate, the personalized recommender systems themselves and the data used in those systems.
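As an illustrative sketch of privacy-preserving researcher access, the snippet below aggregates hypothetical per-user usage records into group-level statistics and suppresses small groups; the field names and the minimum group size are assumptions, not a prescribed standard.

```python
from collections import defaultdict

MIN_GROUP_SIZE = 100  # hypothetical threshold to avoid re-identifying individual users

def aggregate_usage(records: list[dict]) -> dict[str, dict[str, float]]:
    """Aggregate per-user usage records into group-level statistics for researchers.

    Each record is assumed to look like:
        {"age_band": "13-15", "minutes_per_day": 87, "autoplay_enabled": True}
    Age bands with fewer than MIN_GROUP_SIZE users are suppressed entirely.
    """
    groups = defaultdict(list)
    for record in records:
        groups[record["age_band"]].append(record)
    summary = {}
    for band, rows in groups.items():
        if len(rows) < MIN_GROUP_SIZE:
            continue  # suppress small cells
        summary[band] = {
            "n_users": len(rows),
            "avg_minutes_per_day": sum(r["minutes_per_day"] for r in rows) / len(rows),
            "share_autoplay_enabled": sum(r["autoplay_enabled"] for r in rows) / len(rows),
        }
    return summary
```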