The National Telecommunications and Information Administration (NTIA) requests comments addressing issues at the intersection of privacy, equity, and civil rights. The comments, along with information gathered through the three listening sessions that NTIA held on this topic, will inform a report on whether and how commercial data practices can lead to disparate impacts and outcomes for marginalized or disadvantaged communities.
The security and privacy landscape has continued to evolve since NTIA first asked about it in our 2015 Internet Use Survey. High-profile data breaches and debates about the role of technology in people’s lives have kept concerns about privacy and security at the forefront. The spread of emerging technologies such as smart home devices and always-on voice assistants, as well as business models predicated on the collection, use, and sale of personal information, means these concerns have taken on increased urgency.
As NTIA will be exploring in our listening sessions this week, these concerns are especially acute for those in marginalized or underserved communities. Members of these communities can face heightened risks of harm from the loss of privacy or the misuse of their data.
In 2019, most Internet-using households in America expressed concerns regarding digital privacy, according to data from the NTIA Internet Use Survey. Though fewer households reported concerns about digital privacy and security, and deterred online activities because of them, in 2017 than in 2015, rates held steady from 2017 to 2019. In 2019, 73 percent of Internet-using households had significant concerns about online privacy and security risks, and 35 percent said such worries led them to hold back from some online activities (see Figure 1).
Every day, personal information is used to make important decisions: about what advertisements we see, what types of health care are offered in our communities, and what fields of study our educational institutions believe we are best suited for.
The collection, processing, and sharing of personal information can create serious risks for everyone. For racial minorities, people living with disabilities, people living in poverty, and other marginalized and underserved communities, the risks can be especially acute.
For example, advertisers can both intentionally and inadvertently use digital tools that allow for harmful discrimination in ad targeting, potentially reproducing historical patterns of discrimination in areas such as housing or employment opportunities. Even when targeting criteria do not directly use traits such as race or gender, proxy indicators of these characteristics can nonetheless perpetuate discrimination.
The Biden Administration has made it a clear policy priority to advance racial equity and support underserved communities. As public policy discussions around privacy continue to advance, it is apparent that robust privacy protections are critical to achieving this goal.