Misplaced Hate Makes Disgrace: Changes Are Addressing Online Racial Profiling (Audio)
Nextdoor is a social-media platform that connects people with their real-life neighbors and cultivates an online community where residents can communicate with one another. It serves as a virtual town hall, community bulletin board, and crime-watch group, but because it functions as an extension of real-world interactions, there are some growing pains along the way. The company recently changed its product to address the problematic issue of racial profiling, and the changes it's made could help elicit a nationwide conversation about how the internet can address the harmful practice.
Today (August 23), NPR interviewed the company’s CEO Nirav Tolia in a piece titled “Social Network Nextdoor Moves To Block Racial Profiling Online.” In it, Tolia defines racial profiling as “anything that allows a person to stereotype an entire race,” and he acknowledges that it is a very difficult problem in society. That problem had migrated to the social network, where community residents were repeatedly painting with broad, racially tinged strokes when describing a potential crime in their neighborhoods. Nextdoor allows residents to fill out a form when reporting something suspicious, such as somebody breaking into a car. However, in its original form, the process did not require Nextdoor users to include specific details.
Problems repeatedly arose in this section of Nextdoor. For instance, a post like "I saw a dark-skinned man breaking into a car," with no details about what he was wearing, his hairstyle, or other distinguishing features, is problematic. It puts community members on high alert for anyone who has dark skin, which is not only an ineffective way to fight crime but also promotes prejudicial action, consciously or otherwise.
As Tolia admits, "a series of forms can't stop making people racist," but the changes to the platform are making it much harder for Nextdoor users to engage in racial profiling. Now, when a user logs in to post about crime and safety, vague descriptions are no longer allowed. "The algorithm forces users who mention race to give other details, otherwise one cannot post," NPR reports.
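Nextdoor hasn't published how its check works, but the rule NPR describes — a post that mentions race must include other identifying details before it can go through — can be sketched in a few lines. Everything below (the keyword lists, the two-detail threshold, the `can_post` function) is invented for illustration, not Nextdoor's actual code:

```python
# Hypothetical sketch of the posting rule NPR describes: mentioning race
# triggers a requirement for additional identifying details. All term
# lists and thresholds here are illustrative assumptions.

RACE_TERMS = {"black", "white", "dark-skinned", "light-skinned", "asian", "hispanic"}
DETAIL_TERMS = {"jacket", "hoodie", "jeans", "shoes", "hat", "beard",
                "tattoo", "tall", "short", "hair", "backpack"}
MIN_DETAILS = 2  # assumed threshold: at least two non-racial details

def can_post(description: str) -> bool:
    words = set(description.lower().replace(",", " ").split())
    if not words & RACE_TERMS:
        return True  # no race mentioned, no extra requirement
    # Race was mentioned: count concrete identifying details.
    return len(words & DETAIL_TERMS) >= MIN_DETAILS

# A vague, race-only description is blocked...
print(can_post("I saw a dark-skinned man breaking into a car"))          # False
# ...while one with concrete identifying details goes through.
print(can_post("I saw a dark-skinned man in a red hoodie with a beard")) # True
```

A real system would need far more than keyword matching, but even this toy version shows the design idea: the form doesn't forbid mentioning race, it just refuses to let race stand alone as a description.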
The tech company was compelled to change its product because of a grassroots campaign in Oakland, California. “A group called Neighbors for Racial Justice met with Nextdoor and handed over a blueprint for how to change the platform. Then, they got city officials to weigh in aggressively,” according to NPR. What’s happening in Oakland is already having an effect in other cities, too. “Nextdoor recruits police and city agencies into the network. They’re an added feature, a kind of Community Policing 2.0 that many users want. In the wake of the Dallas shootings, the police department there turned to Nextdoor to communicate safety updates to residents, and later to recruit for the police force. The network says it’s partnering with more than 1,600 public agencies in the U.S.,” reports NPR.
Oakland Council member Annie Campbell, praising Nextdoor's added features, described something that could very well help other online platforms think about how to combat online racial profiling. Campbell says it's rare for a company to encourage users of its product "to think about the person on the other side of the screen who's of a different race, a different ethnicity, and think about how that post may affect their lives."
In the coming weeks, Nextdoor hopes to roll out the added features to its communities across the country.
As Nextdoor's nationwide rollout begins and its thousands of users begin to approach the reporting of crime differently, one can only begin to think of ways to implement similarly aggressive changes in other realms of life where racial profiling is harmful. Racial profiling in police departments is a systemic plague in the U.S. today, but the Department of Justice's recent scathing findings about the Baltimore P.D. could very well prove to be a place to start in combating the issue in the streets.