Social Network Nextdoor Moves To Block Racial Profiling Online | KERA News

Aug 23, 2016
Originally published on August 23, 2016 9:35 am

Think before you post.

That's not the message you typically get from Internet companies. The ethos on Facebook, Twitter and Instagram is to (over) share. But Nextdoor, a social network, has decided to block users from publishing certain posts, specifically when they appear to be racial profiling.

A techie tackles race

Talking about race and racial profiling does not come naturally to Nirav Tolia, the CEO of Nextdoor. And yet, he's doing it anyway.

"What someone considers to be racist is something that is, unfortunately in many cases, in the eye of the beholder," he says. "Why do some people like Trump and some people think that he's Satan?"

Tolia is a tech entrepreneur, not a politician. Nextdoor is a popular social network for neighborhoods. You use your real name and address to join an online group with your real neighbors.

The company is confronting a tough problem: How do you stop an activity when people can't even agree on how to define it? Jaywalking and speeding are easy. Racial profiling does not have a universally accepted definition, as criminology experts note.

In the face of public criticism by users who felt the site was permitting racism and fear-mongering, Nextdoor decided to create a working definition that is relatively broad: anything that allows a person to stereotype an entire race. And throughout this summer, in a move that's highly unusual for a tech company, Tolia and his engineers have been testing ways to put a stop to it online.

People engage in racial profiling "often not on purpose," Tolia says. It's implicit bias. For example, he says, a user might think: "If I look out my window, and I see someone breaking into a car, and the only thing I see is that they're dark-skinned, why can't I post [it]? That's all I see."

The problem with that post — "a dark-skinned man is breaking into a car" — is that, while the activity sounds like a crime, the description of the alleged perpetrator lacks any useful detail, like what he was wearing, his sneakers, his hairstyle or height.

"Because that message goes out to the entire neighborhood, where presumably many of the neighbors reading the post are dark-skinned, that would be considered racial profiling," Tolia explains.

Such posts were common on Nextdoor. Their net effect, Tolia says, was more hurtful than helpful: they generated animosity among neighbors rather than useful tips for law enforcement.

How it works

In a pilot project running in select neighborhoods across the U.S., the company has altered the rules for posting. When a user goes to post about a crime or suspicious activity, in the Crime & Safety section, a new form requires two physical descriptors — e.g. Nike sneakers, blue jeans, crew cut, brunette — if the user chooses to include the race of the person.

An algorithm under development spot-checks the summary of the suspicious activity for racially charged terms, as well as for length. If the description is too short, it is presumed to lack meaningful detail and is rejected.

If a draft post violates the algorithm's rules or leaves the form's mandatory fields incomplete, the user must revise it; until then, the post cannot be published.
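The gating described above can be sketched as a simple validator. Everything here is a hypothetical illustration: Nextdoor has not published its implementation, and the function name, term lists, and word-count threshold are all assumptions made for the sketch.

```python
# Hypothetical sketch of Nextdoor-style post gating. The term lists and the
# minimum-length threshold are illustrative assumptions, not the real rules.

MIN_DESCRIPTION_WORDS = 8  # too-short posts are presumed to lack detail
RACE_TERMS = {"black", "white", "dark-skinned", "light-skinned", "asian", "hispanic"}
CHARGED_TERMS = {"thug", "sketchy"}  # examples of flagged, loaded language


def validate_post(description: str, physical_descriptors: list[str]) -> list[str]:
    """Return a list of problems; an empty list means the post may be published."""
    problems = []
    words = description.lower().split()

    # Length check: a very short summary is rejected outright.
    if len(words) < MIN_DESCRIPTION_WORDS:
        problems.append("Description too short: add more detail.")

    # Spot-check for charged terms.
    if any(term in words for term in CHARGED_TERMS):
        problems.append("Remove charged language; describe specific behavior.")

    # If race is mentioned, require at least two physical descriptors
    # (e.g. clothing, height, hairstyle) from the form's mandatory fields.
    if any(term in words for term in RACE_TERMS) and len(physical_descriptors) < 2:
        problems.append("Posts mentioning race need two physical descriptors.")

    return problems
```

Under these assumptions, the vague post from the example above ("dark-skinned man breaking into a car", with no descriptors) comes back with problems to fix, while a detailed description with two descriptors passes; in the real product the equivalent check presumably runs before the post can be submitted.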

"This is a very, very, very difficult problem in society," Tolia says. "Do I believe that a series of forms can stop people from being racist? Of course I don't. That would be a ridiculous statement."

Fear of friction

The move to block posts sparked heated internal debate, Tolia admits. "It's highly unusual for a social network to say: If you don't do this, you cannot post. Highly unusual. I mean, think about Twitter or Facebook or Snapchat. There's no friction at all in the process of posting."

In tech, "friction" is a dirty word. Engineers rack their brains over how to shave seconds off the time it takes to broadcast you to the world.

Some Nextdoor engineers argued that the company should just politely suggest, not require, a better description. They pointed out that when people complain — about bullying, hate speech, revenge porn — on other social networks, those companies don't change their product.

"They may write a blog post, they may make a donation to charity, something like that," Tolia says.

Thus far, the company says there's been roughly a 50 percent reduction in racial profiling posts. Tolia's goal, he says, is to drive the number of instances down to zero.

Backstory: A local campaign

There's an interesting backstory here. Ultimately, it was a sustained grass-roots campaign in Oakland, Calif., that compelled the tech company to act.

A group called Neighbors for Racial Justice met with Nextdoor and handed over a blueprint for how to change the platform. Then, they got city officials to weigh in aggressively. For example, at a hearing last December, City Council member Desley Brooks said that if the company doesn't take steps to stop racial profiling, "we as a city ought to say that we will not allow our employees to continue to post on Nextdoor and validate this poor behavior."

Nextdoor recruits police and city agencies into the network. They're an added feature, a kind of Community Policing 2.0 that many users want. In the wake of the Dallas shootings, the police department there turned to Nextdoor to communicate safety updates to residents, and later to recruit for the police force. The network says it's partnering with more than 1,600 public agencies in the U.S.

Oakland Council member Annie Campbell Washington says at first Nextdoor employees involved in the discussion weren't willing to fundamentally alter the product. That changed when the CEO stepped in.

She says it's a rare win in Silicon Valley, to get a company to ask users sitting behind their screens "to think about the person on the other side of the screen who's of a different race, a different ethnicity, and think about how that post may affect their lives."

Some residents worried the grass-roots campaign was just the PC police. Campbell Washington recalls people writing in with questions: "Why would you engage in anything that limits people's expression? And especially people who are trying to keep their neighborhoods safe."

Then, the regular police weighed in. Oakland Lt. Chris Bolton says he would "much rather" have a detailed description about a factor that is "very unique" — the man who robbed me was wearing tennis shoes with red laces — than a vague description of just the sex and race of a person. He says the changes make Nextdoor more, not less, helpful for real police work.

Nextdoor plans to roll out changes to its entire U.S. network in the coming weeks.

Copyright 2018 NPR. To see more, visit http://www.npr.org/.

RENEE MONTAGNE, HOST:

Think before you post. That's not the message you typically get from Internet companies. The whole point of Facebook, Twitter and Instagram is to share everything. But now, there is one social network - it's called Nextdoor - that has decided to block certain posts when they appear to be racial profiling. NPR's Aarti Shahani reports.

AARTI SHAHANI, BYLINE: Talking about race and racial profiling does not come naturally to Nirav Tolia, the CEO of Nextdoor, but he's doing it anyway.

NIRAV TOLIA: This is a very, very, very difficult problem in society. Do I believe that a series of forms can stop people from being racist? Of course I don't. That would be a ridiculous statement.

SHAHANI: Tolia does believe he can make Nextdoor a more thoughtful place by changing how people post. Nextdoor is a popular social network for neighborhoods. You use your real name and address to join an online group with your real neighbors. And the company is tackling a tough problem - how do you stop an activity when people can't even agree on how to define it? Jaywalking, speeding - those are easy. Racial profiling does not have a universally accepted definition. Tolia offers a working one.

TOLIA: Anything that allows a person to stereotype an entire race.

SHAHANI: Some Nextdoor users do that a lot, especially when posting about a possible crime. And they don't realize it.

TOLIA: In many cases, people say, well, if I look out my window and I see someone breaking into a car, and the only thing I see is that they're dark-skinned, why can't I post a dark-skinned man is breaking into a car? That's all I see.

SHAHANI: The activity sounds like a crime, but the description of the guy lacks any useful detail, like what he was wearing, his sneakers, his hairstyle or height.

TOLIA: Because that message goes out to the entire neighborhood, where presumably many of the neighbors reading the post are dark-skinned, that would be considered racial profiling.

SHAHANI: This kind of post kept happening on Nextdoor, so they decided to put a stop to it by changing the product. Now, when you type in about crime and safety, you can't just say, sketchy guy hanging around building or standing on corner. The algorithm prompts you to describe the criminal activity. If you mention race, you have to give other details. Otherwise, you cannot post.

TOLIA: And there was a lot of debate around that internally.

SHAHANI: Why?

TOLIA: Because it's highly unusual for a social network to say, if you don't do this, you cannot post - highly unusual. I mean, think about Twitter or Facebook or Snapchat. I mean, there's no friction at all in the process of posting.

SHAHANI: In tech, friction is a dirty word. Engineers rack their brains over how to shave seconds off the time it takes to broadcast a post across seven continents. Tolia admits there was a lot of internal conflict. Some employees said Nextdoor should just politely suggest, not require, better descriptions. Even if users complain about bullying, hate speech, revenge porn, other social networks don't change their product.

TOLIA: They may write a blog post. They may make a donation to charity, something like that.

SHAHANI: There's an interesting backstory here. Ultimately, it was a grassroots campaign in Oakland that compelled the tech company to act. Listen to this threat from city councilwoman Desley Brooks at a hearing last December.

(SOUNDBITE OF ARCHIVED RECORDING)

DESLEY BROOKS: We, as a city, ought to say that we will not allow our employees to continue to post on Nextdoor and validate this poor behavior.

SHAHANI: A group called Neighbors for Racial Justice met with Nextdoor and handed over a blueprint for how to change the platform. Then they got local lawmakers on board. Nextdoor recruits police and city agencies into the network. They're an added feature - a kind of community policing 2.0 that many users want. Councilwoman Annie Campbell Washington says some residents worried this was just the PC police. They wrote her.

ANNIE CAMPBELL WASHINGTON: Why would you engage in anything that limits people's expression, and especially people who are trying to keep their neighborhood safe?

SHAHANI: Then the regular police weighed in. Oakland Lieutenant Chris Bolton.

CHRIS BOLTON: We would much rather have a very detailed description about a factor that is very unique, such as, you know, the man who robbed me was wearing tennis shoes with red laces, than I would a very vague description of just, you know, perhaps a sex and race of a person.

SHAHANI: He says the changes make Nextdoor more - not less - helpful for real police work. Aarti Shahani, NPR News, Oakland. Transcript provided by NPR, Copyright NPR.