No matter how good a user experience designer is at her job, it’s almost impossible to predict what mobile consumers will do with your app once it’s out of your hands — try as you might, you can’t prevent people from using your product in unexpected and sometimes immoral ways.
The creators of Nextdoor know this all too well. Launched in 2010, Nextdoor is a free private social network on which neighbors and community members can communicate with one another in gated discussions and threads. The site aims to improve neighborhood safety and cohesion by giving people a platform to organize everything from block-wide yard sales to neighborhood watch groups. Users set up a profile, confirm their address, and can then post classified ads or announcements that might interest the wider community.
Soon after its launch, however, the designers noticed a problem with how some not-so-friendly neighbors were using the app. While features for organizing babysitters and recommending HVAC contractors were relatively benign, the platform's crime and safety section — where users were filing incident reports based on the color of a "suspicious person's" skin, and little else — set off alarms.
In March of 2015, Fusion published an article exposing the way Nextdoor had become a home for racial profiling. The story detailed how the app, while a great tool for assisting with crimespotting, was also facilitating some of the same bigotry we see playing out in communities all across the country. “Rather than bridging gaps between neighbors, Nextdoor [was becoming] a forum for paranoid racialism,” wrote Pendarvis Harshaw, calling it “the equivalent of the nosy Neighborhood Watch appointee in a gated community.”
Nirav Tolia, Nextdoor’s chief executive and co-founder, was floored when the article came out. “I’m a person of color, so it really cut deep,” Tolia said in a recent interview with Fusion, for an updated story about his company. “We hated the idea that something we built would be viewed as racist… [and] I was powered by the challenge to do something about it.”
Inspired by the work of Stanford psychologist Jennifer Eberhardt, who studies how race can influence the judicial system and trains police officers to recognize and overcome their own biases, the Nextdoor team brainstormed ways to redesign the app in order to combat implicit opportunities for racism.
To address the problem, Nextdoor broke a cardinal rule of contemporary user experience design: they added friction to the platform's interface. Rather than minimizing the steps needed to complete a process or reducing the number of clicks it takes to share a post, they actually added steps, making incident reporting slightly more complicated. "We tried to create decision points," Tolia told Fusion, "to get people to stop and think."
As Wired explains, the new interface uses a series of checkpoints to help users evaluate the content of their report. The moment you begin to create a crime and safety message, before a text box even opens, the platform asks whether the message is about an actual crime or merely "suspicious activity." If a user selects "suspicious activity," a list appears, reminding them to focus on behavior rather than appearance when reporting a suspicion, and explicitly instructing them not to "assume criminality based on someone's race or ethnicity."
From there, users are guided through the completion of a suspicious activity report, and any time race is mentioned, the user hits a roadblock. When describing what a person looks like, there are eight description boxes. If the race box is filled in, the form requires that at least two other boxes be completed as well. Nextdoor has also added a "racial profiling" category for people who want to flag a conversation as inappropriate, and it provides additional training to "neighborhood leads" — volunteers who moderate discussions — on how to recognize toxicity in conversations involving race.
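The friction described above boils down to a simple validation rule: a description that mentions race cannot stand on its own. As a rough illustration only — the field names, the two-box threshold's wording, and the error message are assumptions, not Nextdoor's actual implementation — the check might look something like this:

```python
# Hypothetical sketch of the "decision point" validation the article
# describes: if a suspicious-activity report fills in the race field,
# require at least two other descriptive fields before accepting it.
# Field names and messages are illustrative assumptions.

DESCRIPTION_FIELDS = [
    "race", "clothing_top", "clothing_bottom", "shoes",
    "hair", "age", "build", "distinguishing_features",
]

def validate_report(fields: dict) -> list:
    """Return a list of validation errors for a suspicious-activity report."""
    errors = []
    # Treat a field as "filled" only if it contains non-whitespace text.
    filled = {k for k, v in fields.items() if v and v.strip()}
    if "race" in filled:
        other_details = filled - {"race"}
        if len(other_details) < 2:
            errors.append(
                "Descriptions that mention race must include at least two "
                "other details, such as clothing, hair, or build."
            )
    return errors
```

In this sketch, a report containing only a racial description is rejected with a prompt to add behavioral or physical detail, while a report that never mentions race passes through untouched — mirroring the idea that the friction appears only at the moment profiling is most likely.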
By making the app less intuitive, Nextdoor was able to reduce racist user behavior almost immediately. According to a study completed by Nextdoor in the process of restructuring the platform, “there was a 75% reduction in racial profiling posts by people using the new version of the site, compared to those in the control group who were still using the original version of the site.” The company has since rolled out the new version to all of its users. While it may be impossible to change someone’s attitude about race, it’s certainly possible to try to change their actions.
When new app developers sit down to create an equitable platform, the last thing they want to be accused of is designing an app that encourages racism, sexism, or any other bigoted view.
To guard against this unsavory byproduct, it is sometimes beneficial for tech and app designers to build decision or deterrent points into a platform. While such friction may discourage some users from posting, if the ultimate goal is to maintain an equitable app, these moments of "self-checking" encourage people to stop and think about their own biases. Get started on your own, bias-free app today with AppMakr, the original #makeanapp DIY app platform.
Jamie Ayers, "Out-of-the-Box UX Design Can Eradicate Racist User Behavior — Just Ask 'Nextdoor'," September 29, 2016