CASE STUDY: More Dates, Less Creeps
Using gamification to guard against dating app abuse while maintaining user privacy.
Background
Dating apps continue to gain prominence, and their use grew especially during the pandemic, when face-to-face dating was limited. Once a match is made, early communication generally happens through the app's private chat feature.
Challenge
Can a dating app allow for protections against online abuse without sacrificing privacy?
Research revealed a common frustration among a specific user persona: despite attempts by large dating app makers to build in safeguards against abuse, female users receive a great deal of unwanted attention. As a result, abandonment of dating apps among these users is high.
One interviewee summarized the issue:
"The problem with most dating apps is that female users get a lot of attention – usually the WRONG kind of attention."
One possible solution is hiring moderators to help police the app, but that can be cost-prohibitive and raises privacy concerns. Another option is using artificial intelligence to detect harassers, but AI is also expensive and often lacks the nuance required to make the right call, which can drive up user frustration.
Solution
A gamification strategy, a way of motivating and engaging users by applying rules common to gameplay, was presented as a potential solution. A community policing model could prove effective, since patterns of harassment can be identified and purged from the system. This model was well received in a focus group, and early user testing showed a decrease in support calls.
Implementation:
- Users must accept a code of conduct: "The Rules of the Game."
- A user can report another participant who breaks the rules.
- If three separate people report a user, that user gets a "TIME OUT."
- After three "Strikes," the offender is removed from the platform.
Conversely, a reward system was developed for good community behavior, granting badges or other useful feature perks.
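
The report-and-strike mechanics can be modeled with a small amount of state per user. The Python sketch below is only an illustration of the thresholds listed above, not the app's actual implementation; the names (MemberRecord, REPORTS_PER_STRIKE, STRIKES_TO_REMOVE) are hypothetical, and only anonymous reporter IDs and counters are tracked, so no message content needs to be stored.

from dataclasses import dataclass, field

# Thresholds taken from the rules above; names are illustrative.
REPORTS_PER_STRIKE = 3   # distinct reporters needed to trigger a "TIME OUT"
STRIKES_TO_REMOVE = 3    # strikes before the offender is removed

@dataclass
class MemberRecord:
    # Only anonymous IDs and counters are kept, preserving user privacy.
    reporters: set = field(default_factory=set)  # distinct reporters for the current strike
    strikes: int = 0
    removed: bool = False

    def report(self, reporter_id: str) -> str:
        """Record a report against this user and return the resulting status."""
        if self.removed:
            return "removed"
        self.reporters.add(reporter_id)
        if len(self.reporters) < REPORTS_PER_STRIKE:
            return "noted"
        # Three separate reports: issue a strike ("TIME OUT") and reset the count.
        self.reporters.clear()
        self.strikes += 1
        if self.strikes >= STRIKES_TO_REMOVE:
            self.removed = True
            return "removed"
        return "time_out"

# Example: three distinct reports trigger a "TIME OUT".
record = MemberRecord()
for reporter in ("alice", "bob", "carol"):
    status = record.report(reporter)
print(status)  # time_out

A similar structure could, in principle, accumulate positive points for good behavior and feed the badge and perk system described above.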
Resolution
The mechanisms put in place continue to identify and eliminate serial harassers. The need for human moderators remains, but it is greatly reduced, and user privacy is maintained.