You Had Me at ‘Has Never Filed for Bankruptcy’

Tinder is trying to make it easier to obtain data on potential partners. That could create more problems than it solves.


What does it mean to gather “verified” data on potential romantic partners? There’s something to be said for the idea that intimacy is based on having discretion to share information with others — on deciding how much of yourself to reveal to someone, and when, and how — as trust builds in a relationship.

Match Group — which owns dating and hookup platforms including Tinder, OKCupid and Match.com — is trying to make it easier to obtain data on potential partners. The company announced this month that it would help users run background checks on potential dates. Tinder users will be the first to receive the feature, which will allow them (for a fee not yet determined) to obtain public records on a match, based only on first and last name, or a first name and phone number.

That data, provided by a nonprofit company called Garbo, will include “arrests, convictions, restraining orders, harassment, and other violent crimes” in order to “empower users with information” to protect themselves. Garbo’s website also indicates that it accepts evidence submitted directly by users, “including police reports, orders of protection and more,” though it’s not clear whether this feature will be integrated into its arrangement with Match.

Gender-based violence is a serious and prevalent problem, experienced by one in four women and one in nine men at some point. Intimate platforms have come under fire for their lack of action when users report being assaulted by someone they met through the service.

Potential partners sometimes deceive each other, in ways both trivial and significant. So it’s no surprise that many people already take steps to check up on others before meeting in person — doing searches of names on Google, perusing social media profiles, even in some cases running formal background checks.

It’s laudable that Match Group wants to prevent its platforms from propagating sexual violence, and it’s attractive to try to fix the problem with technology. But we should be clear about the trade-offs. Technological measures that make us feel more secure may not always be as effective as they appear — and they can introduce a host of concerns around privacy, equity and the process of trust-building required for true intimacy to develop. If we normalize the practice of building a dossier of external data points on a person to avoid the risk of deception, we might upend an important aspect of creating close connections.

The risks associated with meeting potential partners stem in part from the way we tend to pair up today. Before the emergence of intimate platforms, more people met through common connections. In those cases, you had some knowledge about the person — he’s a friend of a friend, I know where she works — that allowed for inferences about the person and a degree of comfort about interacting.

Intimate platforms have changed the game: We increasingly meet online. And we may believe a digital record to be a full, “true” representation of someone. But these kinds of records are known to be far from perfect, especially when they rely on names alone, because records are often misattributed to people with the same or a similar name. They commonly include criminal convictions that were later expunged or charges that were ultimately dropped. It can be difficult for people with inaccurate records to become aware of them, and it’s sometimes impossible to obtain removal of errors or inconsistencies.

Moreover, a truly motivated bad actor can often circumvent policies like these by using a different name or phone number. So even to the extent that background checks appear to provide security, they can function more like a security blanket — they might give us the feeling of safety without actually ensuring it.

There’s also substantial social value in letting people shed stigmatizing or embarrassing information in these records. That is the rationale behind “ban the box” policies, which prevent employers from asking about criminal history on job applications in order to give applicants a fair chance at being hired. Letting people with stains on their records reintegrate into social life — including intimate relationships — has important social benefits.

Also, because data collection is often racially disproportionate — particularly in the context of involvement with the justice system — we should be mindful of who is most likely to be affected by policies like these. Match and Garbo have shown some foresight here: In recognition of the discrimination faced by Black Americans in the criminal justice system, they exclude drug possession offenses and traffic offenses (aside from D.U.I.s and vehicular manslaughter) from their background checks.

But even with these exclusions, over-policing of people of color, and racial bias present in all stages of the criminal justice system, should give us significant pause when drawing on criminal justice data. We should be especially careful about integrating these records into intimate platforms, which can be sites of racial exclusion and race-based harassment.

It’s not hard to imagine how background checks might open the door to other kinds of data. Do we want to start vetting our partners in the same way we decide what kind of car to buy, or whom to hire, or who is likely to repay a loan? Should I know whether someone has filed for bankruptcy or been married before or owns property? Should I be able to sort partners by their credit score? Introducing this level of data use into the intimate sphere seems at odds with how we typically learn about one another — gradually, and with the benefit of context.

Match Group is trying to address a real, urgent problem — but we need to be very thoughtful about what tools are appropriate to combat sexual assault and what impacts they might have on user privacy and on how we develop relationships. Using data as a weapon against sexual violence can introduce more problems than it solves.

Karen Levy (@karen_ec_levy) is an assistant professor in the department of information science at Cornell University.
