How a Matchmaking AI Conquered (and Was Exiled) from Tinder

Credit to Author: James Jackson | Date: Mon, 06 Nov 2017 16:31:38 +0000

Forecast is a series exploring the future of AI and automation in a variety of different sectors—from the arts to city building to finance—to find out what the latest developments might mean for humanity’s road ahead. We’ll hear from Nikolas Badminton, David Usher, Jennifer Keesmaat, Heather Knight, Madeline Ashby and Director X, among others. Created by Motherboard in partnership with Audi.

It all started in a bar in 2014. Weary of watching his friends spend their nights swiping left or right on dating apps like Tinder, Justin Long of Vancouver decided to automate the process.

Within three weeks he had developed a working prototype, called Tinderbox, which used simple facial recognition software to find a match and craft an introductory message.
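
The article doesn't spell out how Tinderbox worked under the hood, but the loop it describes (detect a face in a profile photo, decide whether to swipe, and fire off a canned opener) can be sketched roughly as below. This is a minimal illustration only: OpenCV's bundled Haar cascade stands in for whatever facial recognition Long actually used, and preference_score and the opener template are hypothetical placeholders, not Tinderbox's real logic.

```python
# Rough sketch of an automated swipe loop like the one described above.
# Assumptions: OpenCV's Haar cascade stands in for the (unspecified) facial
# recognition step; preference_score() and the opener are placeholders.
import cv2

face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def find_face(photo_path: str):
    """Return the first detected face region in a profile photo, or None."""
    image = cv2.imread(photo_path)
    if image is None:
        return None
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = face_detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return faces[0] if len(faces) else None

def preference_score(face_region) -> float:
    """Placeholder for a learned 'do I like this face?' model."""
    return 0.5  # a real system would score the face crop against past swipes

def decide(photo_path: str, name: str, threshold: float = 0.6):
    """Swipe decision plus an introductory message for right-swipes."""
    face = find_face(photo_path)
    if face is None:
        return "skip", None
    if preference_score(face) >= threshold:
        return "swipe_right", f"Hey {name}, great smile! How's your week going?"
    return "swipe_left", None
```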

Within about eight months, the app had taken on a life of its own. News outlets from around the world, including the BBC, wanted to talk about it, Long said. He quit his job at a software marketing company and took a crash course in deep learning. In 2016, he launched Bernie AI to the public, an improved version of Tinderbox named after a friend who died in 2009.

It was the latest example of how programmers and developers are starting to use AI and deep learning to play matchmaker for humans. But what is the cost of these tools? Does it matter if your one true love never spots you from across the room, and you are instead a calculated choice made by an algorithm inside a handheld device that could just as easily be used to surf porn?

Bernie abruptly shut down in June, but it wasn’t because people stopped using it; quite the opposite. Long told Motherboard that he keeps his actual membership numbers secret, but he claimed Bernie had performed more than nine million actions (including swipes and conversations) and helped match more than 100,000 people.

Bernie’s first few months also showed remarkable accuracy in gauging what people were looking for. According to Long’s blog, users reversed just 225 Bernie matches out of more than 164,000 actions made by the AI, an implied accuracy rate of 99.86 per cent.
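
The quoted rate follows directly from those two figures; a quick check of the arithmetic, using the numbers from Long’s blog:

```python
# Quick check of the quoted accuracy figure using Long's own numbers.
reversed_matches = 225       # matches users undid
total_actions = 164_000      # "more than 164,000" actions, per Long's blog
accuracy = 1 - reversed_matches / total_actions
print(f"{accuracy:.2%}")     # 99.86%
```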

Bernie wasn’t shut down because of a lawsuit or other legal problem, but because Tinder, the major dating app Bernie depended on, simply decided it didn’t want Bernie around anymore, Long said.

Motherboard reached out to Tinder for comment on the shutdown of the Bernie app but did not receive a response.

One of the reasons Long developed Bernie in the first place was to cut through the noise of all the dating apps and help facilitate matches. The app used artificial intelligence, deep learning technology, and more advanced facial recognition than its Tinderbox predecessor. It eventually learned to identify other objects in the photo aside from the person’s face and could develop introductions based on those objects.
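
The article doesn’t describe how Bernie turned detected objects into openers, so the sketch below is only a hedged illustration of the general idea: detect_objects is a stub standing in for whatever image recognition Bernie actually ran, and the label-to-opener table is invented for this example.

```python
# Illustration only: mapping objects detected in a profile photo to an opener.
# detect_objects() is a stub for an unspecified image classifier/detector;
# the templates below are invented for this example, not Bernie's.
OPENERS = {
    "dog":    "Your dog is adorable! What's their name?",
    "guitar": "I see a guitar in your photo. What do you play?",
    "beach":  "That beach shot looks amazing. Favourite place to travel?",
}
DEFAULT_OPENER = "Hey! Your profile caught my eye. How's your week going?"

def detect_objects(photo_path: str) -> list[str]:
    """Stub: a real system would run an object detector on the photo here."""
    return []

def craft_intro(photo_path: str) -> str:
    """Pick an opener keyed off the first recognised object, else a default."""
    for label in detect_objects(photo_path):
        if label in OPENERS:
            return OPENERS[label]
    return DEFAULT_OPENER
```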

Part of Bernie’s success lay in its ability to quickly gratify our desire for human contact. That’s why dating sites and apps are so successful, according to Heather Knight, an assistant professor of robotics at Oregon State University.

“Finding romantic partnership or sex or whatever you’re looking for is a pretty natural human thing, and so it’s normal we would look to whatever tools we have at our disposal,” she said.

Long said one benefit of his AI technology in the dating scene is the increased availability of potential dates and possible partners. If we’re unhappy in a relationship, we may be more likely to leave if we know another date is right around the corner thanks to Tinder and apps like Bernie.

“We have more availability. I think people won’t want to date someone for as long as they normally would have because of this increased accessibility. Is that a bad thing?”

Knight, however, had a slightly different take, saying that much like pornography can skew our ideas around sex, having unfettered access to almost endless dating opportunities can also change our concept of the perfect partner.

“What happens is we get this unjustified higher valuation of ourselves … so people aren’t even happy when they find someone who should have been a good match,” she said.

But that isn’t to say there isn’t a place for AI technology like Bernie in the matchmaking world. What’s important is for the technology to get two people into the same room, then let the natural chemistry (or lack thereof) take over, according to Knight.

“The idea we can use [the tech] to bring people together in physical space is fantastic. Even though it might start with things that are motivated by sex and dating, it’ll start to happen in other spheres of our life,” Knight said. For example, a friend of hers who recently moved to California signed up for Tinder with the explicit message she was only looking for friends to hang out with.

While Bernie may have shut down, other apps have stepped into the void. Hily takes a slightly different approach: rather than basing its search on visual attractiveness, it looks for similar interests, word choice, and mutual likes. It had approximately 35,000 users in a closed beta earlier this month, according to TechCrunch.
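
That description (shared interests, word choice, and mutual likes rather than looks) suggests some form of overlap-based similarity score. The snippet below is a generic illustration of that idea, not Hily’s actual algorithm; the fields and weights are invented for the example.

```python
# Generic interest/word-overlap match score, illustrating the "similar
# interests, word choice, and mutual likes" approach attributed to Hily.
# Fields and weights are invented for this example.
def jaccard(a: set[str], b: set[str]) -> float:
    """Overlap between two sets, from 0 (disjoint) to 1 (identical)."""
    return len(a & b) / len(a | b) if (a | b) else 0.0

def match_score(user_a: dict, user_b: dict) -> float:
    interests = jaccard(set(user_a["interests"]), set(user_b["interests"]))
    words = jaccard(set(user_a["bio"].lower().split()),
                    set(user_b["bio"].lower().split()))
    likes = jaccard(set(user_a["likes"]), set(user_b["likes"]))
    return 0.5 * interests + 0.3 * words + 0.2 * likes

alice = {"interests": {"hiking", "jazz"},
         "bio": "Coffee lover and weekend hiker", "likes": {"p1", "p7"}}
bob = {"interests": {"hiking", "cooking"},
       "bio": "Weekend hiker who loves coffee", "likes": {"p7"}}
print(round(match_score(alice, bob), 2))  # ~0.4 on these made-up profiles
```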

Other times, the apps can get downright creepy. The Dating AI app takes celebrity stalking to a whole new level, using similar facial recognition software to browse dating sites in search of doppelgangers of the celebrities you’ve said you’re attracted to. It’s unclear how long the app will stick around, however: earlier this year, Tinder informed Dating AI that the app violates the company’s terms of use.

Knight said it’s reasonable to expect AI to thrive in the dating app scene, since that space is text-only and doesn’t rely on the skills AI has yet to master. “No tonality, no facial expressions, all you have is just words,” she said.

One point of contention for Knight was the fact that people who were contacted by Bernie didn’t know it was an AI program talking to them, which is a violation of trust, she said.

“If you’re thinking of dating someone and it starts in a place of deception, that’s not an AI issue. That’s a social issue,” she said.

Long didn’t see any problem with the lack of disclosure.

“I don’t think it’s really necessary. I think people understood what was happening,” he said. “I did have a few hate emails about how it was making the world worse, but they were limited.”
