This dating app fights scammers with bots – TechCrunch


You know the feeling: you’re on a dating app, you start a conversation with someone, and things just don’t add up. You’ve seen The Tinder Swindler on Netflix, and suddenly you’re thinking: is someone about to ask me for money? The team at video dating app Filter Off decided to take a new approach: every scammer it detects is quietly dropped into a side pool of dating hopefuls made up only of chatbots posing as attractive singles and of other scammers. As you might expect, hilarity ensued.

Filter Off is a video-first dating app that launched at the beginning of the Covid-19 lockdowns. As dating shifted from bars, galleries and picnics to more conversational, video-first formats, the company introduced themed virtual speed-dating events: Harry Potter date night, dog lovers date night, New York City date night, you name it. The platform has hundreds of thousands of users, and as it grew in popularity among people looking for love, the founders discovered that it also attracted a second set of people: people looking for money.

“The first time I noticed a problem was when I saw that George Clooney had joined Filter Off. I was like, ‘Holy shit, this is wild, I can’t believe it…’ but then I took a closer look at his profile,” said Filter Off head of product Brian Weinrich in an interview with TechCrunch. He realized that Clooney probably wasn’t on a dating site, and if he was, he wouldn’t be a 34-year-old from Lagos, Nigeria. “I deleted their profiles to get them off the app. Then I started to notice all these profiles that looked like real people, but they were all fake.”

The product team decided to tackle the problem with an algorithm, building software that flags users who, “based on certain characteristics that I started to notice in how people register and use the app,” are more likely to be fraudsters. The team kept deleting the flagged profiles, but for every scammer they cut, five more popped up in their place, Medusa-style.
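TechCrunch doesn’t say which signals the algorithm actually weighs, but a minimal sketch of that kind of registration-time heuristic, in Python, might look like the following. The signal names, weights and threshold here are purely hypothetical assumptions for illustration, not Filter Off’s real rules.

```python
# Hypothetical sketch of a registration-time scam score.
# The signal names, weights and threshold are illustrative assumptions,
# not Filter Off's actual heuristics.
from dataclasses import dataclass


@dataclass
class SignupProfile:
    claimed_age: int
    country_mismatch: bool         # IP geolocation disagrees with the stated location
    photo_is_known_face: bool      # reverse-image hit on a celebrity or stock photo
    bio_pushes_off_platform: bool  # bio immediately asks to move the chat elsewhere


def scam_score(p: SignupProfile) -> float:
    """Combine a few weighted signals into a rough 0-to-1 risk score."""
    score = 0.0
    if p.photo_is_known_face:      # e.g. "George Clooney" signing up
        score += 0.5
    if p.country_mismatch:
        score += 0.2
    if p.bio_pushes_off_platform:
        score += 0.2
    if p.claimed_age < 18 or p.claimed_age > 90:
        score += 0.1
    return min(score, 1.0)


def should_shadow_pool(p: SignupProfile, threshold: float = 0.6) -> bool:
    """Instead of banning, flagged accounts get routed to the bot pool."""
    return scam_score(p) >= threshold
```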

“I was like, we need a way to get rid of the scammers, but it has to be done in a way where they can’t just come back and rejoin,” Weinrich said. “I remembered Reddit and other platforms have some kind of ‘shadow ban,’ where banned users can keep posting, but normal users never see their content.”

And so the work began. The team used GPT-3 to build a set of chatbots, paired them with a script that generates human-like faces, and created a batch of fake profiles. Note: these profiles are not visible to “regular” users, only to people the algorithm has determined are scammers. Flagged users are dropped into a pool of thousands of bots that look and talk like real people.
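The article doesn’t show any implementation, but the mechanism it describes is essentially a shadow ban with a twist: flagged accounts are only ever matched against bot personas and other flagged accounts, and a GPT-3 model writes the bots’ replies. A minimal sketch under those assumptions might look like this; the Profile fields, function names, prompt wording and model choice are hypothetical, and the completion call assumes the pre-1.0 openai Python client.

```python
# Hypothetical sketch of shadow-pool routing and a GPT-3 bot reply.
# Profile fields, prompt wording and model choice are assumptions,
# not Filter Off's actual code.
from dataclasses import dataclass

import openai  # pre-1.0 openai Python client assumed


@dataclass
class Profile:
    id: int
    bio: str
    is_bot: bool = False
    flagged_as_scammer: bool = False


def candidates_for(user: Profile, profiles: list[Profile]) -> list[Profile]:
    """Flagged users only ever see bots and other flagged users;
    regular users never see either."""
    others = [p for p in profiles if p.id != user.id]
    if user.flagged_as_scammer:
        return [p for p in others if p.is_bot or p.flagged_as_scammer]
    return [p for p in others if not (p.is_bot or p.flagged_as_scammer)]


def bot_reply(bot: Profile, chat_history: str, last_message: str) -> str:
    """Ask GPT-3 for a human-sounding reply in the bot persona's voice."""
    prompt = (
        f"You are a friendly single on a video dating app. Your bio: {bot.bio}\n"
        f"Conversation so far:\n{chat_history}\n"
        f"Them: {last_message}\n"
        f"You:"
    )
    resp = openai.Completion.create(
        model="text-davinci-002",  # GPT-3-era completion model
        prompt=prompt,
        max_tokens=60,
        temperature=0.9,           # keep the replies chatty and varied
        stop=["Them:"],
    )
    return resp.choices[0].text.strip()
```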

“The funny part is that two things happen. One, the scammers run into the bots, but they also run into other scammers, and they try to scam each other,” Weinrich laughed. “They’re like, ‘I want $40 for a Google Play gift card,’ and the other scammer replies, ‘No, no, you give me a gift card.’ And they just argue back and forth. The funny thing is that we now have over 1,000 scammers in our app that I know of. It’s great. They’re wasting their time, and they don’t get to interact with our real users.”

The platform also has reporting and blocking features so genuine users can flag scammers they come across. When a report comes in, the team can use it to refine the algorithm and manually move the fraudster into the bot pool.

“The funniest thing about our reporting feature is the number of reports I get where scammers are talking to bots. And I’m like, ‘Yeah, I know, that’s the point,’” Weinrich said.

The company has collected some of the most ridiculous conversations scammers have had with each other and with the bots on its blog, which is well worth a visit.


