Dating apps crack down on romance scammers.


According to Michael Steinbach, head of global fraud investigations at Citi and a former executive assistant director of the FBI’s National Security Branch, fraud generally ranges from high-volume schemes, such as stolen card numbers or quickly harvested personal data, to low-volume, high-touch operations in which fraudsters spend significant time on surveillance and sophisticated social engineering. Dating scams fall into the latter category, he added, and a significant amount of this fraud still occurs. For scammers, he says, “If you can spend the time to earn the trust and confidence of your victim, the rewards are enormous.”

Steinbach says he advises consumers to approach any interaction, whether on a banking app or a dating app, with a healthy dose of skepticism. “We have a catchphrase here: Don’t take the call, make the call,” Steinbach said. “Most scammers, no matter how the scheme is put together, are reaching out to you unsolicited.” Be honest with yourself: if someone seems too good to be true, they probably are. And keep conversations on the platform, in this case the dating app, until real trust is built. According to the FTC, 40 percent of romance-scam loss reports with “detailed narratives” (at least 2,000 characters long) mention moving the conversation to WhatsApp, Google Chat, or Telegram.

Dating app companies have responded to the scam problem with a mix of manual review and AI-powered tools to flag potential fraud. Many dating apps now use photo or video verification features that prompt users to capture images of themselves directly within the app. Rather than accepting an uploaded photo that could have been taken anywhere, at any time, these features apply machine learning to judge whether the verification selfie is authentic. (A WIRED report on dating app scams from October 2022 noted that at the time, Tinder offered this verification feature while Hinge did not.)

For an app like Grindr, which serves predominantly men in the LGBTQ community, the tension between privacy and security is greater than it is for other apps, says Alice Hunsberger, Grindr’s vice president of customer experience, whose role includes overseeing trust and safety. “We don’t want to require a face photo on everyone’s public profile, because many people are not comfortable having their photo publicly on the internet in connection with an LGBTQ app,” says Hunsberger. “That’s especially important for people in countries where LGBTQ people are not always accepted or where being part of the community is illegal.”

For larger-scale bot scams, Hunsberger says the app uses machine learning to process metadata at signup, relies on SMS phone verification, and then tries to detect patterns of accounts sending messages more quickly than a real human could. When users upload photos, Grindr can check whether the same photo is being used across multiple accounts. And it encourages people to use the in-app video chat as a way to avoid catfishing and pig-butchering scams.
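Grindr hasn’t said how its duplicate-photo check works. One common technique for spotting the same image reused across accounts, even after recompression or minor edits, is perceptual hashing. Here is a minimal sketch of the average-hash (aHash) variant, assuming the photo has already been downscaled to an 8×8 grayscale grid (the `average_hash` and `hamming` names are illustrative, not Grindr’s):

```python
# Average-hash (aHash) sketch: hash a downscaled grayscale image, then
# compare hashes by Hamming distance. Near-identical photos produce
# near-identical hashes; unrelated photos land far apart.

def average_hash(pixels):
    """pixels: flat list of grayscale values (0-255) from a downscaled image."""
    avg = sum(pixels) / len(pixels)
    # One bit per pixel: set if the pixel is brighter than the average.
    return sum(1 << i for i, p in enumerate(pixels) if p > avg)

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# Two slightly different versions of the "same" 8x8 photo hash identically...
photo = [10] * 32 + [200] * 32
recompressed = [12] * 32 + [198] * 32
# ...while an unrelated image is maximally distant in Hamming space.
inverted = [200] * 32 + [10] * 32

assert hamming(average_hash(photo), average_hash(recompressed)) == 0
assert hamming(average_hash(photo), average_hash(inverted)) == 64
```

In practice a service would store each account’s hash and flag any new upload whose Hamming distance to an existing hash falls below a small threshold.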

Tinder’s Kozol said some of the company’s “most sophisticated work” happens in machine learning, though he declined to share details about how those tools operate, since bad actors could use the information to evade the systems. “As soon as someone signs up, we’re trying to understand: Is this a real person? And are they well-intentioned?”

Ultimately, however, AI can only do so much. People, both the scammers and their targets, are the weak link, says Steinbach. “In my mind, it boils down to one message: You have to be situationally aware. I don’t care what the app is, you can’t rely on the tool alone.”
