
Dating Apps Crack Down on Romance Scammers


Michael Steinbach, global head of fraud detection at Citi and a former assistant executive director of the FBI’s National Security Branch, says that, broadly speaking, fraud has moved from “bulk card theft or just getting a lot of information very quickly to more sophisticated social engineering, where fraudsters spend more time conducting surveillance.” Dating apps are just one piece of global fraud, and bulk fraud still happens, he adds. But for scammers, he says, “the reward is much greater if you can take the time to gain the trust and credibility of the victim.”

Steinbach says he advises consumers, whether on a banking app or a dating app, to approach certain interactions with a healthy level of skepticism. “We have a catchphrase here: Don’t take the call, make the call,” says Steinbach. “Scammers, by and large, contact you in an unsolicited way.” Be honest with yourself: if someone seems too good to be true, they probably are. And keep conversations on the platform, in this case on a dating app, until trust is truly established. According to the FTC, about 40% of romance scam loss reports with a “detailed narrative” (at least 2,000 characters long) mention moving the conversation to WhatsApp, Google Chat, or Telegram.

Dating app companies have responded to the rise in scams by rolling out manual and AI-powered tools designed to spot potential problems. Some Match Group apps now use photo or video verification features that prompt users to capture an image of themselves directly in the app. Those captures are then run through machine-learning tools to help determine whether an account is genuine, rather than letting someone upload a previously taken photo that might have had its telling metadata stripped away. (A WIRED report on dating app scams from October 2022 noted that, at the time, Hinge did not offer this verification feature, although Tinder did.)
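
Match Group has not published how these checks actually work. As a rough illustration only, the sketch below (Python, using the Pillow library) shows one naive signal such a verification system could consider: whether an uploaded photo still carries the EXIF metadata a fresh in-app capture would normally include. The exif_signal helper and its output fields are hypothetical, not any app’s real pipeline.

```python
# Illustrative sketch only -- not Match Group's actual verification pipeline.
# Idea: a gallery upload or web-scraped image often arrives with its EXIF
# metadata stripped, whereas a fresh in-app capture usually carries camera
# model and timestamp tags.
from PIL import Image, ExifTags


def exif_signal(path: str) -> dict:
    """Return a small feature dict describing the photo's EXIF metadata."""
    exif = Image.open(path).getexif()
    readable = {ExifTags.TAGS.get(tag, tag): value for tag, value in exif.items()}
    return {
        "has_exif": len(readable) > 0,
        "has_camera_model": "Model" in readable,
        "has_timestamp": "DateTime" in readable,
    }


# A real system would feed signals like these (alongside the verification
# selfie itself) into a machine-learning model rather than applying hard rules.
print(exif_signal("profile_upload.jpg"))
```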

For an app like Grindr, which primarily serves men in the LGBTQ community, the tension between privacy and safety is greater than with other apps, says Alice Hunsberger, Grindr’s vice president of customer experience, whose role includes overseeing trust and safety. “We don’t require photos of people’s faces on their public profiles because many people don’t feel comfortable publishing their photos on the internet linked to an LGBTQ app,” says Hunsberger. “This is especially important for people in countries that don’t always accept LGBTQ people or where it’s illegal to even be part of the community.”

Hunsberger says that for large-scale bot scams, the app uses machine learning to process metadata at sign-up, relies on SMS phone verification, and then looks for patterns of accounts sending messages more quickly than a real human could. When users upload photos, Grindr can detect when the same image has been used across multiple accounts. And it encourages people to use video chat within the app itself as a way to avoid catfishing and pig-butchering scams.
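
Grindr has not detailed its detection pipeline. The following is a minimal sketch, assuming a simple sliding-window rate check, of how an app might flag accounts that send messages faster than a human plausibly could; the thresholds and the record_message helper are invented for illustration.

```python
# Hypothetical sketch of the rate heuristic Hunsberger describes: flag accounts
# that send messages faster than a human plausibly could. The threshold values
# are made up; a real system would combine many such signals in an ML model.
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 10
MAX_MESSAGES_PER_WINDOW = 8   # assumed threshold, purely illustrative

recent = defaultdict(deque)   # account_id -> timestamps of recent messages


def record_message(account_id: str, now: float | None = None) -> bool:
    """Record a message send; return True if the account looks automated."""
    now = time.time() if now is None else now
    window = recent[account_id]
    window.append(now)
    # Drop timestamps that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_MESSAGES_PER_WINDOW
```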

Kozoll, from Tinder, says that some of the company’s “most sophisticated work” is in machine learning, though he declined to share details on how those tools work because bad actors could use the information to circumvent the systems. “As soon as someone signs up, we try to understand, ‘Is this a real person? And are they a person with good intentions?’”

In the end, however, AI can only do so much. Humans are both the scammer and the weak link on the other side of the scam, says Steinbach. “In my mind, it boils down to one message: You have to be aware of the situation. I don’t care what app it is, you can’t just rely on the tool itself.”
