Scammers are using AI voice generators to sound like your loved ones. Here’s what to watch for


Image: a robot knocking over a phone. Kilito Chan/Getty Images

Imagine you receive a phone call telling you that your loved one is in trouble. At that point, your instinct is most likely to do anything to get them out of harm’s way, including transferring money.

Scammers are aware of this Achilles' heel, and they are now using AI to exploit it.

A report from The Washington Post tells the story of an elderly couple, Ruth and Greg Card, who fell victim to an impersonation phone scam.

Also: These experts are racing to protect AI from hackers. Time is running out

Ruth, 73, received a phone call from someone she thought was her grandson. He told her he was in jail, without a wallet or cell phone, and urgently needed money. Like any other anxious grandparents, Ruth and her 75-year-old husband rushed to the bank to withdraw the cash.

Only at the second bank they visited did a manager warn them that he had seen a similar case before that turned out to be a scam, and that this call was likely a scam as well.

This scam is not an isolated incident. The report indicates that in 2022, impersonation scams were the second most common type of scam in the US, with more than 36,000 reports of people being swindled by callers pretending to be friends and family. Of those scams, 5,100 occurred over the phone, robbing victims of more than $11 million, according to FTC officials.

Also: Best AI chatbots: ChatGPT and other alternatives to try

Artificial intelligence has recently gained a lot of buzz thanks to the growing popularity of generative AI programs, such as OpenAI's ChatGPT and DALL-E. These programs are mainly associated with advanced capabilities that can boost users' productivity.

However, the same techniques used to train those helpful language models can be used to train more harmful programs, such as AI voice generators.

These programs analyze a person's voice for the distinct patterns that make up their unique sound, such as pitch and stress, and then recreate that voice. Many of these tools work in seconds and can produce audio that is virtually indistinguishable from the source.
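To make that analysis step concrete, here is a minimal sketch of how software can measure the pitch and loudness patterns in a voice recording. It assumes the open-source librosa audio library and an illustrative file name; real voice generators feed far richer features into neural networks, but this is the kind of raw material they work from.

```python
# A minimal sketch of the "analysis" half of voice cloning, assuming the
# open-source librosa audio library and a hypothetical recording
# "voice_sample.wav". Real cloning tools use neural models, but they start
# from acoustic patterns like the ones measured here.
import librosa
import numpy as np

# Load a short recording of the target speaker.
audio, sample_rate = librosa.load("voice_sample.wav", sr=16000)

# Estimate the fundamental frequency (pitch) of each audio frame.
f0, voiced_flag, voiced_prob = librosa.pyin(
    audio,
    fmin=librosa.note_to_hz("C2"),  # ~65 Hz, low end of human speech
    fmax=librosa.note_to_hz("C7"),  # ~2 kHz, well above typical speech
    sr=sample_rate,
)

# Estimate loudness (a rough proxy for vocal stress) via per-frame RMS energy.
rms = librosa.feature.rms(y=audio)[0]

# Summarize the speaker's pitch profile, keeping only the voiced frames.
voiced_f0 = f0[voiced_flag]
print(f"Median pitch:  {np.nanmedian(voiced_f0):.1f} Hz")
print(f"Pitch range:   {np.nanmin(voiced_f0):.1f}-{np.nanmax(voiced_f0):.1f} Hz")
print(f"Mean loudness: {rms.mean():.4f} (RMS)")
```

The unsettling part is how little input this kind of analysis needs: a few seconds of audio, easily scraped from a voicemail greeting or a social media clip, is enough raw material for a cloning tool to start from.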

Also: The looming horror of AI voice transcription

What can you do to protect yourself?

So what can you do to avoid falling for this kind of scam? The first step is to be aware that this type of call is possible.

If you receive a distress call from one of your loved ones, remember that the voice on the other end may well be AI-generated. To make sure it's really them, try to verify the source.

Try asking the caller a personal question that only your loved one would know the answer to. This can be as simple as asking the name of their pet, a family member, or another personal detail.

You can also check your loved one's location to see if it matches where they say they are. Many people these days share their location with friends and family, and in this case it can be extremely helpful.

You can also try calling or texting your loved one from another phone to verify the identity of the caller. If your loved one picks up the phone or texts back and doesn’t know what you’re talking about, you’ve got your answer.

Finally, before making any major monetary decisions, consider first contacting the authorities to get some guidance on how best to proceed.
