A friend recently got duped by a scam text purporting to be from his middle daughter, and transferred £100 to an account to cover some baffling yet, according to the text, extremely time-sensitive untoward event.

You can imagine how the scammer pulled that off. Think of everyday, low-level parental anxiety, expecting bad news when kids are anywhere farther away than the kitchen table; add the sheer believability of any bad news that starts with a 19-year-old texting: “I smashed my phone”; all a scammer has to do is lean in.

Still, the story wasn’t watertight and we all called him stupid for ages afterwards, for failing to ask basic questions such as: “But if it’s your phone that’s broken, why does the money need to go into someone else’s bank account?” He didn’t even call the number to check that he could speak to her – arguably, £100 lighter was a good place to land. If ever anyone tries to relieve him of his life savings, he’ll be concentrating.

But imagine if you could hear your kid, sounding exactly like themselves, asking for money? Whose defences would be strong enough to survive voice cloning? The guys from Stop Scams UK tried to explain this to me last year: a scammer could pull a kid’s voice off their TikTok account; all they would have to do then is find the parent’s phone number. I got the wrong end of the stick and thought they had to patchwork a message out of the recorded words available on social media. Good luck with getting a believable catastrophe out of football tips and K-pop, I thought. I hadn’t considered AI for even 10 seconds, and whether it could extrapolate speech patterns from a sample, which it can.

I still think you can get round it pretty easily. Kid-machine asks for urgent assistance. You say: “Precious and perfect being, I love you with all my heart.” Kid-machine will surely reply: “I love you too.” How could it not? Real kid would claim to have been sick in its mouth. There’s no building an algorithm for this.

Zoe Williams is a Guardian columnist

Do you have an opinion on the issues raised in this article? If you would like to submit a response of up to 300 words by email to be considered for publication in our letters section, please click here.

Worried about AI voice scams? Luckily, I have one foolproof solution

23.01.2024
© The Guardian