
Online romance scammers may have a new wingman — artificial intelligence

AI can be harnessed to mimic voices and faces
A young person uses a smart phone in Chicago. THE CANADIAN PRESS/AP

The voice you hear on the other end of your phone call may not be who you think it is, the person you’re texting with may really be a bot and the face in a photo or video on your favourite dating app may not even exist.

Technological advancements in artificial intelligence create the potential to fuel romance scams, said Jeff Clune, an associate professor of computer science at the University of British Columbia.

Scammers now have “more tools in their tool box to hoodwink people, especially people who are not aware of recent advances in technology,” he said in an interview.

Such advancements include voice simulators, face generators, deepfakes, in which an existing image or video is used to create fake but believable footage, and chatbots like ChatGPT that generate humanlike text responses.

The Canadian Anti-Fraud Centre has reported that romance scams skyrocketed during the mass shift online caused by the COVID-19 pandemic. It said the fraud schemes often involve convincing a victim to enter a virtual relationship; once trust and affection are developed, swindlers use that emotional leverage to request money, cryptocurrency, gifts or investments.

The centre has warned that Valentine’s Day provides an “opportunity for fraudsters to target Canadians looking for a relationship.” Its latest available data revealed 1,928 reports of romance scams totalling more than $64.5 million in losses in 2021, a nearly 25 per cent jump from the year before.

Its Cyber Threat Assessment for 2023-2024 flagged convincing deepfake technology and artificial intelligence, or AI, text generators as tools that "threat actors" can exploit.

“As deepfakes become harder to distinguish from genuine content and the tools to create convincing deepfakes become more widely available, cyber threat actors will very likely further incorporate the technology into their use of (misinformation, disinformation, and malinformation) campaigns, allowing them to increase the scope, scale, and believability of influence activities,” the analysis said.

“Text generators have progressed to a point where the content they produce is often nearly indecipherable from legitimate material.”

Clune said scams using AI technology still require a person pulling the strings, but that could soon change.

“Even though scamming is very prevalent right now, there’s still a cost to do it because a human has to sit there and spend their time, but if you can have AI do it to a million people a day and just sit and watch the money roll in, that’s a scary place to be — and that is something that is possible with this technology,” he said.

Suzie Dunn, an assistant professor at the Schulich School of Law at Dalhousie University, said the law has not kept up with technology, leaving “major gaps” in the legal framework.

“One of the challenges that we have around impersonation laws is that, under the Criminal Code of Canada, you actually have to be impersonating an existing person,” Dunn said in an interview.

She said software that allows people to create a non-existent individual, with a fake accent, voice or face, poses legal complications.

“If you’re using someone’s images or using someone’s name, then it can be counted as a form of impersonation, but with these new technologies, where you can actually create a non-existent person, the types of harms that are often meant to be covered under these impersonation rules aren’t really covered.”

Victims must rely on existing extortion and fraud laws, she said.

“We don’t need new extortion laws. Extortion is extortion whether it’s being done by deepfakes or by a regular person,” she added.

“There’s also a major gap there in what role the platforms play in addressing the harms that occur on them.”

Dunn said corporations, including AI developers, dating and social media platforms, should be aware of the potential harms and put the necessary safeguards in place.

Clune agreed. He said new technology “will always be out in front of the laws and the politicians.”

He said the pace of progress in the field is “breathtaking,” and it will continue, if not accelerate.

“Almost anything you can imagine that seems science fiction and futuristic today will be around in a handful of years. It is worth politicians and society engaging in thoughtful conversations about what’s coming and trying to get ahead of it and think through what can we do about it.”

Brieanna Charlebois, The Canadian Press
