
Conversational AI trained to bust scammers’ business models using scam script patterns in Australia

5 minute read


Researchers in Australia have developed an AI chatbot that can mimic humans in extended phone interactions with fraudulent callers, an innovative approach to tackling the surge in telephone scams.

Using machine-learning techniques and a dataset of more than 100 scam calls, the researchers trained artificial intelligence algorithms to follow scammers’ language and script patterns, with the end goal of disrupting illicit actors’ business model and making phone scams economically unviable.

The move forms part of a combined effort among the private sector, government, and law enforcement to combat scams and protect the community from their ill effects, including financial losses and emotional distress.

Record losses to scam calls

Australians lost a staggering A$3.1 billion to scams in 2022, according to a report released by the Australian Competition and Consumer Commission in April. The losses have increased significantly in recent years, reflecting the rising sophistication of scammers’ operations.

Phone scams cost Australians A$141 million last year and accounted for 29% of reported scams, according to the ScamWatch component of the report. “Australia receives about six to seven scam calls per capita every month. That’s probably the number I’m personally receiving,” said Dali Kaafar, a professor and executive director of the Macquarie University Cyber Security Hub in Sydney.

Kaafar addressed an audience of scam experts at last month’s Scams Summit in Sydney, where he asked why successful scam calls remain so persistent and keep growing. He said four fundamental considerations help explain the problem: technology, social issues, financial matters, and challenges in prevention.

Why scam calls flourish

On the technology side, phone scams are a cybercrime that is easy to execute, requiring little more than Voice over Internet Protocol (VoIP) technology. Kaafar explained that spoofing a phone number, which masks a scammer’s identity and location, is an effortless task, especially for a skilled scammer. It also works to scammers’ advantage that, for many telecom companies and individuals, authenticating a caller’s identity is difficult and expensive.

From a social standpoint, trust plays a big part in a scam call’s success. Scammers are trained to play on human emotions such as fear or empathy, and they know how to target society’s most vulnerable sectors.

Scammers can be hostile or aggressive one minute, issuing threats such as having the victim’s bank account or social security card suspended, then contrite and helpful the next, making the vulnerable person feel they are being looked after. Breaking free from this type of tactic requires a higher level of skepticism and emotional independence, Kaafar added.

On the financial side, phone scams are undeniably a “very lucrative” criminal activity. The millions lost to scam phone calls last year represent the highest losses of any contact method. It is a “high-gain, low cost” proposition for scammers, Kaafar said.

On the prevention side, Kaafar observed that telecom companies are investing significant resources: a single carrier blocks 100,000 to 120,000 scam calls a day, which shows that even from a technology standpoint scam calls are hard to deter. Indeed, Australia’s largest telecom company, Telstra, reportedly spends A$1.2 million a month on scam detection.

Introducing Apate: a multilingual scam-busting chatbot

With his team of cybersecurity experts, Kaafar created Apate, named after the Greek goddess of deception, using machine-learning techniques and natural language processing to analyze, extract, and establish script patterns from more than 100 scam calls. The team trained chatbots to develop conversation patterns that mimic those in the actual scam calls. Using voice-cloning technology, the conversational AI chatbot can adopt a particular persona, language, or emotion designed to hold scammers in long conversations. This strategy reduces actual scam engagements and shifts the time wasted from the victim to the scammer.
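The article does not detail Apate’s actual pipeline, but the general idea of mining transcripts for recurring script patterns can be shown with a minimal, hypothetical sketch. The example below uses TF-IDF features and k-means clustering to group similar lines from invented scam-call snippets into candidate “script stages”; the snippets, cluster count, and method are all assumptions for illustration only.

```python
# Illustrative sketch only: not Apate's actual pipeline.
# Groups similar lines from hypothetical scam-call transcripts into
# candidate "script stages" using TF-IDF features and k-means clustering.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

# Invented transcript snippets standing in for a corpus of 100+ scam calls.
scam_lines = [
    "Your bank account has been suspended due to suspicious activity.",
    "We detected unusual activity, your account will be suspended today.",
    "Please confirm your identity by reading the one-time code I sent you.",
    "To verify it is you, read me the six digit code on your phone.",
    "There is a warrant out, but I can help you resolve it right now.",
    "Don't worry, I am here to help you fix this problem immediately.",
]

# Vectorize each line, then cluster similar wording together.
vectorizer = TfidfVectorizer(stop_words="english")
features = vectorizer.fit_transform(scam_lines)
clusters = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Print each line next to its cluster id; lines sharing an id suggest a
# recurring stage of the script (threat, "verification", fake reassurance).
for cluster_id, line in sorted(zip(clusters, scam_lines)):
    print(cluster_id, line)
```

In a real system, patterns surfaced this way would feed the responses a chatbot uses to keep the conversation going, but how Apate does that step is not described in the article.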

Kaafar described Apate as the antithesis of commercial AI models such as Siri or Alexa, whose objective function is to minimize the duration of interactions. Apate, by contrast, hooks scammers into “rounds and rounds of interaction without getting out of the loop” in order to maximize the interactions.
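As a rough illustration of an objective that rewards longer rather than shorter interactions, the toy sketch below uses an epsilon-greedy bandit to choose among hypothetical stalling strategies against a simulated scammer. The strategies, the scammer model, and the reward are invented for illustration and are not Apate’s actual design.

```python
# Illustrative sketch only: Apate's real models are not described in the article.
# An epsilon-greedy bandit picks among stalling strategies to maximize the number
# of turns a simulated scammer stays on the line, i.e. the objective is
# conversation duration rather than brevity.
import random

STRATEGIES = ["ask_to_repeat", "feign_confusion", "slowly_comply"]

def simulated_call(strategy: str) -> int:
    """Hypothetical scammer model: returns how many turns the call lasted."""
    base = {"ask_to_repeat": 6, "feign_confusion": 9, "slowly_comply": 12}[strategy]
    return max(1, base + random.randint(-3, 3))

def run_bandit(episodes: int = 500, epsilon: float = 0.1) -> dict:
    totals = {s: 0.0 for s in STRATEGIES}
    counts = {s: 0 for s in STRATEGIES}
    for _ in range(episodes):
        # Explore occasionally; otherwise exploit the strategy with the best
        # average call length observed so far.
        if random.random() < epsilon or not any(counts.values()):
            strategy = random.choice(STRATEGIES)
        else:
            strategy = max(STRATEGIES, key=lambda s: totals[s] / max(counts[s], 1))
        turns = simulated_call(strategy)  # reward = turns kept on the line
        totals[strategy] += turns
        counts[strategy] += 1
    return {s: totals[s] / max(counts[s], 1) for s in STRATEGIES}

print(run_bandit())  # average turns per strategy; the longest-holding one wins
```

The point of the sketch is only the sign of the objective: where a commercial assistant would be tuned to end the exchange quickly, a scam-busting bot is tuned to keep it going.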

As with any adaptive AI, Kaafar said that he and his team are looking to move to better models as they expand their dataset of scam phone conversations. Sharpening the AI bots’ ability to keep more scammers in longer interactions is a key objective for the team. The conversations between the AI chatbot and the scammers can also generate intelligence that lays the groundwork for an early-stage scam alerting scheme, which could help organizations such as government agencies or financial institutions warn their customers of potential scams. “We found ourselves really looking into all these conversations with the scammers and getting a lot of intelligence out of that,” Kaafar noted.

The Apate project has a dedicated team of technology lawyers working on compliance, especially the personal data protection aspects of maintaining a large set of scam calls. “The early research on that is, scammers are anonymous in some way, so no personally identifiable information (PII) from them. But the other thing, which is in all jurisdictions, [is that] scam calls are not personal in nature, so you can record them,” Kaafar explained. “[But] again today, most of the data that will be collected will be really based on aggregated information.”


This article was written by Rowena Valeria Carpio of Thomson Reuters Regulatory Intelligence.