IT Brief Canada - Technology news for CIOs & IT decision-makers

AI-fuelled romance scams surge with deepfake deception

Thu, 12th Feb 2026

KnowBe4 has warned consumers about a rise in romance scams that use generative AI and deepfake tools to create convincing online personas and sustain long-running deception.

The warning comes as the Australian Federal Police reports more than 5,000 Australians have already received text alerts about romance scams, with activity expected to increase around Valentine's Day.

Security specialists say the techniques used in romance fraud have undermined what many people consider basic checks. Poor grammar, inconsistent messaging, and an inability to share a specific photograph once served as red flags. Now, AI tools can produce fluent messages at scale and generate tailored images on demand.

Roger Grimes, CISO Advisor at KnowBe4, said the latest wave of romance fraud is highly automated and more convincing on first contact. Scammers are moving beyond stolen images and basic scripts, he said, using “entire fake identities” and “automated conversation bots that build deep emotional trust over months.”

Deepfakes on calls

Real-time video calls are now common in online dating and long-distance relationships, and they have become a new arena for deception. Grimes said live face-swapping and AI voice synthesis can be used on common video platforms, reducing the value of video as proof of identity.

The warning reflects a broader shift in cybercrime, with generative AI increasingly used to strengthen social engineering. In romance scams, that allows a fraudster to maintain near-continuous contact, present a polished personal story, and adapt quickly when challenged.

Grimes also pointed to what he called the “Death of the Photo Test.” He said scammers can generate images that match a victim's request, such as holding a specific newspaper or appearing in a particular location, undermining photo requests as a verification step.

Celebrity impersonation

Impersonation remains central to romance fraud, including scams where criminals pose as celebrities. Grimes said emotional investment can become so strong that some victims keep sending money even after being shown evidence the relationship is not real.

He said the pattern can involve large sums and severe personal consequences, including borrowing and family conflict. “I've proven beyond a shadow of a doubt that the person the victim is communicating with is not who they claim to be, and never has that resulted in the victim stopping,” Grimes said.

Grimes cited cases that show how victims can remain engaged even as doubts emerge. “Once they are hooked, it's a powerful drug. They don't give up until the house is gone, the friendships are gone, and the money is exhausted. One victim told me, 'I know he's fake, but he's the only one telling me he loves me. I'll pay to hear that,'” he said.

Who gets targeted

Romance scammers often tailor personas to match a target's expectations and perceived vulnerabilities. Grimes said older people may be singled out because they may have more savings and may also face loneliness or isolation. He described common storylines used against different groups, with men and women approached using different profiles and emotional hooks.

When targeting women, he said, scammers may pose as successful widowers who travel for work and seem reluctant about love. When targeting men, they may present as younger women in low-paid jobs who frame the relationship around rescue and financial support.

Financial loss remains one of the defining outcomes. Grimes said that by the time families intervene, the average victim who has contacted him has often lost more than USD $250,000. Some victims continue paying even after accepting the relationship is fraudulent, he added.

Money request test

As synthetic media improves, security advisers are placing more weight on behavioural signals and transaction patterns than on appearance or fluency. Grimes said requests for money should be treated as the key warning sign, no matter how convincing someone appears in text, images, or video.

He said scammers often avoid asking directly for bank or card details because it can trigger suspicion. Instead, they steer victims toward transfers and payment methods that are harder to recover, including gift cards and cryptocurrency. “It's the same old red flag, although scammers rarely ask for personal information like credit cards or banking information as it might cause the victim to be too suspicious,” he said.

“Instead they get the victim to send them money, often having to educate the victims on how to quickly send money internationally, like by using department store gift cards or cryptocurrency. Any requests for money are highly suspect. If someone you are romancing asks you for money, whether it's for a plane ticket, a medical emergency, or a business investment, there is a 99.99% chance it is a scam,” Grimes added.

As AI-generated content and deepfake tools become easier to access, the Australian Federal Police and cybersecurity advisers continue to urge the public to be cautious with unsolicited contact, verify identities through trusted channels, and avoid sending money to people known only online.