AI-Powered "Hi Mum" Scams: The Growing Threat to UK Retail Consumers

It’s a simple text message that many UK mums will recognise all too well: “Hi Mum, it’s me. My phone’s broken – this is my new number.” Most parents would quickly respond, reassured by a message they’ve likely received from their child before. However, scammers are increasingly exploiting this familiarity to manipulate unsuspecting victims into sending money. And with the rise of artificial intelligence (AI), these fraudsters are becoming more convincing than ever.

According to recent data from Visa, as many as 15% of people in the UK have been targeted by these so-called ‘Hi Mum’ scams, with 13% falling victim to the fraudsters. It’s part of the growing trend of Authorised Push Payment (APP) scams, where fraudsters trick individuals into transferring money to them under false pretences. Shockingly, 54% of UK adults have been targeted by such scams, making them one of the most prevalent forms of fraud in the country.

How the Scam Works

The scam typically begins with a simple, seemingly innocent message. A fraudster will pose as a child or relative, claiming that their phone has broken and that they now have a new number. From there, they build a sense of urgency, often claiming they cannot access their bank account or need money urgently to pay rent or bills. Scammers will go to great lengths to establish trust, sometimes even asking victims to save the new number in their contacts and delete the old one, further adding to the illusion of authenticity.

In the past, these scams were relatively easy to spot, but the rise of AI technology is rapidly changing that. Fraudsters are now using AI-generated voice recordings, deepfake videos, and other digital tools to create increasingly convincing impersonations of the victim's child. With this technology, the line between real and fake has never been more blurred.

The Role of AI in Fraud

AI’s growing capabilities have made it easier than ever for scammers to create content that looks and sounds eerily realistic. In fact, 47% of people admit they would not be confident in detecting a deepfake scam involving a family member, while 65% believe AI-powered voice impersonations will make scams harder to spot.

“AI is making these scams far more sophisticated and harder to detect,” said a spokesperson from Visa. “It’s alarming how easily fraudsters can now impersonate voices and even create realistic video calls to build trust and manipulate their victims.”

Red Flags to Watch Out For

It’s more important than ever to be vigilant when receiving unexpected messages or calls from loved ones. Here are some key signs that could indicate you’re being targeted by a scam:

  • No personal sign-off: A scam message will often omit your child’s name – a common tell that the sender doesn’t actually know them.

  • Unknown number: If the message is from an unfamiliar number, it’s worth questioning its legitimacy.

  • Urgency: Scammers frequently pressure you to act quickly, warning of dire consequences so you don’t have time to stop and think.

  • Strange voice or video: If you receive a voice message or video call, look out for unnatural facial expressions, odd body language, or robotic-sounding voices that don’t match the lip movements. AI-generated voices or videos often have subtle imperfections that can be clues.

Protecting Yourself

To avoid falling victim to these increasingly sophisticated scams, it’s crucial to stay alert and follow these best practices:

  1. Verify the claim: If you receive a message or call from your child claiming they need money, always verify through an independent channel – call them back on their known number, or check in with other family members.

  2. Don’t rush: Scammers want to create a sense of urgency. Take a step back and think carefully before acting on any request for money.

  3. Use multi-factor authentication: Where possible, enable multi-factor authentication (MFA) on your bank accounts to make it harder for fraudsters to gain access and authorise transactions.

  4. Educate others: Talk to friends and family, especially older relatives, about these scams and the role AI may play in them. Raising awareness can be one of the most effective ways to protect loved ones.

While technology has transformed the way we connect with our loved ones, it has also given fraudsters a new set of tools to exploit. As AI continues to evolve, we must be more cautious and proactive in protecting ourselves from these deceptive tactics.

For UK retailers, this is a growing concern. With many consumers now shopping online and linking payment information to their phones and apps, it’s important to stay informed and adopt fraud prevention measures to protect your customers. As AI-driven scams become more widespread, both consumers and businesses need to remain vigilant to avoid falling prey to these increasingly convincing fraud attempts.

This article was prepared by Chris Green, our Head of Financial Planning. We always appreciate your feedback. If you have enjoyed this article or have any specific topics you would like to see addressed in future newsletters, please email us at FPTeam@city-asset.co.uk.
