Technology experts urge caution over WhatsApp voice notes amid a rise in cybercriminals using generative artificial intelligence to clone the voices of individuals – especially high-level executives.
The uptake of artificial intelligence (AI) has exploded over the past couple of years, and the technology has become a catalyst for change, introducing new ways of doing business, managing data, gathering insights, and collating content.
As an intelligent and highly capable technology, it has become a powerful tool in the business toolbox, providing fast analysis, support, and functionality.
However, it also presents a concerning new threat, as criminals harness these advancements for malicious purposes such as creating convincing deepfakes and perpetrating unnervingly realistic voice scams.
Using artificial intelligence tools to clone voices has introduced an entirely novel realm of risk for both companies and individuals, noted Stephen Osler, Co-Founder and Business Development Director at Nclose.
In 2019, the technology was used to impersonate the voice of the CEO of an energy company in the UK to extort $243,000 (R4.3 million). In 2021, a company in Hong Kong was defrauded of $35 million (R631 million).
These attacks are not just aimed at large corporates; individuals are also now being targeted, Osler said.
Voice-clone scams – such as kidnapping hoaxes, requests for money from friends or family, and fake emergency calls – are proving difficult to detect.
Osler warned that WhatsApp voice notes could become a notable vulnerability for people, especially high-level executives.
“Using readily available tools online, scammers can create realistic conversations that mimic the voice of a specific individual using just a few seconds of recorded audio.
“While they have already targeted individuals making purchases on platforms like Gumtree or Bob Shop, as well as engaged in fake kidnapping scams, they are now expanding their operations to target high-level executives with C-Suite scams,” he said.
It’s easy to see the appeal for cybercriminals, considering how many people use voice notes to quickly pass on instructions to a team member or organise payments, Osler added.
Busy executives often use platforms like WhatsApp to message people while they are driving or rushing between meetings, which makes it difficult, if not impossible, for employees to recognise that the message is fake.
“An IT administrator might receive a voice note from their manager requesting a password reset for their access to O365,” noted Osler.
“Unaware of the malicious intent, the administrator complies, thinking it’s a legitimate instruction. However, in reality, they unintentionally provide privileged credentials to a threat actor. This information can then be exploited to gain unauthorised access to critical business infrastructure and potentially deploy ransomware.”
Voice notes sent via platforms like WhatsApp or Facebook Messenger, social media posts, and phone calls are all fair game.
“Scammers can exploit various methods, such as recording calls with CEOs to create deepfakes or extracting voice samples from videos or posts on individuals’ online profiles,” said Osler.
Cybercriminals also have many techniques at their disposal to capture the distinctive voice identity of anyone who has shared their lives online. Subsequently, they employ AI technology to manipulate these recordings, making it appear as though the person is speaking live during the call or voice note.
“This is definitely the next level of cyber threats,” added Osler. “Deepfake technology will only become more proficient at bamboozling victims and breaching organisations”.
How to avoid becoming a victim
To defend against this, Osler suggests that organisations put robust processes and procedures in place that require multiple levels of authentication, particularly for financial transactions and credential-related requests.
Companies should also establish a clearly defined formal process for all transactions.
“Relying solely on a voice note from the CIO or CISO should not be sufficient to change a password, authenticate a monetary transaction, or grant access to the business,” he said.
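For readers who build internal tooling, the multi-step verification Osler describes can be sketched in code. The snippet below is purely illustrative – all names (`SENSITIVE_ACTIONS`, `is_authorised`, the channel labels) are hypothetical, and it is not Nclose’s actual process – but it captures the core rule: a voice note on its own never authorises a sensitive action.

```python
# Illustrative policy check: sensitive actions require at least two
# independent verification channels, and a voice note alone never counts.

SENSITIVE_ACTIONS = {"password_reset", "payment", "grant_access"}

def is_authorised(action: str, channels_verified: set) -> bool:
    """Return True only if the request clears the verification policy.

    channels_verified holds labels for how the request was confirmed,
    e.g. {"voice_note", "callback_known_number", "approved_ticket"}.
    """
    if action not in SENSITIVE_ACTIONS:
        return True  # routine requests need no extra step
    # Discard the voice note itself; it is the thing being verified.
    independent = channels_verified - {"voice_note"}
    # Require two independent confirmations (e.g. a call-back to a
    # known number plus a ticketed or in-person approval).
    return len(independent) >= 2

# A voice note alone is rejected.
print(is_authorised("password_reset", {"voice_note"}))
# A call-back plus an approved ticket passes.
print(is_authorised("password_reset",
                    {"callback_known_number", "approved_ticket"}))
```

The exact channels will vary by organisation; the point of the design is that no single message – however convincing the voice – can trigger a password reset or payment on its own.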
“It is crucial to educate employees and end-users about the evolving risks associated with these threats. If they are aware of this type of scam, they are more likely to take a moment to verify the information before making a costly mistake”.
Moreover, Osler said people must always ensure that any voice note or instruction they receive is from a trusted source, adding that it’s important to double-check and confirm that the communication is indeed from the intended person.
“Cultivate an inquisitive mindset and question the source, whether it is a call, email, or message. By doing so, both organisations and individuals can be better prepared to identify and protect themselves against potential voice cloning scams,” he said.