
WhatsApp voice notes put you at risk

Only 60 seconds of audio is needed to clone someone’s voice and use it to impersonate them.

The use of artificial intelligence tools to clone voices has opened an entirely new realm of risk for both companies and individuals.

Generative artificial intelligence (AI) has become a catalyst for change, introducing new ways of doing business, managing data, gathering insights, and collating content. As an intelligent and highly capable technology, it has become a powerful tool in the business toolbox, providing fast analysis, support, and functionality.

However, the immense potential of generative AI is being exploited by cybercriminals, who have harnessed it for malicious purposes, such as creating convincing deepfakes and perpetrating unnervingly realistic voice scams. In 2019, the technology was used to impersonate the voice of the CEO of an energy company in the UK to extort $243,000. In 2021, a company in Hong Kong was defrauded of $35 million. These attacks are not aimed only at large corporates; individuals are also being targeted. Voice-clone scams such as kidnapping hoaxes, requests for money purportedly from friends or family, and fake emergency calls are proving especially difficult to detect.

“The scammers are incredibly clever,” says Stephen Osler, co-founder and business development director at Nclose. “Using readily available tools online, scammers can create realistic conversations that mimic the voice of a specific individual using just a few seconds of recorded audio. While they have already targeted individuals making purchases on platforms like Gumtree or Bob Shop, as well as engaged in fake kidnapping scams, they are now expanding their operations to target high-level executives with C-Suite scams.”

It’s easy to see the potential for cybercriminals, considering how many people use voice notes to quickly pass on instructions to a team member or organise payments. Busy executives often use platforms like WhatsApp to message people while they are driving or rushing between meetings, which makes it difficult, if not impossible, for employees to recognise that a message is fake.

“An IT administrator might receive a voice note from their manager, requesting a password reset for their access to O365,” says Osler. “Unaware of the malicious intent, the administrator complies, thinking it’s a legitimate instruction. However, in reality, they unintentionally provide privileged credentials to a threat actor. This information can then be exploited to gain unauthorised access to critical business infrastructure and potentially deploy ransomware.”

And where are these voice clips coming from? From voice notes sent via platforms like WhatsApp or Facebook Messenger, from social media posts, and from phone calls. Scammers can exploit various methods to create deepfakes, such as recording calls with CEOs or extracting voice samples from videos and posts on individuals’ online profiles.

Cybercriminals have many techniques at their disposal to capture the distinctive voice identity of anyone who has shared their lives online. Subsequently, they employ AI technology to manipulate these recordings, making it appear as though the person is speaking live during the call or voice note.

“This is definitely the next level of cyber threats,” says Osler. “Deepfake technology will only become more proficient at bamboozling victims and breaching organisations. To defend against this, organisations must ensure that they have really strong processes and procedures in place that require multiple levels of authentication, specifically for financial or authentication-based transactions.”

Companies should establish a clearly defined formal process for all transactions. Relying solely on a voice note from the CIO or CISO should not be sufficient to change a password, authenticate a monetary transaction, or grant access to business systems. It is crucial to educate employees and end-users about the evolving risks associated with these threats. If they are aware of this type of scam, they are more likely to take a moment to verify the information before making a costly mistake.

“Always ensure that any voice note or instruction you receive is from a trusted source. It is important to double-check and confirm that the communication is indeed from the intended person,” says Osler. “Cultivate an inquisitive mindset and question the source, whether it is a call, email, or message. By doing so, both organisations and individuals can be better prepared to identify and protect themselves against potential voice cloning scams.”
