Audio Forgery with Artificial Intelligence: Tips to Protect Against Fraud

You must be very cautious when receiving urgent phone calls, especially those requesting money transfers or sensitive information, even if the voice sounds like a relative or friend. In the age of artificial intelligence, forgery can reach people in every aspect of their lives. A type of fraud carried out through AI applications, known as "voice forgery" or "voice manipulation," has recently spread, but you can avoid it by staying alert to these developments and following the advice that cybersecurity and information technology experts gave to Sky News Arabia.

Last March, OpenAI, the generative artificial intelligence company behind the well-known chatbot ChatGPT, introduced a voice cloning tool whose use, it said, would be kept limited to prevent fraud. The tool, named "Voice Engine," can reproduce a person's voice from a 15-second audio sample, according to an OpenAI statement describing the results of a small-scale test. The statement added: "We recognize that the ability to generate human-like voices poses significant risks, especially in this election year," and said the company is "working with American and international partners from governments, media, entertainment, education, civil society, and other sectors" and taking their feedback into consideration.

According to cybersecurity and information technology expert Mohamed Khalef, it has become "very easy" to use AI to place phone calls in voices that sound familiar to the recipient, a practice known as "voice forgery" or "voice manipulation." Explaining how the fake voice is created, Khalef says AI programs are used to mimic known voices or voices harvested from social media. One dangerous use of this forgery is defrauding victims with urgent financial requests by misleading them into believing the caller is someone they know, Khalef notes, warning that such abuses could undermine people's trust in the digital world and slow the pace of digital transformation.
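
To illustrate how low the barrier has become, here is a minimal sketch of voice cloning with a publicly available open-source library. The choice of Coqui TTS and its XTTS v2 model is an assumption for illustration (neither is named in this article), and the file paths are placeholders:

```python
# A minimal sketch with the open-source Coqui TTS library (an illustrative
# assumption, not a tool named in this article); file paths are placeholders.
from TTS.api import TTS

# Load a multilingual voice-cloning model (XTTS v2).
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2")

# Clone the voice from a short reference recording and synthesize new speech.
tts.tts_to_file(
    text="This sentence was never spoken by the person you hear.",
    speaker_wav="reference_clip.wav",  # a short sample of the target voice
    language="en",
    file_path="cloned_output.wav",
)
```

A few seconds of audio, such as a clip lifted from social media, is all such tools need, which is why the experts below advise verifying any caller through a second channel.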

Cybersecurity and information technology expert Tamer Mohamed also urges caution with incoming calls, stating that "AI-generated phone calls have become widespread, and there are applications that place them without the scammer needing any expertise." Fraud of this kind has already touched prominent figures worldwide. A consultant working on the presidential campaign of one of President Joe Biden's Democratic rivals created a program during the presidential primaries that impersonated Biden's voice and used it to urge voters not to take part in the New Hampshire primary. The voices of celebrities such as English actor Stephen Fry and London Mayor Sadiq Khan have also been cloned, with controversial statements attributed to them. In addition, an unnamed executive fell victim to a scam and transferred $243,000 to a fraudster after receiving a fake phone call, according to the British newspaper Daily Mail.

**How Can We Protect Ourselves from Fraud?**

To counter this new threat, Mohamed Khalef and Tamer Mohamed offer the following tips for when you receive a phone call or hear a voice without seeing the speaker:

- Always be skeptical about the authenticity of the call, even if the voice is familiar, especially when the situation seems urgent.

- Try to reach the supposed caller through a different, verifiable communication channel.

- Be cautious about providing personal information or financial data during a call you did not initiate yourself.

- Do not click on links from unknown accounts or websites; they could compromise your phone and allow your voice to be recorded and used for cloning.

- Do not accept any program's terms without making sure you understand them and know what the application does.
