Nigerian presidential candidate Peter Obi's denial of a viral recording, in which he appeared to describe election campaigns as a religious war, has drawn attention to the dangers of AI technology in the hands of malicious actors. With voice cloning technology, criminals can run more sophisticated phishing scams, fraud, and other schemes by assuming the identities of others, including by exploiting emerging digital know-your-customer (KYC) methods designed to include individuals who lack the traditional identification required by the banking system.
“This technology has unlocked new levels of legitimacy for fraudsters,” says a Nigerian law enforcement agent who spoke to TechCabal. The most popular crimes involve fake business deals or romantic relationships that prey on the loneliness of their victims. “Using poorly mimicked accents, impersonated pictures, and fake video calls, Yahoo Boys have managed to convince Europeans to send them money, ranging from thousands of dollars to millions. The accents are usually bad, and it is astonishing how easily their victims are unable to tell, but this technology can make their lies more believable,” the agent concluded.
However, according to an anonymous scammer who spoke to TechCabal, despite being aware of the advancements in AI-based impersonation, they do not extensively use these tools yet. This is because most of these tools require pre-recorded audio, and when tricking clients, unscripted conversations work best. The scammer explained a trick called “Military Man”, where scammers pose as a white military man in love with the victim, typically a white woman. During video calls, the back camera faces another phone showing a video of a white person whose lips move as if speaking, but the video is muted, and the victim can only hear the scammer mimicking an American or European accent in the background. “Most times, the client may ask to speak to the child, usually a daughter of the man. In such instances, a pre-recorded audio file cloned in an American girl’s voice cannot have the effect that we want,” the scammer revealed. Instead, they sometimes speak themselves or hire girls who can believably speak with an American or British accent to pose as the child.
Using AI for phishing and kidnapping
Other criminals are already having success with voice impersonation technology. With just a short audio clip of a family member’s voice, often obtained from social media, and a voice cloning program, criminals can now pretend to be a loved one or a superior at work to phish for sensitive financial information or simply ask for money outright. In one case, a woman received an abrupt phone call in which she heard her daughter crying, and a man who claimed to be a kidnapper demanded a $1 million ransom. “It was 100% her voice. It was completely her voice; it was her inflexion and how she would have cried,” the mother said in an interview. She only found out later that the voice was an AI speaking from a computer, not her daughter.
Using voice cloning to exploit emerging tech for banking the unbanked
In 2022, the Southern African Fraud Prevention Service (SAFPS) reported a 264% increase in impersonation attacks during the first five months of the year compared to 2021. While there are no figures yet for the current quarter, experts agree that the growing accessibility of AI technology is opening new doors to financial crime on the continent, especially with the nascent trend of financial institutions and fintech apps using voice biometrics for security and in-app activities.
For example, Stanbic IBTC’s mobile banking app allows customers to buy airtime and transfer money to saved beneficiaries using voice commands. Per its website, another bank, Standard Bank in South Africa, enables customers to use their voice to make payments and interbank transactions. This technology, which offers inclusion to customers who have disabilities, can be exploited to steal money from people. “The technology required to impersonate an individual [using voice cloning] has become cheaper, easier to use and more accessible. This means that it is simpler than ever before for a criminal to assume one aspect of a person’s identity,” Gur Geva, founder of remote biometric digital authentication platform iiDENTIFii, said in an email to TechCabal.
This increasing accessibility of AI tools that can be used to scam people threatens the emerging biometric authentication used to drive financial inclusion. “In many countries across sub-Saharan Africa, financial institutions and startups are using voice and facial recognition technologies to onboard unbanked and underbanked customers who do not have access to traditional forms of anti-money laundering (AML)-compliant ID,” says Esigie Aguele, co-founder and CEO of digital identity technology company VerifyMe Nigeria, in an interview with TechCabal. Popular institutions that recently adopted this technology include the Zimbabwean telecoms company Econet Wireless, which offers various digital services. IdentityPass, another KYC company, says the technology is not yet prevalent but is growing steadily, as it has helped several companies worldwide integrate facial recognition solutions into their verification processes.
Tosin Adisa, head of marketing at Prembly, the parent company of IdentityPass, attests that, with the right [voice cloning] tools, a malicious person can use someone else’s identity to create accounts, take out loans they never intend to repay, or engage in other fraudulent transactions.
“Criminals can use AI deep fake tools to exploit this emerging digital know-your-customer (eKYC) technology to create new accounts under false identities and commit financial crimes,” Aguele says.
However, the experts who spoke to TechCabal are optimistic. Geva asserts that “while identity theft is growing in scale and sophistication, the tools we have at our disposal to prevent fraud are intelligent, scalable and up to the challenge.” Adisa says that companies should integrate AI technology that detects whether an identity supplied for KYC is AI-generated. “Certain technologies now detect if a document’s text, or image is AI-generated. When you imbibe such systems into your existing algorithm structure technology, you can combat AI-generated audio and images,” she says in an email to TechCabal.
Aguele’s VerifyMe Nigeria also offers customer insights, and he says that fintech startups should work with KYC companies that can offer data about consumer behaviour, which can alert them to fraud. He also thinks that aside from technology, standardised regulations should be set up to make it harder or more expensive for people to spoof authentication systems using AI-generated media. “The regulations governing eKYC are not yet mature. It is necessary for there to be a KYC sector that can power open finance. Startups should work with the government to create more regulations to standardise the number and process of factor authentication required to open an account on a fintech app so that fintechs will not use the bare minimum just to get customers,” he concluded.