$25 million deepfake scam exposes new threat: How Ghanaian businesses can avoid losing millions to AI fraud

Cyber criminals are increasingly deploying generative artificial intelligence to execute sophisticated fraud schemes, and Ghanaian businesses must urgently strengthen their payment verification processes or risk becoming the next victims, a leading fraud expert has warned.

In a detailed advisory issued by Wells Fargo's Fraud Education and Awareness Program, Anil G. Khilnani revealed that 79 per cent of organisations worldwide reported being victims of attempted or actual payments fraud in 2024—a sharp rise from 65 per cent just two years earlier.

The warning comes as Ghana's own digital payments landscape expands rapidly, with mobile money transactions and online banking becoming the norm for businesses and individuals alike. The same technologies that make transactions convenient also create new vulnerabilities that fraudsters are exploiting with alarming sophistication.

The $25 Million Deepfake Call That Shocked the World

One recent scam made international headlines when fraudsters netted more than $25 million from a company in Hong Kong. A finance employee authorised a series of payments to the criminals after believing the attendees on a corporate video call were actual co-workers—including the company's chief financial officer. These "participants" turned out to be high-tech video deepfakes so realistic that the employee was unable to detect the scam.

According to Khilnani, situations like these are becoming increasingly common. Gen AI capabilities make it faster and easier for bad actors to churn out phishing emails, deceptive text messages, calls, and videos—all with much higher quality than in the past. The technology also enables scammers to automate and scale up their attempts so they can target more companies, more often.

How Fraudsters Are Using AI

The advisory identified three primary ways criminals are deploying generative AI. First, they are creating realistic phishing emails and fake websites. The days of checking for bad spelling, poor grammar, and sloppy logos are over; today's Gen AI versions are highly sophisticated and difficult to distinguish from legitimate communications.

Second, fraudsters are impersonating trusted sources with deepfake calls, images, and videos. These Gen AI elements can sound or look like someone an employee knows, and it takes only a few video clips or digital soundbites to spoof an actual person. Scammers recognise that employees are more likely to process a fraudulent payment if they think the voice on the other end of the line is their boss, a colleague, or the CFO.

Third, criminals are building connections through complex storytelling. Automated chatbots and other Gen AI tools help fraudsters build trust with a target. They can quickly gather information from social media and other sources, conduct detailed conversations via multiple communication channels, and use specific details to manipulate an employee.

Why Banks Cannot Always Help

The advisory noted that schemes like phishing and business email compromise can make it nearly impossible for banks or payment providers to detect unauthorised payments. When an employee authorises a payment or unintentionally shares bank account details, login credentials, or other sensitive personal data with scammers, the resulting transactions can look just like regular payments.

This makes ongoing employee training a critical element of any company's fraud prevention strategy.

Protecting Your Organisation

The advisory outlined several best practices to help organisations defend against fraudsters. First, businesses should strengthen their email filtering technology by implementing an AI-based secure email gateway. These solutions use behavioural analysis and other techniques to filter out imposter domains and business email compromise attempts.

Second, companies must implement robust payment-verification processes. If a vendor emails to request a payment instruction change, staff should not hit "reply"—doing so would respond directly to the imposter if the email is fraudulent. Instead, they should call the known contact using the phone number already in the system.

If a business leader contacts an employee with an unusual request, the employee should follow up using a different method. Any payment the employee is asked to keep secret, send to a new country or payee, or process with extreme urgency should be treated as a red flag for scams.
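For teams that automate parts of their payment workflow, the red flags described above can be sketched as a simple screening check. This is an illustrative example only; the field names are hypothetical and not taken from any real payment system.

```python
def payment_red_flags(request):
    """Return a list of scam warning signs found in a payment request.

    The keys below are illustrative placeholders for signals a finance
    team might track: secrecy, a new payee or country, and urgency.
    """
    flags = []
    if request.get("keep_secret"):
        flags.append("requester asked for secrecy")
    if request.get("new_payee"):
        flags.append("payment to a new payee")
    if request.get("new_country"):
        flags.append("payment to a new country")
    if request.get("extreme_urgency"):
        flags.append("extreme urgency")
    return flags


# Example: an urgent, secret payment to a new payee trips three flags
# and should be escalated for out-of-band verification.
suspicious = {"keep_secret": True, "new_payee": True, "extreme_urgency": True}
print(payment_red_flags(suspicious))
```

Any request that trips even one flag should be verified by calling the known contact on a number already in the system, never by replying to the message that made the request.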

Third, businesses should use dual custody for payments and user administration. Having two employees involved provides a second opportunity to catch fraudsters. One employee should initiate a payment or change, while a separate employee—using a different computer—approves the transaction or request. The same checks and balances should apply to designating user access and entitlements for critical internal systems and bank portals.
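The dual-custody rule above, requiring two different employees on two different computers, can be expressed as a short validation check. This is a minimal sketch under assumed field names (`employee_id`, `device_id`), not the interface of any real banking portal.

```python
def dual_custody_ok(initiator, approver):
    """Approve a payment or user-access change only when the approver is a
    different employee working from a different computer than the initiator."""
    return (
        initiator["employee_id"] != approver["employee_id"]
        and initiator["device_id"] != approver["device_id"]
    )


# Two distinct employees on distinct machines: the change may proceed.
init = {"employee_id": "E101", "device_id": "PC-7"}
appr = {"employee_id": "E205", "device_id": "PC-12"}
print(dual_custody_ok(init, appr))  # → True

# The same employee approving their own request is rejected.
print(dual_custody_ok(init, init))  # → False
```

Checking the device as well as the identity matters: if a single compromised workstation could both initiate and approve, the second set of credentials offers little protection.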

Fourth, ongoing fraud prevention training is vital. Companies must create a culture of awareness and cybersecurity, giving extra attention to anyone involved in money movement and vendor management. Security and verification should be emphasised over immediacy; taking even a few more minutes to confirm that a payment-related communication is authentic can help prevent significant loss.

The Bottom Line

As AI continues to disrupt the way people live and work, businesses that stay current with the technology's capabilities will have the best chance of understanding its potential, using AI wisely, and protecting their organisations from fraud and risk. For Ghanaian businesses, where digital payments are increasingly central to commerce, the message is clear: verify everything, trust nothing, and train everyone.
