
Law enforcement bodies in Malaysia are confronting a surge in AI-driven investment scams in which criminal groups deploy deepfake videos and synthetic voiceovers to impersonate senior political figures and members of royalty to solicit funds from the public. According to complaints reviewed by enforcement-linked organisations, the scams are disseminated primarily via Facebook advertisements that falsely claim endorsement by high-profile
leaders and, in several cases, reference established domestic banks to enhance credibility. Victims are directed into private messaging channels, where they are instructed to transfer funds into personal bank accounts controlled by scammers. Initial deposits commonly begin at approximately RM1,000 before escalating into six-figure losses, with confirmed individual cases exceeding RM100,000.
The Malaysian Chinese Association’s Public Services and Complaints Department has identified a renewed concentration of such cases, noting that the impersonation materials are generated using AI video synthesis and voice cloning to closely replicate real public figures. Once victims engage, scam operators present fabricated investment dashboards, falsified transaction confirmations and staged profit records to prolong the scheme. Withdrawals are systematically blocked using claims of compliance checks or additional fees, a structure consistent with organised investment fraud operations previously linked to broker impersonation and cloned trading platforms.
Regulatory intervention and enforcement pressure
Regulators have moved to suppress the scams’ distribution channels rather than pursue cases individually. The Malaysian Communications and Multimedia Commission has taken enforcement action against scam advertisements and impersonation videos, though authorities acknowledge that the networks reappear quickly under new accounts, reusing recycled AI assets. Investigators assess that many of the receiving bank accounts function as short-term mule accounts, abandoned after rapid fund extraction, which complicates recovery efforts.
Secondary exploitation of victims
Authorities have also flagged a secondary fraud risk: victims are contacted by fake “recovery agents” or unlicensed legal representatives offering fund-retrieval services for upfront fees. Enforcement bodies stress that licensed lawyers do not lawfully advertise such services, and that these offers are an extension of the original fraud cycle.
Investigations remain ongoing as complaints continue to accumulate, with enforcement agencies warning that AI impersonation has made the scams materially harder to detect and has increased victim conversion rates.