Fraudsters are using cheap AI tools to fake celebrity endorsements, running convincing scam ads across popular platforms.
AI-generated deepfakes of Taylor Swift, Rihanna, and other celebrities are increasingly being used in scam videos circulating on TikTok, according to an investigation by Copyleaks and reporting from The Verge.
The videos rely on a familiar trick: scammers take real clips from interviews, podcasts, red carpet appearances, or talk shows, then layer in AI-cloned voices and edited visuals to make it appear that the celebrities are endorsing fake money-making offers.
In one example, a fabricated Swift clip promotes a supposed “TikTok Pay” program, while a fake Rihanna video claims users can earn money simply by watching content and giving opinions.
The scam content is designed to look authentic at first glance. Many of the clips incorporate TikTok branding and familiar platform-style visuals to create a sense of legitimacy, but the links often redirect users away from the app and toward external websites that may collect personal information.
Investigators say scammers also use heavy filters, compression, and other visual effects to hide the warning signs of manipulation, including unnatural lighting, awkward facial movement, and audio that does not fully sync with the speaker’s lips.
The issue is not limited to one platform. YouTube has removed more than 1,000 AI-generated scam ads from a single operation that used celebrity deepfakes to promote fraudulent Medicare offers, while Meta has also acknowledged facing large volumes of scam advertising across its platforms. The broader pattern suggests that deepfake fraud is becoming a platform-wide problem rather than an isolated abuse of one app or one type of content.
The software tools used to create these deepfakes have become cheaper and more convincing. Voice cloning and video-generation software can now produce polished results with little technical skill, which makes celebrity impersonation scalable and harder to spot in real time.
Experts recommend that users treat any unexpected celebrity endorsement with caution, especially if it includes a promise of easy money, requests for personal details, or a link that sends them outside the platform.