YouTube has deleted more than 1,000 deepfake scam ad videos of celebrities from its platform.
YouTube said it is “investing heavily” to stop AI celebrity scam ads.
After a 404 Media probe into such fake celebrity ads, YouTube deleted more than 1,000 videos tied to an advertising ring that used AI to make celebrities like Taylor Swift, Steve Harvey, and Joe Rogan promote Medicare scams.
According to the report, the videos had amassed nearly 200 million views, and both users and celebrities regularly complained about them.
YouTube said it is “aware” that its platform is being used to run AI-generated ads featuring celebrities and that it is working to stop such deepfakes.
The YouTube action came as non-consensual deepfake porn of Taylor Swift went viral on X, with one post garnering more than 45 million views and 24,000 reposts before it was removed.
The post was live on the platform for around 17 hours before its removal.
A 404 Media report found that the images may have originated in a Telegram group where users share explicit AI-generated images of women.
Users in the group also reportedly joked about how the images of Swift went viral on X.
According to research from cybersecurity firm Deeptrace, about 96 percent of deepfakes are pornographic, and they almost always depict women.
Bijay Pokharel