Facebook and Instagram allowed blurry deepfake nude photos of actress Jenna Ortega, depicting her as a teenager, to be used to promote an app that lets users create fake, explicit photos of anyone with artificial intelligence. The “Wednesday” star was targeted by the Perky AI app, which advertised itself as a tool for creating sexually graphic images of anyone using AI.
According to NBC News, at least 11 advertisements for the Perky AI app featuring a blurry image of Ortega, appearing topless at age 16, ran on the two platforms last month. The app has since been removed from the Apple App Store. The images of the star were fake.
Deepfake Targets Jenna Ortega
The $7.99-per-week app demonstrated to users how to use prompts like “no clothes,” “latex costume,” and “Batman underwear” to produce realistic-looking fake nudes of real people. Users of the app could “enter a prompt to make them look and be dressed as you wish,” according to the description on the Apple App Store.
Perky AI claimed to be able to generate “NSFW,” or “not safe for work,” images at the request of users.
Since September, the AI-powered app has run more than 260 distinct ads on Meta’s platforms. The social media giant removed 30 of those ads for violating its terms.
The advertisement featuring Ortega, now 21, received more than 2,600 views, according to NBC.
Meta, the parent company of both social media platforms, suspended the Perky AI app’s page in response to an inquiry from NBC News.
Another Perky advertisement included a cropped and distorted photo of vocalist Sabrina Carpenter, accompanied by the same claim that artificial intelligence was used to show her nude, according to NBC News.
In Bad Taste
Meta’s advertising business brought in more than $131 billion in 2023, accounting for 95 percent of its total revenue. On its website, Perky AI listed RichAds as its developer. RichAds is a “global self-serve ad network” headquartered in Cyprus that produces push ads.
Meta said in a statement that the social media company “strictly prohibits child nudity, content that sexualizes children, and services offering AI-generated non-consensual nude images.”
The unsettling advertisements are part of a growing problem of AI-generated deepfakes of women and girls appearing online. Among the worst examples are digitally produced child pornography and fake revenge porn created by disgruntled ex-partners.
This is the most recent scandal involving the online distribution of deepfakes of famous people, coming months after AI-generated explicit images of Taylor Swift went viral.
The AI images circulating online reportedly made Swift “furious,” and she was said to be considering legal action against the deepfake porn site that posted them.
The musician was the most recent victim of the website, which flouts state pornography laws and continues to outrun cybercrime units.
Dozens of explicit images of Swift, depicting her in a series of sexualized poses in a stadium while wearing Kansas City Chiefs memorabilia, were posted on Celeb Jihad.
The images had been viewed 47 million times before they were taken down.