
Megan Thee Stallion was targeted by a sexually explicit deepfake. It’s a huge problem.


Megan Thee Stallion is the latest celebrity to be targeted by a sexually explicit deepfake that was made without her consent — highlighting how pervasive this form of abuse is becoming. 

Deepfakes are videos that often use AI to superimpose someone’s face onto a different body — real or invented — making it appear as though that person is doing whatever happens in the clip. They range from videos that make it seem like a politician gave an interview they never did to nonconsensual sexually explicit videos that swap in people’s faces. 

The latter has become an increasingly common type of sexual abuse, with new apps emerging that enable people to create clips of others they know. As Cleo Abram noted in a Vox video back in 2020, “The most urgent threat of deepfakes isn’t politics. It’s porn.”

In the last week, a deepfake of Megan Thee Stallion featuring sexually explicit images has circulated on X. According to NBC News, the video has garnered tens of thousands of views and been posted by multiple accounts. A spokesperson from X also told the outlet it’s now “proactively removing this content,” since its rules “prohibit the sharing of non-consensual intimate media.”

“It’s really sick how yall go out of the way to hurt me when you see me winning,” Megan Thee Stallion wrote in a statement on X on Saturday. “Yall going too far, Fake ass shit.”

Deepfakes are incredibly violating to those targeted by them and difficult to remedy, since the damage is already done even if they’re taken down. Following her statement on X on Saturday, Megan Thee Stallion was seen getting emotional at a concert, crying while singing “Cobra,” a song that touches on issues related to mental health. (She did not acknowledge the subject at the show.)

Megan Thee Stallion is among a growing list of prominent women who’ve been subject to an offense like this — and who are speaking out against it. Her experience emphasizes the scope of the problem, and the potential it has to harm even more people as the tools that enable it become more available. 

Deepfakes are a growing form of abuse

Megan Thee Stallion’s experience points to how deepfakes have been weaponized in the last few years, including against other celebrities such as Taylor Swift, as well as private individuals. As cybersecurity firm DeepTrace found in 2019, 96 percent of deepfake videos on the internet were pornographic, and nearly all of those depicted women. 

“Deepfake sex videos say to individuals that their bodies are not their own and can make it difficult to stay online, get or keep a job, and feel safe,” Danielle Citron, a Boston University law professor, said in the DeepTrace report. 

These videos are not only traumatic when they emerge; their impact can follow women around and affect their reputation and mental health for years. Such abuse, much like so-called revenge porn — the posting of nude images of women without their consent — is degrading and aimed at taking away their power. “Deepfake sexual abuse is commonly about trying to silence women who speak out,” Clare McGlynn, a law professor at Durham University in the UK, told Glamour.

Before this, Megan Thee Stallion had already condemned the actions of rapper Tory Lanez, who was convicted of shooting her in the feet. She has also borne the brunt of numerous attacks from men questioning her story and undermining her experiences in the years since. 

As Vox’s Anna North has reported, the prevalence of such deepfakes is only expected to grow as AI technology becomes more common and easier to use. In some cases, mobile apps have even allowed high school students to make nonconsensual sexually explicit deepfake images of their classmates. Creators online also now offer custom deepfakes for people looking to create these videos and images of either famous stars or individuals they know. 

A concerning aspect of these acts is the limited recourse people have to combat them. In Megan Thee Stallion’s case, X has been active in taking the videos down, though that hasn’t always been its approach, even with other prominent figures, NBC News reports. Additionally, North writes, federal efforts to pass a law barring such deepfakes are still in progress, and greater accountability from tech companies is needed to truly combat this problem. 


