General Discussion
Former Biden official Nina Jankowicz: I Shouldn't Have to Accept Being in Deepfake Porn (Atlantic)
https://www.theatlantic.com/ideas/archive/2023/06/deepfake-porn-ai-misinformation/674475/

Archive page: https://archive.ph/OjlIW
Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn't shocked. For more than a year, I have been the target of a widespread online harassment campaign, and deepfake porn (whose creators, using artificial intelligence, generate explicit video clips that seem to show real people in sexual situations that never actually occurred) has become a prized weapon in the arsenal misogynists use to try to drive women out of public life. The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology, and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent. Many commentators have been tying themselves in knots over the potential threats posed by artificial intelligence: deepfake videos that tip elections or start wars, job-destroying deployments of ChatGPT and other generative technologies. Yet policy makers have all but ignored an urgent AI problem that is already affecting many lives, including mine.
Last year, I resigned as head of the Department of Homeland Security's Disinformation Governance Board, a policy-coordination body that the Biden administration let founder amid criticism mostly from the right. In subsequent months, at least three artificially generated videos that appear to show me engaging in sex acts were uploaded to websites specializing in deepfake porn. The images don't look much like me; the generative-AI models that spat them out seem to have been trained on my official U.S. government portrait, taken when I was six months pregnant. Whoever created the videos likely used a free face-swap tool, essentially pasting my photo onto an existing porn video. In some moments, the original performer's mouth is visible while the deepfake Frankenstein moves and my face flickers. But these videos aren't meant to be convincing: all of the websites and the individual videos they host are clearly labeled as fakes. Although they may provide cheap thrills for the viewer, their deeper purpose is to humiliate, shame, and objectify women, especially women who have the temerity to speak out. I am somewhat inured to this abuse, after researching and writing about it for years. But for other women, especially those in more conservative or patriarchal environments, appearing in a deepfake-porn video could be profoundly stigmatizing, even career- or life-threatening.
As if to underscore video makers' compulsion to punish women who speak out, one of the videos to which Google alerted me depicts me with Hillary Clinton and Greta Thunberg. Because of their global celebrity, deepfakes of the former presidential candidate and the climate-change activist are far more numerous and more graphic than those of me. Users can also easily find deepfake-porn videos of the singer Taylor Swift, the actress Emma Watson, and the former Fox News host Megyn Kelly; Democratic officials such as Kamala Harris, Nancy Pelosi, and Alexandria Ocasio-Cortez; the Republicans Nikki Haley and Elise Stefanik; and countless other prominent women. By simply existing as women in public life, we have all become targets, stripped of our accomplishments, our intellect, and our activism and reduced to sex objects for the pleasure of millions of anonymous eyes.
Men, of course, are subject to this abuse far less frequently. In reporting this article, I searched the name Donald Trump on one prominent deepfake-porn website and turned up one video of the former president, and three entire pages of videos depicting his wife, Melania, and daughter Ivanka. A 2019 study from Sensity, a company that monitors synthetic media, estimated that more than 96 percent of deepfakes then in existence were nonconsensual pornography of women. The reasons for this disproportion are interconnected, and are both technical and motivational: The people making these videos are presumably heterosexual men who value their own gratification more than they value women's personhood. And because AI systems are trained on an internet that abounds with images of women's bodies, much of the nonconsensual porn that those systems generate is more believable than, say, computer-generated clips of cute animals playing would be.
-snip-
More at the link.
She concludes by saying that with AI "more powerful by the month, adapting the law to an emergent category of misogynistic abuse is all the more essential to protect women's privacy and safety. As policy makers worry whether AI will destroy the world, I beg them: Let's first stop the men who are using it to discredit and humiliate women."
157 recommendations, 22 replies, 8111 views
Posted by highplainsdem, Jun 2023 (OP)
It should be illegal to use someone's likeness in something in which they weren't a participant.
SouthernDem4ever, Jun 2023, #3