
highplainsdem

(49,270 posts)
Sun Jun 25, 2023, 10:18 AM Jun 2023

Former Biden official Nina Jankowicz: I Shouldn't Have to Accept Being in Deepfake Porn (Atlantic)

https://www.theatlantic.com/ideas/archive/2023/06/deepfake-porn-ai-misinformation/674475/
Archive page at https://archive.ph/OjlIW

Recently, a Google Alert informed me that I am the subject of deepfake pornography. I wasn’t shocked. For more than a year, I have been the target of a widespread online harassment campaign, and deepfake porn—whose creators, using artificial intelligence, generate explicit video clips that seem to show real people in sexual situations that never actually occurred—has become a prized weapon in the arsenal misogynists use to try to drive women out of public life. The only emotion I felt as I informed my lawyers about the latest violation of my privacy was a profound disappointment in the technology—and in the lawmakers and regulators who have offered no justice to people who appear in porn clips without their consent. Many commentators have been tying themselves in knots over the potential threats posed by artificial intelligence—deepfake videos that tip elections or start wars, job-destroying deployments of ChatGPT and other generative technologies. Yet policy makers have all but ignored an urgent AI problem that is already affecting many lives, including mine.

Last year, I resigned as head of the Department of Homeland Security’s Disinformation Governance Board, a policy-coordination body that the Biden administration let founder amid criticism mostly from the right. In subsequent months, at least three artificially generated videos that appear to show me engaging in sex acts were uploaded to websites specializing in deepfake porn. The images don’t look much like me; the generative-AI models that spat them out seem to have been trained on my official U.S. government portrait, taken when I was six months pregnant. Whoever created the videos likely used a free “face swap” tool, essentially pasting my photo onto an existing porn video. In some moments, the original performer’s mouth is visible while the deepfake Frankenstein moves and my face flickers. But these videos aren’t meant to be convincing—all of the websites and the individual videos they host are clearly labeled as fakes. Although they may provide cheap thrills for the viewer, their deeper purpose is to humiliate, shame, and objectify women, especially women who have the temerity to speak out. I am somewhat inured to this abuse, after researching and writing about it for years. But for other women, especially those in more conservative or patriarchal environments, appearing in a deepfake-porn video could be profoundly stigmatizing, even career- or life-threatening.

As if to underscore video makers’ compulsion to punish women who speak out, one of the videos to which Google alerted me depicts me with Hillary Clinton and Greta Thunberg. Because of their global celebrity, deepfakes of the former presidential candidate and the climate-change activist are far more numerous and more graphic than those of me. Users can also easily find deepfake-porn videos of the singer Taylor Swift, the actress Emma Watson, and the former Fox News host Megyn Kelly; Democratic officials such as Kamala Harris, Nancy Pelosi, and Alexandria Ocasio-Cortez; the Republicans Nikki Haley and Elise Stefanik; and countless other prominent women. By simply existing as women in public life, we have all become targets, stripped of our accomplishments, our intellect, and our activism and reduced to sex objects for the pleasure of millions of anonymous eyes.

Men, of course, are subject to this abuse far less frequently. In reporting this article, I searched the name Donald Trump on one prominent deepfake-porn website and turned up one video of the former president—and three entire pages of videos depicting his wife, Melania, and daughter Ivanka. A 2019 study from Sensity, a company that monitors synthetic media, estimated that more than 96 percent of deepfakes then in existence were nonconsensual pornography of women. The reasons for this disproportion are interconnected, and are both technical and motivational: The people making these videos are presumably heterosexual men who value their own gratification more than they value women’s personhood. And because AI systems are trained on an internet that abounds with images of women’s bodies, much of the nonconsensual porn that those systems generate is more believable than, say, computer-generated clips of cute animals playing would be.

-snip-



More at the link.

She concludes by saying that with AI "more powerful by the month, adapting the law to an emergent category of misogynistic abuse is all the more essential to protect women’s privacy and safety. As policy makers worry whether AI will destroy the world, I beg them: Let’s first stop the men who are using it to discredit and humiliate women."
22 replies
Former Biden official Nina Jankowicz: I Shouldn't Have to Accept Being in Deepfake Porn (Atlantic) (Original Post) highplainsdem Jun 2023 OP
Thanks for sharing this GPV Jun 2023 #1
Haley Stevens from Michigan (my rep) was on this back in 2019..... Takket Jun 2023 #2
Good for her, and thanks for the link! highplainsdem Jun 2023 #19
It should be illegal to use someone's likeness in something in which they weren't a participant SouthernDem4ever Jun 2023 #3
Issue is how do you track down social media IbogaProject Jun 2023 #6
They typically invoke the broad protections of parody Orrex Jun 2023 #11
It's defamation IbogaProject Jun 2023 #15
To be clear, I'm not defending them. Orrex Jun 2023 #17
I suppose it might matter if the material is clearly represented as parody... Silent3 Jun 2023 #21
Wow 😯 There need to be laws against this and quickly. n/t iluvtennis Jun 2023 #4
This is why men make such lousy moral arbiters Warpy Jun 2023 #5
There are "revenge porn" laws but the problem is finding the maker oldsoftie Jun 2023 #7
And AI has made creating deepfakes so much easier. highplainsdem Jun 2023 #20
A 15 year old local boy took his own life MontanaMama Jun 2023 #8
God, how tragic! highplainsdem Jun 2023 #16
There's also a case in which a Pennsylvania woman created fake images of her daughter's school rival Orrex Jun 2023 #18
Kick and so recommend. As the great historian Mary Beard bronxiteforever Jun 2023 #9
Truth. JudyM Jun 2023 #22
Felony NowISeetheLight Jun 2023 #10
That toothpaste is out of the tube. There's no way to put it back. Scrivener7 Jun 2023 #12
Find the weasel that did it, charge him with assault JanMichael Jun 2023 #13
their deeper purpose... 2naSalit Jun 2023 #14

Takket

(21,832 posts)
2. Haley Stevens from Michigan (my rep) was on this back in 2019.....
Sun Jun 25, 2023, 10:39 AM
Jun 2023

Last edited Sun Jun 25, 2023, 10:51 PM - Edit history (1)

She co-sponsored a bill that would have accelerated technology to detect and "call out" deepfake videos. The consequences of them are pretty dire. Besides deepfake porn, a well-timed deepfake of, say, Joe Biden taking cash from a Chinese official could swing an election.

https://haleystevensforcongress.com/michigan-advance-stevens-introduces-bipartisan-bill-to-combat-deepfakes/

Stevens' bill was passed by the House but appears to have died in Senate committee.

I think every deepfake video should have to carry a flag/disclaimer at the beginning and end that it is not real. People should also have some sort of control over their own images. It is obviously a gray area. A video created for satire (like using someone's likeness for a political cartoon) should be allowed as long as it is clear it is not real, but videos targeting a person's character for humiliation and outright defamation, like using them in porn, should not be allowed, and hosting platforms should be required to take them down.

Problem is how do you find them? These affected women shouldn't have to scour porn sites all day looking for their likeness so they can demand they be taken down.........

SouthernDem4ever

(6,618 posts)
3. It should be illegal to use someone's likeness in something in which they weren't a participant
Sun Jun 25, 2023, 11:14 AM
Jun 2023

I don't care if it's porn or any other situation. It's not a matter of free speech, it's libelous speech. This doesn't apply to public situations depicting what really happened or satire with look-alikes, just nefarious actors who will use a likeness to purposely slander.

IbogaProject

(2,901 posts)
6. Issue is how do you track down social media
Sun Jun 25, 2023, 11:45 AM
Jun 2023

While I'm a super liberal progressive, I've been anti-pornography for a while now. I'm all for First Amendment self-expression. What I think should be banned is any "Assignment of Copyright," which would short-circuit the predatory porn that preys upon young, often poor women who are still in their teens. Among the icky series are Casting Couch, First Time Video, Bang Bus and many more. I only became aware when a woman became Miss Delaware and some predatory porn company promptly started monetizing off it. She had been fresh out of foster care and did a clip to make some money to try and get a start.

This should be easy for the courts to decide. First, our trademark and copyright office can decide that the copyright on your nude likeness is automatic and must remain owned by the individual. Then basically move that any assignments are null and void and anyone selling or broadcasting them is liable for infringement. Also, any payments made for those ill-gotten "rights" should be deemed unrecoverable. I'm OK with an artist or actor retaining ownership, which should include an ability to rescind distribution, though that could get tricky as money is invested in this filth.

Orrex

(63,375 posts)
11. They typically invoke the broad protections of parody
Sun Jun 25, 2023, 12:38 PM
Jun 2023

It's one of those know-it-when-you-see-it things, and I frankly don't trust our current crop of legislators or judges to deal with the matter competently.

IbogaProject

(2,901 posts)
15. It's defamation
Sun Jun 25, 2023, 04:00 PM
Jun 2023

I suggest an end run around this by saying the copyright to your nude likeness can't be signed away under any contract b.s. And obscenity shouldn't be a legitimate form of parody.

Orrex

(63,375 posts)
17. To be clear, I'm not defending them.
Sun Jun 25, 2023, 06:10 PM
Jun 2023

If the intent is to harm or defame, then that's indefensible and should subject the responsible parties to the full penalty of law.

However, the "nude likeness" rule would likely be struck down on First Amendment grounds, because such images will be claimed to be artistic expression. Even if it stands, the people creating the images could easily "censor" the images in such a way as to insist that the images *aren't* nude (e.g., by conforming to YouTube's monetization standards).

As for the defamation, the image creators would seek to avoid charges by attaching a disclaimer a la "This image is manufactured and does not represent the real Orrex, who has not consented to the use of their image."


Again, I'm not defending the practice, but we need to be realistic about what defenses they will use. And we must also be cautious; many laws envisioned to guard against fake nudes can easily be used to protect against real but unflattering photos, or even conventional artistic renderings like editorial cartoons.



Silent3

(15,546 posts)
21. I suppose it might matter if the material is clearly represented as parody...
Sun Jun 25, 2023, 10:25 PM
Jun 2023

...versus foisted off as if it's real, or in some nebulous space in between.

If this kind of fake is now easy to do, its power to cause much harm to anyone will quickly vanish. The general assumption about any seemingly revealing explicit video will soon be that it's a fake. That doesn't mean the target of such videos won't possibly feel humiliated or exploited by such fakes, but the power to cause reputational damage will be slight.

Warpy

(111,641 posts)
5. This is why men make such lousy moral arbiters
Sun Jun 25, 2023, 11:45 AM
Jun 2023

They thunder about abortion then groove on shit like this in their off time.

Don't bother to deny it. Religious patriarchs always have a predator-prey relationship with women, at best.

Laws won't do it, it will just move offshore if it's not there already.

I hope Melania has talked to Barron about this shit. You know he's found it by now.

oldsoftie

(12,769 posts)
7. There are "revenge porn" laws but the problem is finding the maker
Sun Jun 25, 2023, 12:23 PM
Jun 2023

Many years ago, even before trump, I was telling my friends that deepfakes were going to be a big problem in politics. You could make a video of a candidate saying ANYTHING and it would spread like wildfire even if it was proven fake.
Now we've got that AND this kind of shit

MontanaMama

(23,409 posts)
8. A 15 year old local boy took his own life
Sun Jun 25, 2023, 12:25 PM
Jun 2023

Last edited Sun Jun 25, 2023, 04:04 PM - Edit history (1)

after a classmate distributed his likeness in a pornographic social media post. It was Snapchat, so it was gone quickly, but the damage was done. He didn't think he could talk to his parents about it, and guns were readily available in his house, and now he's dead.

Orrex

(63,375 posts)
18. There's also a case in which a Pennsylvania woman created fake images of her daughter's school rival
Sun Jun 25, 2023, 06:12 PM
Jun 2023

It's a big problem that will only get bigger, and our current laws seem ill-constructed to handle it, just as our legislators and judges are poorly qualified to address it.

bronxiteforever

(9,287 posts)
9. Kick and so recommend. As the great historian Mary Beard
Sun Jun 25, 2023, 12:30 PM
Jun 2023

wrote:
“For a start it doesn’t much matter what line you take as a woman, if you venture into traditional male territory, the abuse comes anyway. It is not what you say that prompts it, it’s simply the fact that you’re saying it…. When it comes to silencing women, Western culture has had thousands of years of practice.”
― Mary Beard, Women & Power: A Manifesto

JanMichael

(24,920 posts)
13. Find the weasel that did it, charge him with assault
Sun Jun 25, 2023, 12:47 PM
Jun 2023

The foul little piece of s*** needs to pay.

And I say assault instead of battery because battery requires touching but assault does not.

2naSalit

(87,406 posts)
14. their deeper purpose...
Sun Jun 25, 2023, 02:58 PM
Jun 2023
their deeper purpose is to humiliate, shame, and objectify women, especially women who have the temerity to speak out.


Before all this high tech, they used to just call your boss and spread rumors about you.
