
highplainsdem

(61,078 posts)
Tue Feb 17, 2026, 11:50 AM 21 hrs ago

AI Delusions Are Leading to Domestic Abuse, Harassment, and Stalking

Source: Futurism

-snip-

For months, her then-fiancé and partner of several years had been fixating on her and their relationship with OpenAI’s ChatGPT. In mid-2024, she explained, they’d hit a rough patch as a couple; in response, he turned to ChatGPT, which he’d previously used for general business-related tasks, for “therapy.”

Before she knew it, she recalled, he was spending hours each day talking with the bot, funneling everything she said or did into the model and propounding on pseudo-psychiatric theories about her mental health and behavior. He started to bombard the woman with screenshots of his ChatGPT interactions and copy-pasted AI-generated text, in which the chatbot can be seen armchair-diagnosing her with personality disorders and insisting that she was concealing her real feelings and behavior through coded language. The bot often laced its so-called analyses with flowery spiritual jargon, accusing the woman of engaging in manipulative “rituals.”

-snip-

In some videos, he stares into the camera, reading from seemingly AI-generated scripts; others feature ChatGPT-generated text overlaid on spiritual or sci-fi-esque graphics. In multiple posts, he describes stabbing the woman. In another, he discusses surveilling her. (The posts, which we’ve reviewed, are intensely disturbing; we’re not quoting directly from them or the man’s ChatGPT transcripts due to concern for the woman’s privacy and safety.)

-snip-

We’ve identified at least ten cases in which chatbots, primarily ChatGPT, fed a user’s fixation on another real person — fueling the false idea that the two shared a special or even “divine” bond, roping the user into conspiratorial delusions, or insisting to a would-be stalker that they’d been gravely wronged by their target. In some cases, our reporting found, ChatGPT continued to stoke users’ obsessions as they descended into unwanted harassment, abusive stalking behavior, or domestic abuse, traumatizing victims and profoundly altering lives.

-snip-

Read more: https://futurism.com/artificial-intelligence/ai-abuse-harassment-stalking



Futurism asked OpenAI detailed questions about this story. The company hasn't responded.

Much more at the link - no paywall - and I hope you'll read all of it. The third paragraph quoted above - the one starting "In some videos..." - is part of the description of the man's behavior after they broke up and he moved out.
AI Delusions Are Leading to Domestic Abuse, Harassment, and Stalking (Original Post) highplainsdem 21 hrs ago OP
I will say it again UpInArms 21 hrs ago #1
It can be used for evil FakeNoose 20 hrs ago #5
;-{)...... Goonch 21 hrs ago #2
AI has already killed people indirectly/directly Miguelito Loveless 20 hrs ago #3
the very first time a guy involved AI in our personal stuff Skittles 20 hrs ago #4
Thanks for the link, it's very enlightening. Talitha 19 hrs ago #6
I wonder if psychology sees any parallel to cults? Kali 18 hrs ago #7

FakeNoose

(40,938 posts)
5. It can be used for evil
Tue Feb 17, 2026, 01:03 PM 20 hrs ago

In these examples (OP link), the boyfriends and husbands are using the ChatGPT app as a replacement for human interaction. If the wife or girlfriend were actually available, the husband or boyfriend would have preferred the human. Or so we are led to believe.

In actual fact, there's no proof that the guy was giving a fair description of the woman's behavior to the chat app. Anything left out, including any fault or guilt on the guy's part, makes for an incomplete story. Naturally....

So of course the chat app replies in a way that favors the guy's point of view, just as any one-sided friendship would. The real human woman never has a chance; that's how the whole thing is set up. How many husbands get perfect agreement from their own wives? Very few, but they do get it from the ChatGPT "girlfriend."

This shows how hopelessly one-sided ChatGPT is always going to be. It's just another feedback loop that mirrors and confirms the user's point of view that's being fed into it.

Miguelito Loveless

(5,581 posts)
3. AI has already killed people indirectly/directly
Tue Feb 17, 2026, 12:48 PM 20 hrs ago

This will only get worse as it is deployed more widely with fewer, if any, safeguards.

Talitha

(7,805 posts)
6. Thanks for the link, it's very enlightening.
Tue Feb 17, 2026, 02:07 PM 19 hrs ago

And frightening, the way it can overtake some people - like a cult does. I can't understand using it like a 'friend'. Too weird, IMO.

On the flip side of the coin, my daughter-in-law likes it a lot because it's such a time-saver. She uses it at her job to organize her outlined thoughts and summarize them for presentations.
