
DonaldsRump

(7,715 posts)
Sat Feb 6, 2021, 08:12 PM Feb 2021

Please help me understand: why shouldn't online sites be held accountable

Last edited Sun Feb 7, 2021, 06:55 AM - Edit history (2)

for defamation broadcast through their messaging systems?

The Smartmatic and Dominion defamation lawsuits clearly demonstrate that massive civil damages claims expose broadcasters of defamatory words to serious liability, and may actually change their willingness to allow this nonsense. IMO, this will greatly help to reduce outright lies that are being widely disseminated.

What's wrong with holding online sites that host defamatory content liable, regardless of whether the content is what they generated or what their users generated?

Let me contrast this with an ISP that does not host sites. There is clearly a distinction there.

I would love your thoughts.

On edit: I've found this discussion truly fascinating, and I have enjoyed hearing others' views. I do want to make it clear that I don't profess to know the "best" or "right" answer to this vexing issue. Nor am I advancing any particular solution.

What I am certain of is that SOMETHING must be done. For example, how did the nonsense that Q-anon has propagated spread so widely and so quickly? Through outright lies totally designed to destroy people (e.g., Hillary Clinton) and the organizations of which they are a part (e.g., the Democratic Party). This took around 2-3 years. And look what it's done.

It's good that mainstream social media has taken some significant actions to try to stop this (and Trump's) nonsense. However, all of these purveyors of lies simply move on to another online site.

There has to be a way to hold sites accountable for the content they broadcast/disseminate, whether it's their own content or user-generated content. If not, dangerous garbage like Q-anon will continue to propagate. And foreign actors (e.g., Russia) or domestic actors (i.e., our lovely homegrown domestic terrorists) will use the internet as their unchecked tool to do what they want.

I used to think I was a First Amendment absolutist. I realize that there are clear boundaries, though, with the First Amendment. Defamation is one of them. Using defamation law to stop the spread of lies has clearly worked in the Smartmatic and Dominion lawsuits, merely by the filing of complaints. Lou Dobbs is gone, Fox is being reined in, OAN is tamping down its nonsense. Not for one moment do I believe that it will stop all of their individual or collective nonsense, but it sure doesn't hurt.

The same tool could be applied to online social media and blogging sites. If there are other or better tools, I'd love to hear them. The point is that SOMETHING has to be done: this cannot go on any longer.

I am not talking about chilling opinion (e.g., "I hate Senator X" ). What I am talking about is stopping lies (e.g., "Senator X eats babies. Senator X is a Democrat. Therefore, the Democratic Party is a bunch of baby eaters." ) from propagating so widely and so quickly. It has to stop.

Please help me understand: why shouldn't online sites be held accountable (Original Post) DonaldsRump Feb 2021 OP
should DU be liable for what you post/say here? after all what you think is fine may by msongs Feb 2021 #1
Doesn't that apply to print media as well? DonaldsRump Feb 2021 #3
How many readers' letters do print media publish? Very few. muriel_volestrangler Feb 2021 #10
but they do publish comments dsc Feb 2021 #26
When they publish comments on the web, they're covered by section 230 as well muriel_volestrangler Feb 2021 #47
What does it matter if it destroys sites like DU or Daily Kos? Demsrule86 Feb 2021 #25
I think perhaps frazzled Feb 2021 #5
You could be sued even if it is taken down...it will be abused by the right. Demsrule86 Feb 2021 #27
Good article here. Cattledog Feb 2021 #2
Great cite, and thank you DonaldsRump Feb 2021 #4
TV broadcasts, newsprint articles, etc. have editors. Make7 Feb 2021 #12
I'm actually fine with that DonaldsRump Feb 2021 #13
It is quite simple, they can control what is printed. Demsrule86 Feb 2021 #28
Then there is no accountability and anything goes? DonaldsRump Feb 2021 #46
They are protected but they should be held accountable JI7 Feb 2021 #6
Think of a hosting service or website as a house with owners. cayugafalls Feb 2021 #7
Not exactly DonaldsRump Feb 2021 #8
I'm pretty certain the paper would be found not liable. cayugafalls Feb 2021 #11
I am not talking about opinions DonaldsRump Feb 2021 #16
I do not agree with holding them to the same standards. cayugafalls Feb 2021 #22
It is PRECISELY because of the "massiveness" (if that's a word) of data DonaldsRump Feb 2021 #23
Fight misinformation with information. cayugafalls Feb 2021 #33
No need to make ad hominem arguments DonaldsRump Feb 2021 #35
My apologies, I will edit... cayugafalls Feb 2021 #37
It won't stop there...leave the internet free. Demsrule86 Feb 2021 #29
Ok. So shall we allow allow media to dispense whatever lies they want to? nt DonaldsRump Feb 2021 #32
I, like you, want the internet to remain a free and open community. cayugafalls Feb 2021 #36
Suppose a person put up a fence and invited people to do graffiti and post messages on it Klaralven Feb 2021 #9
If the fence owner takes it down right away, that's relevant DonaldsRump Feb 2021 #14
The fence owner may not know whether the message is defamatory Klaralven Feb 2021 #15
And if the fence owner doesn't, does that make them liable? nt DonaldsRump Feb 2021 #17
I think they have a duty to take it down just as if a copyright holder complains about an owned work Klaralven Feb 2021 #18
I totally agree DonaldsRump Feb 2021 #19
It is a very bad idea... Demsrule86 Feb 2021 #30
How much time should the owner of DU spend reading every post? brooklynite Feb 2021 #20
What about a small town newspaper? DonaldsRump Feb 2021 #21
An editorial decision to publish false statements as fact is already liable for civil action... brooklynite Feb 2021 #39
Just Consider the Volume of the Posts on DU TuskMoar Feb 2021 #43
That's exactly the issue DonaldsRump Feb 2021 #44
No, the jury system is not a vetting system muriel_volestrangler Feb 2021 #48
It is the start of a vetting system DonaldsRump Feb 2021 #49
You'd be laughed out of Court if you presented the Jury system as a defense..... brooklynite Feb 2021 #53
Glad to hear you think DU's jury system is worthless DonaldsRump Feb 2021 #54
It works terribly as an arbiter of FACT.... brooklynite Feb 2021 #55
I kinda' understand the issue of civil liability DonaldsRump Feb 2021 #56
An example of the jury system in action on Discussionist muriel_volestrangler Feb 2021 #67
I totally agree DonaldsRump Feb 2021 #68
DU's Vetting System Would Not Hold Legal Muster TuskMoar Feb 2021 #59
I have said this multiple times DonaldsRump Feb 2021 #61
GAWKER was Bankrupted for Reporting the Truth TuskMoar Feb 2021 #63
DU has a system in place to remove posts . Twitter KNEW the shit Trump was posting and did nothing JI7 Feb 2021 #50
If you allow websites to be sued for content posted by others, you open the door for Demsrule86 Feb 2021 #24
I am totally in favor of DU, but I am not in favor of other sites DonaldsRump Feb 2021 #31
And that's fine. Dr. Strange Feb 2021 #64
In Spite of What People Say TuskMoar Feb 2021 #34
So you are saying anything is fine? nt DonaldsRump Feb 2021 #38
I am Saying You Are Proposing the Wrong Solution TuskMoar Feb 2021 #40
So you're fine with anything goes then? DonaldsRump Feb 2021 #45
That is Not at All What I Said TuskMoar Feb 2021 #57
The critical issue is with those fora that have no rules DonaldsRump Feb 2021 #58
Hold the Person Who Made the Post TuskMoar Feb 2021 #71
See posts 69 and 70 DonaldsRump Feb 2021 #72
I understand that you want something done Marrah_Goodman Feb 2021 #66
True, but you could say that about many small businesses, online or not? DonaldsRump Feb 2021 #69
Might as well just imprison everyone who opposes you if you want to go that far ansible Feb 2021 #41
That's the way it is for print and broadcast media. DonaldsRump Feb 2021 #42
Right , there are ALREADY standards in place to control these things. Just apply it to social media JI7 Feb 2021 #51
Agreed. DonaldsRump Feb 2021 #52
My opinion, Trump is an immoral, dishonest skunk who deserves jail. KentuckyWoman Feb 2021 #60
Better options: BarackTheVote Feb 2021 #62
Frankly, the courts would be overwhelmed with lawsuits Marrah_Goodman Feb 2021 #65
See my post at #69 DonaldsRump Feb 2021 #70

msongs

(67,502 posts)
1. should DU be liable for what you post/say here? after all what you think is fine may by
Sat Feb 6, 2021, 08:16 PM
Feb 2021

deemed libelous by another lawsuit-happy entity

DonaldsRump

(7,715 posts)
3. Doesn't that apply to print media as well?
Sat Feb 6, 2021, 08:23 PM
Feb 2021

Last edited Sat Feb 6, 2021, 09:12 PM - Edit history (1)

In today's world, there are probably very few PURE print media with no online presence. But I am curious about the analogy.

I hold no views, but it does seem to me that if an entity is in the business of providing content, WHATEVER the source, there needs to be more than just a modicum of accountability.

muriel_volestrangler

(101,414 posts)
10. How many readers' letters do print media publish? Very few.
Sat Feb 6, 2021, 08:45 PM
Feb 2021

Because they have to read them first, and consider whether they'll cause trouble. If all social media, blog and discussion board content has to be pre-approved (not necessarily legally, just by someone with a general idea of what's acceptable and what's not), the web will work at a fraction of its current pace, and could require people wishing to comment to pay (small blogs can cope with checking a few comments without charging, but DU couldn't, let alone Twitter or Facebook).

dsc

(52,173 posts)
26. but they do publish comments
Sat Feb 6, 2021, 10:11 PM
Feb 2021

I do wonder why there is a massive difference between the comments posted on the NYT webpage and the letters to the editor of the NYT.

muriel_volestrangler

(101,414 posts)
47. When they publish comments on the web, they're covered by section 230 as well
Sun Feb 7, 2021, 05:55 AM
Feb 2021

The available space in print is limited; they'd be reading and judging them to decide which are the best anyway. The web can have thousands if it wants; the Disqus system allows you to order them by "most liked" since you don't want to plough through them all.

frazzled

(18,402 posts)
5. I think perhaps
Sat Feb 6, 2021, 08:33 PM
Feb 2021

they should if a post threatening violence against an individual or institution were allowed to stand. That doesn’t happen here I think because such posts are removed, and I suppose the poster might be banned.

It gets dicier when the issue is false information. I see that with more frequency, but other posters usually step in to point out the facts.

Calling someone (a public figure) a nasty name is probably not subject to liability, since it’s just an opinion.

Cattledog

(5,923 posts)
2. Good article here.
Sat Feb 6, 2021, 08:18 PM
Feb 2021
https://www.nytimes.com/2020/05/28/business/section-230-internet-speech.html

When the most consequential law governing speech on the internet was created in 1996, Google.com didn’t exist and Mark Zuckerberg was 11 years old.

The federal law, Section 230 of the Communications Decency Act, has helped Facebook, YouTube, Twitter and countless other internet companies flourish.

But Section 230’s liability protection also extends to fringe sites known for hosting hate speech, anti-Semitic content and racist tropes like 8chan, the internet message board where the suspect in the El Paso shooting massacre posted his manifesto.

The First Amendment protects free speech, including hate speech, but Section 230 shields websites from liability for content created by their users. It permits internet companies to moderate their sites without being on the hook legally for everything they host. It does not provide blanket protection from legal responsibility for some criminal acts, like posting child pornography or violations of intellectual property.

DonaldsRump

(7,715 posts)
4. Great cite, and thank you
Sat Feb 6, 2021, 08:32 PM
Feb 2021

I'm not an expert on the CDA and particularly Section 230.

However, I am scratching my head trying to understand how websites that carry user content are different from a purely print medium that carries defamatory content either from its own employees/agents or from users (e.g., letters to the editor, OpEds, etc.).

I truly don't understand the distinction.

Make7

(8,543 posts)
12. TV broadcasts, newsprint articles, etc. have editors.
Sat Feb 6, 2021, 08:56 PM
Feb 2021

Content does not get broadcast or printed without being produced and reviewed by people employed by the company.

Social media content from users gets posted without any review by anyone. Some sites are better at moderating that content after it's posted than others.

Facebook has 2.8 billion users. If Facebook were legally liable for anything any user posted, they would not allow people to post a single word before an employee reviewed it. So they would need a lot more employees, and your post might be displayed in a week or two.

DonaldsRump

(7,715 posts)
13. I'm actually fine with that
Sat Feb 6, 2021, 09:01 PM
Feb 2021

Last edited Sat Feb 6, 2021, 10:07 PM - Edit history (1)

Otherwise, do we want Qanon etc to have completely free rein to post whatever they/it wants?

If there is no curtailment of the nonsense that has happened for the last many years, posters can pretty much do what they want. I am not talking about pure opinion such as "I hate Senator X". I am talking about "I hate Senator X because she eats babies", when there is no doubt that Senator X does not eat babies.

Why shouldn't the forum hosting the latter be held liable for that?

DonaldsRump

(7,715 posts)
46. Then there is no accountability and anything goes?
Sun Feb 7, 2021, 05:54 AM
Feb 2021

From what I gather, that's basically the situation now. Anyone can engage in a massive distortion/lie online (e.g., "Hillary Clinton eats babies" ) and face no consequences?

That clearly allows social media etc. to be weaponized.

Social media can clearly control what content is hosted on their sites. It can do so with internal vetting, a jury system consisting of users, or both.

I don't have a view, but I truly am struggling to see the difference between broadcast/print media having responsibility for its own content, or even for letters to the editor/op-ed pieces that legally constitute defamation, versus social media, where the content is generated by users. The social media site is providing the forum to express the user's statement (and is quite possibly monetizing that). Why shouldn't the social media site have a legal responsibility to ensure there is no defamatory content?

Otherwise, we will remain in the situation we are in today. Just think about Qanon. How did it arise in just two years? Social media. How did it spread? Through blatant lies.

cayugafalls

(5,669 posts)
7. Think of a hosting service or website as a house with owners.
Sat Feb 6, 2021, 08:37 PM
Feb 2021

The owners invite people over for a party.

One of the party goers says something defamatory and slanderous about another guest.

The owner of the house is then sued and his house and money are seized all because he had a guest whose speech was deemed inappropriate.

Seems fair, right?

DonaldsRump

(7,715 posts)
8. Not exactly
Sat Feb 6, 2021, 08:40 PM
Feb 2021

The house owner does not broadcast to literally billions of listeners.

Put it this way: if any newspaper in its print edition carried a letter to the editors saying that Public Person X had definitely engaged in horribly illegal behavior, I'm pretty certain that the newspaper would be held liable. Why does that not apply to online fora?

cayugafalls

(5,669 posts)
11. I'm pretty certain the paper would be found not liable.
Sat Feb 6, 2021, 08:54 PM
Feb 2021

The person was not employed by or representing the paper in any way. Letters to the editor are simply one person's viewpoint and not an indication of the views of said paper. It even states that in most newspapers' LTE sections.

The problem with holding online forums responsible for the content and views of their guests is that you then cross the line into holding all citizens accountable for the speech of others if they are "on your forum".

Free speech is just that. The current laws on the books for online forums protect the owners from being held liable for content that is not expressly their own. Hosting improper content in the form of speech does not and should not make you accountable for what others say, while exercising their right to free speech.

The internet should be free from censorship of any kind and once you begin censoring speech and holding "owners" accountable you give up control to those whose opinions may be more nefarious than yours, i.e. the government. The only speech that will be allowed in your scenario will be speech that is "approved".

No thank you.

DonaldsRump

(7,715 posts)
16. I am not talking about opinions
Sat Feb 6, 2021, 09:08 PM
Feb 2021

I am talking about the difference between "Senator X is awful" vs. "Senator X kills and eats babies." The former is opinion and the latter is a statement of fact.

I am virtually certain that letters to the editor and op-eds are subject to some standard of a defamation check. Why wouldn't online purveyors of erroneous facts be held to that same standard?

The First Amendment's right to free speech constrains only the government, not private actors. And defamation is a CLEARLY accepted exception to an absolutist "right to free speech."

cayugafalls

(5,669 posts)
22. I do not agree with holding them to the same standards.
Sat Feb 6, 2021, 09:59 PM
Feb 2021

Simply put, the sheer volume of data posted to an online forum like Twitter is massive: approximately 500-600 million posts per day. Assume 20-30 words on average and you have to scan over 1.5e10 words each and every day. That is equal to the content of over 182 years of the New York Times.

I like the direction that the author of that article takes, which is to allow AI and human intelligence to monitor for certain forms of hate speech and misinformation and, when found, rather than delete the data and false information, provide a clear and concise rebuttal based on scientific fact and truth-based information. Inform those who are ill-informed; teach rather than prosecute. "Fight misinformation with information."

A good discussion, nonetheless.
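The volume arithmetic above can be sanity-checked with a quick sketch. Note that the posts-per-day and words-per-post figures are the poster's rough estimates, and the words-per-NYT-issue figure is my own illustrative assumption, not a sourced number:

```python
# Back-of-the-envelope check of the Twitter-volume arithmetic above.
# All inputs are rough estimates for illustration, not sourced figures.
posts_per_day = 550e6   # midpoint of the 500-600 million estimate
words_per_post = 25     # midpoint of the 20-30 word estimate

total_words = posts_per_day * words_per_post
print(f"{total_words:.2e} words/day")  # on the order of 1.5e10, as stated

# Compare with the New York Times, assuming (my assumption) a daily
# print issue runs very roughly 200,000 words.
nyt_words_per_issue = 200_000
years_of_nyt = total_words / nyt_words_per_issue / 365
print(f"about {years_of_nyt:.0f} years of daily NYT issues")
```

With these assumed inputs the result lands in the same ballpark as the "over 182 years of the New York Times" claim, so the comparison holds up as an order-of-magnitude estimate.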

DonaldsRump

(7,715 posts)
23. It is PRECISELY because of the "massiveness" (if that's a word) of data
Sat Feb 6, 2021, 10:02 PM
Feb 2021

that online fora have an even greater responsibility, NOT a lesser responsibility. That is the medium in which they operate, and they must adopt appropriate standards, since their reach is potentially FAR wider.

cayugafalls

(5,669 posts)
33. Fight misinformation with information.
Sat Feb 6, 2021, 10:17 PM
Feb 2021

I will have to leave my answer here.

My opinion is clear, simply read the title.

As your avatar denotes...Peace.

DonaldsRump

(7,715 posts)
35. No need to make ad hominem arguments
Sat Feb 6, 2021, 10:22 PM
Feb 2021

I am asking this as a matter of reason. I am for peace, but that does not change the basic question: is it fine to allow online media to get away with lies because they are online?

Simply understand the very basic question posted. I am not hell bent on anything.

Demsrule86

(68,825 posts)
29. It won't stop there...leave the internet free.
Sat Feb 6, 2021, 10:13 PM
Feb 2021

Newspapers can be sued. Has it stopped the right-wing lies? No.

cayugafalls

(5,669 posts)
36. I, like you, want the internet to remain a free and open community.
Sat Feb 6, 2021, 10:24 PM
Feb 2021

I simply pointed to an article that promotes, rather than censorship or fines, fighting misinformation with information.

At every turn we have to be prepared to push back on lies and dangerous speech with truth and facts.

I am a member of the EFF, so I get what you're saying.

Peace.

https://www.eff.org

 

Klaralven

(7,510 posts)
9. Suppose a person put up a fence and invited people to do graffiti and post messages on it
Sat Feb 6, 2021, 08:45 PM
Feb 2021

Should the person who put up the fence be responsible for what is written on it?

I think that the person owning the fence should be responsible only for identifying who wrote what, so that the originators of the messages can be sued for slander, etc. by the wronged person.

DonaldsRump

(7,715 posts)
14. If the fence owner takes it down right away, that's relevant
Sat Feb 6, 2021, 09:02 PM
Feb 2021

If the fence owner lets it stay in perpetuity, that's an entirely different issue.

 

Klaralven

(7,510 posts)
15. The fence owner may not know whether the message is defamatory
Sat Feb 6, 2021, 09:08 PM
Feb 2021

In the case of web sites, if the allegedly defamed individual complains, most do take the messages down.

 

Klaralven

(7,510 posts)
18. I think they have a duty to take it down just as if a copyright holder complains about an owned work
Sat Feb 6, 2021, 09:12 PM
Feb 2021

DonaldsRump

(7,715 posts)
19. I totally agree
Sat Feb 6, 2021, 09:17 PM
Feb 2021

A "take down" notice to a fence owner has as much relevance as it does to a website forum. If neither complies, they should both hold themselves, and be held, liable.

I marvel at what the Smartmatic and Dominion lawsuits have done to some of the worst purveyors of lies (not opinions...OUTRIGHT LIES) like Fox News and its employees. They are terrified of massive civil damages. That is a very good thing.

I honestly am wondering why this same standard should not apply to fence owners, to websites, to MSM, and to political hacks like Roger Stone, etc. We could cure so many of the ills that face us.

If you put yourself out to the public, you have responsibility. The issue is the extent of that responsibility.

brooklynite

(95,012 posts)
20. How much time should the owner of DU spend reading every post?
Sat Feb 6, 2021, 09:35 PM
Feb 2021

The rule doesn’t apply to just big money making platforms.

DonaldsRump

(7,715 posts)
21. What about a small town newspaper?
Sat Feb 6, 2021, 09:49 PM
Feb 2021

So, if the website owner is "small", they are free to post anything, whether it's their own content or a user's post? There is no liability whatsoever for complete factual lies conveyed on their site?

brooklynite

(95,012 posts)
39. An editorial decision to publish false statements as fact is already liable for civil action...
Sat Feb 6, 2021, 11:18 PM
Feb 2021

The underlying issue of Section 230 is whether a publisher should be responsible for something that ANOTHER author (frequently an anonymous one) chooses to write without editorial review. Some websites and social media platforms will hold up publication of a post until moderation occurs. Most do not.

I am one of the few people here who doesn't hide my identity. As a consequence, I think carefully about what I post as "fact" as opposed to "opinion".

TuskMoar

(83 posts)
43. Just Consider the Volume of the Posts on DU
Sun Feb 7, 2021, 04:50 AM
Feb 2021

and multiply that by other sites like it. This is NOT the same as a small town or a big town newspaper wherein everything that is in the paper is vetted prior to publication.

DonaldsRump

(7,715 posts)
44. That's exactly the issue
Sun Feb 7, 2021, 04:58 AM
Feb 2021

Last edited Sun Feb 7, 2021, 05:34 AM - Edit history (1)

Why shouldn't it be vetted?

DU does this via MIRT, the jury system, etc. As such, it has an excellent vetting process. By contrast, other sites have NOTHING. Why shouldn't they be exposed to liability for having nothing in place to safeguard against defamation?

You're making my point.

On edit: Let me rephrase my last sentence. "You're making the point for vetting/liability."

I actually don't have a view on this whole issue. What I'm looking for is principled reasons why a social media or blog site should be held to a different standard than broadcast or print media. If there is no legal liability, social media can be (and is being) used as a disinformation weapon. That's my starting point, and I'm trying to find solutions to that.

muriel_volestrangler

(101,414 posts)
48. No, the jury system is not a vetting system
Sun Feb 7, 2021, 06:04 AM
Feb 2021

It waits until people have seen the problematic post; it relies on some member deciding to alert; and it relies on random members making a subjective judgement call. If you wanted to see that it's not a process that can always be trusted, you could look at Discussionist when it existed; the alerts and juries were used as weapons, and frequently produced shit results.

MIRT is a different animal; it's basically ideological. It keeps out the kind of people who made Discussionist a swamp, both in their posting and their use of/service on juries.

DonaldsRump

(7,715 posts)
49. It is the start of a vetting system
Sun Feb 7, 2021, 06:13 AM
Feb 2021

Usually, the last step before defamation actions commence is a "take down" notice sent to the hosting/broadcasting entity.

There are MANY steps that can be taken to stop defamatory words from being disseminated. Juries and legal letters are two of them.

My point is that there needs to be some accountability. Otherwise, there is a free-for-all and anything goes (like it is today).

Look where it's gotten us. Qanon, for example, grew into a monster in just two years, all predicated on complete lies that relied on immune social media to reach millions (billions?) of people.

How exactly do you stop that? Someone or something needs to be liable. Otherwise, it will continue. And it can't continue any longer.

DonaldsRump

(7,715 posts)
54. Glad to hear you think DU's jury system is worthless
Sun Feb 7, 2021, 11:03 AM
Feb 2021
It works very well. Maybe not for other sites, but DU's is really good.

It is a rapid-fire way of getting rid of a lot of problematic posts.

Please read what I said: it's part of an overall anti-defamation liability strategy. In addition, there's internal administration and then there are lawyers sending "take down" notices. Companies use escalation strategies like this all the time to have a quick and cost-effective way to prevent legal issues from arising.

brooklynite

(95,012 posts)
55. It works terribly as an arbiter of FACT....
Sun Feb 7, 2021, 11:22 AM
Feb 2021

...which is what a potential libel/slander lawsuit would be based on.

Here are the Jury Alert qualifications:

Civility
No personal attacks or flaming
No divisive group attacks
No bigotry/insensitivity

Political
Support Democrats
Don't bash Democratic public figures
Don't peddle right-wing talking points, smears, or sources
Don't keep fighting the last Democratic presidential primary

Content
Don't interfere with forum moderation
No graphic content
No kooky, extremist, or hate content
No commercial spam

OP Content
Don't violate a forum or group's Statement of Purpose

Legal/Administrative
Respect copyrights
Don't post anyone's private or personal information
No malware, phishing, cracking, or other malicious code
Don't post anything that violates U.S. law
Don't use an avatar or signature line that violates any of the other rules

None of these categories challenge the factual nature of a post. That is what a civil Court action would be based on.

DonaldsRump

(7,715 posts)
56. I kinda' understand the issue of civil liability
Sun Feb 7, 2021, 11:34 AM
Feb 2021

I'm a lawyer.

And a website that takes precautions is going to be looked upon more favorably than one that doesn't. As I have said, it's part of (but not exclusively) an overall strategy to mitigate liability.

And I totally disagree with you on the scope of DU's ToS. There are at least two categories that are directly relevant to factual assertions: "No kooky, extremist, or hate content" and "Don't post anything that violates U.S. law." I've been on juries or MIRT where factual correctness was at issue under either of those categories.

For example, if I posted "Senator X murdered Baby G and ate her", I'm pretty sure that the DU jury, possibly MIRT, and/or the Admins would have that post deleted in a matter of minutes or seconds under the "no kooky" category. No need to get lawyers involved. And it shows that DU has a responsible system in place to flag flagrant issues like my example.

muriel_volestrangler

(101,414 posts)
67. An example of the jury system in action on Discussionist
Sun Feb 7, 2021, 03:34 PM
Feb 2021

Someone posted that all French Imams should be shot. I alerted on it, and it was left standing - I think 5-2 was the verdict in its favour. It was hate content, but a majority of the jury agreed with it, so they were happy. It depends on having a reasonable population to draw from, and an online forum that allows the far right on can't manage that.

DonaldsRump

(7,715 posts)
68. I totally agree
Sun Feb 7, 2021, 03:42 PM
Feb 2021

I don't really post much. DU's my principal frame of reference, and there are an unusually high number of extremely well-informed, honest, and well-meaning folks here.

That probably skews my thinking a bit about juries!

TuskMoar

(83 posts)
59. DU's Vetting System Would Not Hold Legal Muster
Sun Feb 7, 2021, 01:07 PM
Feb 2021

The "community" cannot be the arbiter of truth.

Moreover, the legal protections against frivolous lawsuits are toothless. Can someone remind me about the details of the website that was purposely put into bankruptcy for posting a vetted and truthful article about a very rich person, who retaliated with a losing lawsuit and bankrupted the website? Sorry, I don't have the time and energy to look it up now.

DonaldsRump

(7,715 posts)
61. I have said this multiple times
Sun Feb 7, 2021, 01:11 PM
Feb 2021

The jury system is part of a liability mitigating strategy. It is not the only tool. Admin, moderators, jury, and lawyers are the main parts of the strategy.

Frivolous lawsuits can CLEARLY be countered: here's an example of one just posted on DU - https://www.democraticunderground.com/100215066849 . Lawyers bringing frivolous cases can also be sanctioned by their licensing authority.

TuskMoar

(83 posts)
63. GAWKER was Bankrupted for Reporting the Truth
Sun Feb 7, 2021, 02:09 PM
Feb 2021

Tesla has sued for unfavorable press many times. Lawsuits are expensive and risky and if I owned a social media platform, I wouldn't "trust" the ability of said sanctions to keep me out of legal jeopardy.

JI7

(89,289 posts)
50. DU has a system in place to remove posts . Twitter KNEW the shit Trump was posting and did nothing
Sun Feb 7, 2021, 06:38 AM
Feb 2021

about it until the terrorist attack and an innocent officer was killed.

Demsrule86

(68,825 posts)
24. If you allow websites to be sued for content posted by others, you open the door for
Sat Feb 6, 2021, 10:08 PM
Feb 2021

attacks on sites like DU, which won't be able to afford it in the end, and it will diminish our ability to get out the progressive message.

DonaldsRump

(7,715 posts)
31. I am totally in favor of DU, but I am not in favor of other sites
Sat Feb 6, 2021, 10:15 PM
Feb 2021

whether big or small that allow lies to be blasted out to untold numbers of persons.

Dr. Strange

(25,929 posts)
64. And that's fine.
Sun Feb 7, 2021, 03:17 PM
Feb 2021

You can view DU as different and have a different standard for it. But the law can't. If the law allows websites to be taken down for users posting lies, then you're going to see that law weaponized. And smaller websites won't be able to protect themselves.

TuskMoar

(83 posts)
34. In Spite of What People Say
Sat Feb 6, 2021, 10:18 PM
Feb 2021

Social media can be and often is a force for good. I am not willing to burn it all down because some use it negatively. Holding these sites accountable will be the end of all social media.

TuskMoar

(83 posts)
40. I am Saying You Are Proposing the Wrong Solution
Sun Feb 7, 2021, 04:07 AM
Feb 2021

The solution you are proposing is chemotherapy. Moreover, social media cannot be stopped. These policies will do nothing but push it underground, where it is even more difficult to monitor. I "might" be in favor of requiring people to provide IDs to social media sites and making those IDs subject to subpoena. The privacy nuts would bemoan such a policy, but it would give people a way to sue the people making the posts, not the hosting website itself. In other words, I am in favor of reducing the anonymity of social media and making people take responsibility for their posts, but not making those who make the forum available responsible for the posts.

DonaldsRump

(7,715 posts)
45. So you're fine with anything goes then?
Sun Feb 7, 2021, 05:15 AM
Feb 2021

Do nothing and hold no one accountable?

That is currently the situation. Look where it has gotten us.

Then look at what the Smartmatic and Dominion lawsuits did to quell disinformation. That can be abused too, but folks (like the lawyers and their clients) can be held liable by courts if there is abusive litigation.

On edit: here's a perfect example of how abusive/frivolous litigation can be challenged. https://www.democraticunderground.com/100215065976

TuskMoar

(83 posts)
57. That is Not at All What I Said
Sun Feb 7, 2021, 01:00 PM
Feb 2021

Read my post. I am in favor of holding posters accountable for their posts. I am not in favor of holding the forums accountable. Equating social media to newspaper and broadcast media is inaccurate and misleading. I am posting on DU right now and you and I are having a debate in nearly real time. If my post or your post (or anyone else's) has to be cleared by a legal team prior to posting, our debate is not possible to have.

Although I stipulate that there is a problem with the spread of disinformation, and I am open to policies that will inhibit the spread of disinformation, I will not be supportive of your proposed solution to hold social media websites or apps (DU, KOS, Twitter, FB, etc.) liable for the posts that are made on those platforms. Mark Zuckerberg is a weird dude about whom I have a high degree of skepticism, but he does not believe in Jewish Space Lasers, and he should not be accountable for MTG's posting on FB. I am fully in favor of holding MTG to account for her posts on FB, but not FB or Mark Zuckerberg.

I guess I just don't get why equating internet forums with print and broadcast media makes sense. They are very clearly different. I also don't get why holding the people who make and distribute the information accountable, instead of the platforms, is not enough. It can be done, and it is not that hard to do.

Social media is a new superpower that can be and is used for both good and evil. I am not yet ready to give up the good to quash the evil. Without social media, would we have known about the key role Stacey Abrams played in Georgia? I don't think I would have known about it. Because of social media, I had the opportunity to donate to the GA Senate races. I don't live in GA. Those Senate races were nationalized and won because of social media. And this is only one of many examples of the benefits of social media. Your proposal would require every single post to be cleared by a legal team prior to being made public.

If your proposal were to be put into effect, I could not even complain about my Tesla (I don't own one). But do you know how litigious Elon Musk is about negative Tesla reviews? Because of his history of lawsuits over negative reviews, many professional car reviewers won't even review the car unless they have a well-funded legal department. DU, Kos, FB, and Twitter would die if everything had to be vetted prior to being made public.

DonaldsRump

(7,715 posts)
58. The critical issue is with those fora that have no rules
Sun Feb 7, 2021, 01:05 PM
Feb 2021

How do you deal with them? No one is accountable there.

And if someone states on a forum that "Senator X killed and ate a baby" and no one does anything about it, why shouldn't the forum be held accountable?

TuskMoar

(83 posts)
71. Hold the Person Who Made the Post
Sun Feb 7, 2021, 04:01 PM
Feb 2021

Accountable. I am in favor of policies to reduce anonymity on forums and holding posters accountable. Applying old standards of print media to modern social media is unreasonable.

DonaldsRump

(7,715 posts)
72. See posts 69 and 70
Sun Feb 7, 2021, 04:11 PM
Feb 2021

It's being done for data privacy and consumer protection, around the world and in the US, for all companies, online or otherwise. There's no reason it couldn't be done for defamation carried by online companies, and there are many compelling reasons to do it.

Of course, all of this requires (a) a will to do it; (b) legislative changes; and (c) creation of a body with regulatory oversight.

I don't disagree with your idea to require the online company to get verifiable ID to hold the individual liable, but you need to have a law to force that to occur too. To my knowledge, there isn't such a law. So there's no way to enforce this except by voluntary compliance.

In fact, Senator Wyden, who was one of the original sponsors of the CDA, warned tech companies that they may lose Section 230 protection if they don't shape up. For companies like Twitter and Facebook that are publicly traded, they need to step up to the plate and, to some degree, they have.

I am far more concerned about private companies funded by who knows who. Unless they are legally compelled to do something and there is a meaningful sanction if they don't, they won't do anything. For some of them, their raison d'être is to disseminate damaging information, whether true or not, to achieve their goals. That's what needs to be addressed.

Marrah_Goodman

(1,586 posts)
66. I understand that you want something done
Sun Feb 7, 2021, 03:31 PM
Feb 2021

But what you are missing is that if people can sue message boards for speech, that will be the end of sites like this one. The owners would shut down rather than have to spend all their time and money fighting lawsuits.

DonaldsRump

(7,715 posts)
69. True, but you could say that about many small businesses, online or not?
Sun Feb 7, 2021, 03:48 PM
Feb 2021

Online companies, big and small, have learned to deal with data privacy regulations around the world and in the US (e.g., the GDPR in the EU, California, etc.). Why can't there be something similar for the accuracy of information?

Algorithms coupled with human intervention, notice of a proposed offense, and a time to cure (e.g., by taking down the offending post that contains a defamatory lie) before imposing civil liability could easily work. Of course, you would need a regulator with that power. Such authority already exists for consumer protection at both the federal and state levels. It could work for defamation as well.

DonaldsRump

(7,715 posts)
42. That's the way it is for print and broadcast media.
Sun Feb 7, 2021, 04:38 AM
Feb 2021

Last edited Sun Feb 7, 2021, 06:00 AM - Edit history (1)

Why is this different?

And let's be clear: we are NOT talking about imprisonment (although in other countries defamation can be a criminal act).

We are talking about monetary damages that seem to me to act as a great deterrent from the rapid and widespread dissemination of clear lies.

JI7

(89,289 posts)
51. Right, there are ALREADY standards in place to control these things. Just apply them to social media
Sun Feb 7, 2021, 06:40 AM
Feb 2021

also.

DonaldsRump

(7,715 posts)
52. Agreed.
Sun Feb 7, 2021, 06:55 AM
Feb 2021

SOMETHING must be done. What exactly it is, I'm not sure. However, defamation laws seem like a good starting point.

If you get a second, I edited my OP based on the fascinating discussion thus far. I don't profess to know what to do; I just know we have to do SOMETHING!

KentuckyWoman

(6,701 posts)
60. My opinion, Trump is an immoral, dishonest skunk who deserves jail.
Sun Feb 7, 2021, 01:11 PM
Feb 2021

The law would allow skunks to sue DU for defamation.

230 is good in intent, but isn't narrow enough. It needs to be fixed.

BarackTheVote

(938 posts)
62. Better options:
Sun Feb 7, 2021, 01:11 PM
Feb 2021

1) Civics classes to teach people how the government actually works and how to be responsible and discerning members of a democratic society; propaganda-awareness.
2) Poster anonymity should probably become a thing of the past; all accounts should be verified. That would combat bot farms and create more transparency about who you’re talking to online, while also giving people an easier path to legal recourse if they feel threatened by a poster. And for that matter—
3) Government agencies should be far more involved in monitoring social media, and they should take threats seriously. Maybe there should be a new agency that does nothing but coordinate with social media platforms. There should be rules requiring social media companies not just to delete flagged posts, but to report them to the relevant agencies—that said, such power should not be in the hands of automation; if a post is flagged with the right key words, it should be manually reviewed by a human being.

Essentially, they need to start treating social media like... the real world... and take the threats that appear on these platforms seriously. The correct course, however, is not to overreact and destroy the internet as we know it.

Social media platforms are like big coffee shops. Different groups sit at different tables, talk among themselves, sometimes mingle, sometimes invite others to join them. And there’s a back room where shady people do shady shit; that back room is bugged, but for some reason, there’s rarely any pre-emptive action and authorities rely almost entirely on someone at the bar—a waitress, or another patron—reporting the shady shit to a manager, who usually just kicks out the person who was caught saying something offensive or violent. Essentially, we’re letting the free market police, basically all by its lonesome, the largest agora in human history.

Edit: One thing I will say is that the algorithms need to be re-tuned. Right now, they’re dumb bots that favor “engaging content”—which, more often than not, boils down to “controversial” content. Falling down a rabbit hole of conspiracy theories leads to a lot of viewer retention, which is all the algorithm wants. Algorithms should be required to function in a more responsible way; that much is 100% true.

Marrah_Goodman

(1,586 posts)
65. Frankly, the courts would be overwhelmed with lawsuits
Sun Feb 7, 2021, 03:29 PM
Feb 2021

And no company would want to host message boards again.

DonaldsRump

(7,715 posts)
70. See my post at #69
Sun Feb 7, 2021, 03:55 PM
Feb 2021

Online companies, big and small, already have to comply with data privacy requirements in the US and around the world. And it wouldn't necessarily result in litigation absent a failure to remove the defamatory post.

Of course, it would require creating a regulator. But the Federal Trade Commission and the individual states already have consumer protection powers. We'll likely have data privacy regulators here too.

Let me give you an example. These days, I am learning not to trust reviews on Amazon and the like. Apparently, some unscrupulous third-party vendors engage in something called "brushing," which allows them to post fake reviews on Amazon. If Amazon does not police this practice (i.e., fake reviews posted by third parties), I am pretty sure the feds and/or the state AGs could go after Amazon for allowing it to continue. That's not really different from an online site having liability and a duty to cure defamatory statements.

Latest Discussions»General Discussion»Please help me understand...