
ancianita

(36,132 posts)
Mon Mar 29, 2021, 02:00 PM Mar 2021

Beware Of Facebook CEOs Bearing Section 230 Reform Proposals

I want Democrats to be as smart about tech -- undoing what Wm Barr and the DOJ would have done -- as they are about undoing much of the other attempted damage by pendejo45's autocrats.

Previous discussion on 230: https://upload.democraticunderground.com/100214781635

I hope Senator Warren gives the 230 issue more time and thought as it comes out of the House Committee. Techdirt reports more reasons why the House and Senate need to be smart about 230:

Facebook is now perhaps the leading voice for changing 230, because the company knows that it can survive without it. Others? Not so much.

Last February, Zuckerberg made it clear that Facebook was on board with the plan to undermine 230. Last fall, during another of these Congressional hearings, he more emphatically supported reforms to 230.

And, for tomorrow's hearing, [Thurs, March 25] he's driving the knife further into 230's back by outlining a plan to further cut away at 230. ...

Four separate times, Zuckerberg describes programs that Facebook has created to deal with those kinds of things as "industry-leading." But those programs are incredibly costly ... 35,000 people working in "safety and security," ... more than triple the 10,000 people in that role five years ago.

So, these proposals to create a "best practices" framework, judged by some third party, in which you only get to keep your 230 protections if you meet those best practices, won't change anything for Facebook. Facebook will argue that its practices are the best practices. That's effectively what Zuckerberg is saying ... But that will harm everyone else who can't match that. Most companies aren't going to be able to do this ...

The politics of this obviously make sense for Facebook. It's not difficult to understand how Zuckerberg gets to this point. Congress is putting tremendous pressure on him and continually attacking the company's perceived (and certainly, sometimes real) failings.
So, for him, the framing is clear: set up some rules to deal with the fake problem that so many insist is real -- that there is "no incentive" for companies to do anything about disinformation and other garbage -- knowing full well that:

(1) Facebook's own practices will likely define "best practices," or

(2) Facebook will have enough political clout to make sure that any third-party body that determines these "best practices" is thoroughly captured, so that Facebook skates by. But all those other platforms? Good luck. It will create a huge mess as everyone tries to sort out what "tier" they're in, and what they have to do to avoid legal liability -- when they're all already trying all sorts of different approaches to deal with disinformation online.

Indeed, one final problem with this "solution" is that you don't deal with disinformation by homogenization. Disinformation and disinformation practices continually evolve and change over time. The amazing and wonderful thing that we're seeing in the space right now is that tons of companies are trying very different approaches to dealing with it, and learning from those different approaches. That experimentation and variety is how everyone learns and adapts and gets to better results in the long run, rather than saying that a single "best practices" setup will work. Indeed, zeroing in on a single best practices approach, if anything, could make disinformation worse by helping those with bad intent figure out how to best game the system. The bad actors can adapt, while this approach could tie the hands of those trying to fight back.

Indeed, that alone is the very brilliance of Section 230's own structure. It recognizes that the combination of market forces (users and advertisers getting upset about garbage on the websites) and the ability to experiment with a wide variety of approaches is how best to fight back against the garbage -- by letting each website figure out what works best for its own community.
As I started writing this piece, Sundar Pichai's testimony for tomorrow was also released. And it makes this key point about how 230, as is, is the best way to deal with misinformation and extremism online...


[excerpts from Sundar Pichai's testimony]

Our ability to provide access to a wide range of information and viewpoints, while also being able to remove harmful content like misinformation, is made possible because of legal frameworks like Section 230 of the Communications Decency Act.

Section 230 is foundational to the open web: it allows platforms and websites, big and small, across the entire internet, to responsibly manage content to keep users safe and promote access to information and free expression. Without Section 230, platforms would either over-filter content or not be able to filter content at all. In the fight against misinformation, Section 230 allows companies to take decisive action on harmful misinformation and keep up with bad actors who work hard to circumvent their policies.

Thanks to Section 230, consumers and businesses of all kinds benefit from unprecedented access to information and a vibrant digital economy. Today, more people have the opportunity to create content, start a business online, and have a voice than ever before. At the same time, it is clear that there is so much more work to be done to address harmful content and behavior, both online and offline.

Regulation has an important role to play in ensuring that we protect what is great about the open web, while addressing harm and improving accountability. We are, however, concerned that many recent proposals to change Section 230—including calls to repeal it altogether—would not serve that objective well. In fact, they would have unintended consequences—harming both free expression and the ability of platforms to take responsible action to protect users in the face of constantly evolving challenges.

We might better achieve our shared objectives by focusing on ensuring transparent, fair, and effective processes for addressing harmful content and behavior. Solutions might include developing content policies that are clear and accessible, notifying people when their content is removed and giving them ways to appeal content decisions, and sharing how systems designed for addressing harmful content are working over time. With this in mind, we are committed not only to doing our part on our services, but also to improving transparency across our industry.


That's standing up for the law that helped enable the open internet, not tossing it under the bus because it's politically convenient. It won't make politicians happy. But it's the right thing to say -- because it's true.


https://www.techdirt.com/articles/20210324/10392546486/beware-facebook-ceos-bearing-section-230-reform-proposals.shtml