Right now, companies like Facebook are not held responsible for the content that people post on their platforms. This is different from newspapers, which are responsible for the content they publish. Typically, it is those on the political left who say that Big Tech should do more to police the content on their platforms. From time to time, Democrats talk about introducing new laws to require them to do so.
Those on the political right think Big Tech does too much moderation and is silencing conservative voices.
I think that the broad immunity granted to Big Tech platforms by Section 230 of the Communications Decency Act should be removed. I think Big Tech platforms should be held responsible for the content posted to their platforms. The way the internet works right now is untenable. There is no such thing as a "global public square," nor should we try to make one. It's not even desirable.
But I think that holding a tech platform, such as your very own Sanityville, responsible for the content produced on it has numerous benefits:
Platforms will be required to vet their users to a much greater degree. This is a good thing. People should have to prove themselves before having their voice amplified by someone else's platform.
Right now, we have rooms full of people watching and reading the most horrific things imaginable all day, every day, in order to censor it from YouTube, Facebook, and the like. This, too, is simply untenable. Again, I think holding the tech platforms themselves responsible for the content will make them much less willing to let people post. This is a good thing.
If you provide and "own" a digital space, you should be held responsible for the condition of that space.
If Facebook is held responsible for what you post, they will make that fact abundantly clear to you at every turn. It will become clear that you are there as a guest. I think this is good, because right now they make you feel like you own the place, like you are in charge of your own little section of Facebook. But you don't, and you aren't. Let's make that obvious.
I think this would motivate platforms to make their own positions more clear, and to be more open about why they censor what they do. Again, this is all to the good.
I suspect that this would result in more, smaller platforms out there. I like that idea.
It is said that Section 230 of the Communications Decency Act "created the internet." Without it, the argument goes, the Big Tech platforms, with lots to lose, would have been overly cautious about what they allowed to be posted, and many wouldn't have bothered to let the teeming masses post at all.
Well, I've seen the internet as we currently have it, and I just don't think it works.
Disagreements would still be allowed to occur on any given platform. The New York Times or the Wall Street Journal could both post opposing viewpoints, and there should be a healthy back-and-forth.
And, of course, you should be able to have your own blog and post whatever you want. That's fine. But you won't be amplified by someone else's platform. The hosting service you use, and your domain name registrar, should not be held responsible for any of your content.
I agree with this. More, smaller platforms are, I think, better. Certainly easier to "police."
I don't get this. Isn't Twitter or Facebook basically a hosting service? This seems inconsistent. Twitter is sometimes called a "microblog." It seems like WordPress would be held to the same standard. Is it a size thing, or the way a blog seems more personal and intentional? Is it that blogs are harder to access, since they don't just come up as you scroll through Twitter?
No. I'm trying to create a distinction between "platforms," like Twitter, and "hosting services," like GoDaddy. We use a company called discoursehosting.com to host this forum. They should not be held responsible for the content here; we should be.
Joseph, I think that your point about the feed is crucial to this discussion. A good test is this: who controls what you see in the "feed" on the platform? You, or the platform itself? The more they control it, the more they should be held responsible for the content.
Another way to think about it is that the hosting service is entirely unseen by the user. And the more they want to be seen by the user, the more they should be held responsible for the content.
I see. So the ones held responsible by governing authorities would be the Sanityville administration, not the individual poster. The individual is held responsible by the administration, by deleting, blocking, locking, etc. There's a hierarchical chain that in theory would extend to some government body, not necessarily federal.
I think there is already a chain back to the government. If you break copyright law here, we admins will be notified by the offended party and told to do something. If we refuse to do anything, and/or successfully hide our identities, then the offended party can go to the government. The government will go to our hosting company. If the hosting company refuses to do anything (and is in another jurisdiction), then the government might take the domain name down. And so on.
@ldweeks, I think somebody needs to do a deeper explanation of what Section 230 is, where it came from, why FB et al. got immunity, and then why you think they shouldn't.
I'm interested in a deeper dive on Section 230 too. Moving fences and all.
The large platforms (FB, Google, Twitter) use the "global public square" as their advertising engine. The architecture of these companies is built around the way they can operate now.
In my opinion, Lucas has some wonderful views on what would have happened if the immunity from Section 230 had never been…
I would tread lightly here. I would hew more toward the "platform" view that would prohibit providers from censoring speech.
I understand the frustration with what we now mash together and call "Big Tech," and I do think that they are exploiting ambiguities to mainline a left-wing agenda, but making a provider like Facebook a "publisher" does not fit the reality of what it is. The truth is that the information media of 2020 cannot be placed neatly into categories defined for the 1950s.
There's also already a significant human cost when it comes to evaluating and approving content for these platforms. The "censorship farms" in places like the Philippines, where people's job is to watch the things we don't see on Facebook, Twitter, etc., to make sure they don't get circulated (child pornography, extreme violence and murder, and so on), sound like previews of hell and are, to my mind, a strong argument against the existence of any of these platforms at all.
What sort of regulations are you imagining, and who would be enforcing them?
I just don't think having the US government regulate social media is any better than having the platforms do it themselves. An AOC- or Bernie Sanders-led executive branch (and Justice Department) wouldn't be any better than the company boards (as obnoxious as they are).
By this reasoning, what is the purpose of any just law?
I'm open to the argument that the bench needs cleansing before the law books do, but that doesn't seem to be the typical endpoint of this type of argumentation.
It sounds like Lucas is simply arguing that some platforms be subject to the same rules as other media channels, that is, that they lose their exemption from existing regulations.
It's a very tough question, because it's very hard to figure out where and how to draw the distinctions. What exactly is a publisher versus a provider? It's blurry.
Ultimately, having read the Wikipedia page and skimmed the Washington Examiner post, I think it's a bad idea to get rid of Section 230 and the immunity it grants.
The result of removing it would be a hard-line distinction between hosting companies that are "just dumb pipes," such as WordPress.com, which does not edit or moderate the content you post on your WordPress site, and any site that attempted to host user posts with moderation. The moment you attempt to moderate, you would become liable. Liable for what?
If somebody posted a comment defaming somebody else, the website it was posted on could be sued for defamation, unless that website never moderated anything at all. Because of bad actors, I think every site would have to either give up moderation entirely, which would destroy a community like this, since it could be taken over by bad actors at any time, or hold everything for moderation, editing, and approval prior to publishing, which would also destroy communities, even ones this small.
On balance, I think that's bad.
Now, I do think there are some changes that should be made to it. For example, if you are getting paid to post content on your site, I think you should be responsible for that content, but it seems that current court interpretation of the law grants immunity even on that.