Immunity for Big Tech should be removed

Right now, companies like Facebook are not held responsible for the content that people post on their platforms. This is different from newspapers, which are responsible for the content they publish. Typically, it is those on the political left who say that Big Tech should do more to police the content on their platforms. From time to time, Democrats talk about introducing new laws to require it.

Those on the political right think Big Tech is doing too much moderation and is silencing conservative voices.

I think that the broad immunity granted to Big Tech platforms by Section 230 of the Communications Decency Act should be removed, and that Big Tech platforms should be held responsible for the content posted to them. The way the internet works right now is untenable. There is no such thing as a “global public square” - nor should we try to make one. It isn’t even desirable.

But I think that holding a tech platform, such as your very own Sanityville, responsible for the content produced on it has numerous benefits:

  1. Platforms will be required to vet their users to a much greater degree. This is a good thing. People should have to prove themselves before having their voice amplified by someone else’s platform.
  2. Right now, we have roomfuls of people watching and reading the most horrific things imaginable all day, every day, in order to censor it from YouTube, Facebook, and the like. This, also, is simply untenable. Again, I think holding the tech platforms themselves responsible for the content will make them much less willing to allow people to post. This is a good thing.
  3. If you provide and “own” a digital space, you should be held responsible for the condition of that space.
  4. If Facebook is held responsible for what you post, they will make that fact abundantly clear to you at every turn. It will become clear that you are there as a guest. I think this is good, because, right now, they make you feel like you own the place. Like you are in charge of your own little section of Facebook. But you don’t, and you aren’t. Let’s make that obvious.
  5. I think this would motivate platforms to make their own positions more clear, and to be more open about why they censor what they do. Again, this is all to the good.
  6. I suspect that this would result in more, smaller platforms out there. I like that idea.

It is said that Section 230 of the Communications Decency Act “created the internet.” Without it, the argument goes, Big Tech platforms with lots to lose would have been overly cautious about what they allowed to be posted, and many wouldn’t have bothered to let the teeming masses post at all.

Well, I’ve seen the internet as we currently have it, and I just don’t think it works.

Disagreements would still be allowed to occur on any given platform. The New York Times and the Wall Street Journal should both post opposing viewpoints, and there should be a healthy back-and-forth.

And, of course, you should be able to have your own blog and post whatever you want. That’s fine. But you won’t be amplified by someone else’s platform. The hosting service you use, and your domain name registrar, should not be held responsible for any of your content.

What do you think?


I agree with this. Fewer, smaller platforms are, I think, better. Certainly easier to “police.”

I don’t get this. Isn’t Twitter or Facebook basically a hosting service? This seems inconsistent. Twitter is sometimes called a “microblog.” It seems like WordPress would be held to the same standard. Is it a size thing, or the way a blog seems more personal and intentional? Is it that blogs are harder to access - they don’t just come up as you scroll through Twitter?


It would be closer to being a host if it wasn’t for the feed. But because of the feed, they are determining what to show to you.


No. I’m trying to create a distinction between “platforms,” like Twitter, and a “hosting service,” like GoDaddy. We use a hosting company for this forum. They should not be held responsible for the content here - we should be.


Joseph, I think that your point about the feed is crucial to this discussion. A good test is this: who controls what you see in the “feed” on the platform - you, or the platform itself? The more they control it, the more they should be held responsible for the content.

Another way to think about it is that the hosting service is entirely unseen by the user. And the more they want to be seen by the user, the more they should be held responsible for the content.


I see. So the ones held responsible by governing authorities would be the Sanityville administration, not the individual poster. The individual is held responsible by the administration - by deleting, blocking, locking, etc. There’s a hierarchical chain that in theory would extend to some government body, not necessarily federal.

I think there is already a chain back to the gov. If you break copyright law here, we admins will get notified by the offended party and told to do something. If we refuse to do anything, and/or successfully hide our identities, etc, then they can go to the gov. The gov will go to our hosting company. If they refuse to do something (and are in another jurisdiction) then the gov might take the domain name down. Etc.

@ldweeks, I think somebody needs to do a deeper explanation of what section 230 is, where it came from, why FB et al got immunity, and then why you think they shouldn’t.


I’m interested in a deeper dive of section 230 too. Moving fences and all.

The large platforms (FB, Google, Twitter) use the “global public square” as their advertising engine. The architecture of these companies is built around the way they can operate now.

Lucas has some wonderful views, in my opinion, on what would have happened if the immunity from Section 230 had never been granted.

The fact that the law provides this for copyright but not for smut tells you all you need to know about the US government’s priorities.


I would tread lightly here. I would hew more toward the ‘platform’ view that would prohibit providers from censoring speech.

I understand the frustration with what we now mash together and call “big tech,” and I do think that they are exploiting ambiguities to mainline a left-wing agenda - but treating a provider like Facebook as a “publisher” does not fit the reality of what it is. The truth is that the information media of 2020 cannot be placed neatly into categories defined for the 1950s.


There’s also already a significant human cost when it comes to evaluating and approving content for these platforms. The “censorship farms” in places like the Philippines where it is the job of people to watch the things we don’t see on Facebook, Twitter, etc. to make sure they don’t get circulated - child pornography, extreme violence and murder, etc. sound like previews to hell and are a strong argument to my mind against the existence of any of these platforms at all.


What sort of regulations are you imagining and who would be enforcing them?

I just don’t think having the US Government regulate social media is any better than having the platforms do it themselves. An AOC- or Bernie Sanders-led executive branch (and justice department) wouldn’t be any better than the company boards (as obnoxious as they are).

By this reasoning, what is the purpose of any just law?

I’m open to the argument that the bench needs cleansing before the law books, but that doesn’t seem to be the typical endpoint of this type of argumentation.

Mostly to punish evil or to reward good. 1 Pet 2:13-14 + Romans 13

Can anything but morality be legislated?

Somehow the quote I was responding to got lost in my context. It was supposed to be this:

> Can anything but morality be legislated?

No, but that doesn’t mean that it’s wise for all immorality to be codified and regulated.

Again, what sort of regulations are you imagining, and who would be enforcing them?

I think the notion of having these platforms vet all of their users is wishful thinking. And we should be careful what we wish for.

It sounds like Lucas is simply arguing that some platforms be subject to the same rules as other media channels - that they lose their exemption from existing regulations.

It’s a very tough question, because it’s very hard to figure out where and how to draw the distinctions. What exactly is a publisher vs a provider? It’s blurry.

Ultimately, having read the Wikipedia page, as well as skimmed the Washington Examiner post, I think it’s a bad idea to get rid of section 230 and the immunity it grants.

The result of removing it would be a hard-line distinction between hosting companies that are “just dumb pipes” - which do not edit or moderate the content you post on your WordPress site - and any site that attempted to allow any sort of user posting with moderation. The moment you attempt to moderate, you would become liable. Liable for what?

If somebody posted a comment slandering somebody else, the website they posted it on could be sued for slander - unless that website never moderated anything at all. Because of bad actors, I think every site would face a choice: either give up moderation entirely, which would destroy a community like this one, since it could be taken over by bad actors at any time, or hold everything for moderation, editing, and approval prior to publishing, which would also destroy communities, even ones this small.

On balance, I think that’s bad.

Now, I do think there are some changes that should be made to it. For example, if you are getting paid to post content on your site, I think you should be responsible for that content - but current court interpretation of the law seems to grant immunity even for that.


Trump appears to be taking Lucas’s position.