Right now, companies like Facebook are not held responsible for the content people post on their platforms. This is different from newspapers, which are responsible for the content they publish. Typically, it is those on the political left who say Big Tech should do more to police the content on its platforms. From time to time, Democrats talk about introducing new laws to require it.
Those on the political right think Big Tech is already doing too much moderation, and that it is silencing conservative voices.
I think the broad immunity granted to Big Tech platforms by Section 230 of the Communications Decency Act should be removed, and that platforms should be held responsible for the content posted to them. The way the internet works right now is untenable. There is no such thing as a “global public square,” and we shouldn’t try to build one; it isn’t even desirable.
But I think that holding a tech platform, such as your very own Sanityville, responsible for the content produced on it has numerous benefits:
- Platforms will be required to vet their users to a much greater degree. This is a good thing. People should have to prove themselves before having their voice amplified by someone else’s platform.
- Right now, we have rooms full of people watching and reading the most horrific things imaginable, all day every day, in order to censor them from YouTube, Facebook, and the like. This, too, is untenable. Holding the platforms themselves responsible for the content will make them much less willing to let just anyone post. This is a good thing.
- If you provide and “own” a digital space, you should be held responsible for the condition of that space.
- If Facebook is held responsible for what you post, they will make that fact abundantly clear to you at every turn. It will become clear that you are there as a guest. I think this is good, because, right now, they make you feel like you own the place. Like you are in charge of your own little section of Facebook. But you don’t, and you aren’t. Let’s make that obvious.
- I think this would motivate platforms to make their own positions clearer, and to be more open about why they censor what they do. Again, this is all to the good.
- I suspect that this would result in more, smaller platforms out there. I like that idea.
It is said that Section 230 of the Communications Decency Act “created the internet.” Without it, the argument goes, Big Tech platforms with a lot to lose would have been overly cautious about what they allowed to be posted, and many wouldn’t have bothered to let the teeming masses post at all.
Well, I’ve seen the internet as we currently have it, and I just don’t think it works.
Disagreements would still be allowed to occur on any given platform. The New York Times and the Wall Street Journal should both publish opposing viewpoints, and there should be a healthy back-and-forth.
And, of course, you should be able to have your own blog and post whatever you want. That’s fine. But you won’t be amplified by someone else’s platform. The hosting service you use, and your domain name registrar, should not be held responsible for any of your content.
What do you think?