Hacker News

If you say "we are going to restrict ourselves [at the Government's insistence] to only moderating non-political speech" then someone will insist that some part of your non-political speech is actually political. And from some perspective they will be right.

Then you get to negotiate with the government and the courts about what they consider to be political speech or not. And suddenly you no longer live in a country that has a meaningful First Amendment.



Contrary to what people often claim, the government making any decision at all about what is and is not speech, or political speech, does not cause immediate free speech violations.

Yes, the government decides all the time, within a long-standing and well-studied body of legal history, what is or is not speech.

In fact, I would go so far as to say that the primary purpose of the court system, with regards to free speech cases, is deciding what does or does not count as speech.


> And suddenly you no longer live in a country that has a meaningful First Amendment.

The overarching issue is that I don't see what the first amendment has to do with this at all. Corporations have zero obligations to anyone under the first amendment, which only applies to the government.


The intent of the first amendment was to preserve free speech in the public square; "what the first amendment has to do with this at all" is a desire to preserve that underlying societal principle.

Section 230 was a give-away to corporations that allowed them to privatize and co-opt the public square; they reap the economic benefit of being shielded from liability for what people post, but none of the responsibility to carry everyone's speech.

If we want to preserve free speech in the public square, some form of balance has to be restored, either by:

1. Eliminating the section 230 liability shield that allowed them to privatize public discourse in the first place.

2. Requiring privatized "public squares" to operate more like common carriers.

Solution (1) — eliminating section 230 — would likely make it infeasible to run a private website carrying public speech.

Texas seems to have attempted a nuanced version of solution (2): in exchange for being granted the privilege of being shielded from liability for the speech they carry, platforms of a size large enough to justify treating them as a "public square" must meet common-carrier-like requirements that prevent them from discriminating based on viewpoint.


> platforms of a size large enough to justify treating them as a "public square" must meet common-carrier-like requirements that prevent them from discriminating based on viewpoint.

This sounds so reasonable and nuanced until you reach this final phrase: "prevent them from discriminating based on viewpoint." Doing that requires the state to determine what speech is "un-biased, politically neutral speech" and which speech has a political "viewpoint." And under this law a government will determine this at the point of a gun, not the people operating individual websites.

(Because obviously if you implemented this in a truly content-neutral way along the lines that you suggest for a common carrier, every platform would be overrun by spam. So there has to be a decision about what constitutes biased speech, and that means political figures will be making determinations.)

This isn't theoretical. Nobody in this country even agrees on what "political/biased" speech is. Things that used to be broadly accepted by our whole society no longer are. I'm not going to go into examples, but you know exactly what they are. And you probably disagree with half of them. This law means that you won't get to decide, the state and the courts will.

If your problem is the over-concentration of social networking companies, I'm happy to agree. I urge you to support laws that reduce the power and userbase of these firms in a content-neutral way -- without placing the government in charge of what speech is "ok".


> Wyden, now a Senator, stated that he intended for Section 230 to be both "a sword and a shield" for Internet companies, the "sword" allowing them to remove content they deem inappropriate for their service, and the shield to help keep offensive content from their sites without liability. However, Wyden warned that because tech companies have not been willing to use the sword to remove content, they could be at risk of losing the shield.

https://en.wikipedia.org/wiki/Section_230#Platform_neutralit...

Given that hate speech (the context of that quote) is very often political and/or protected free speech, I don't think one of the two authors of the bill agrees with how you're interpreting it.


I’m claiming that section 230 is a fundamentally flawed construction that privatized the modern public square, while socializing the costs inherent in that privatization.

I’m not debating whether section 230 is functioning in accordance with its authors’ intentions; frankly, what they intended doesn’t matter.

Section 230 was a totally inappropriate give-away of unique privileges without requiring commensurate public value be returned in kind, contrary to our long history of requiring that common carriers serve the public without discrimination in exchange for the unique privilege of being shielded from liability.


Ah.

I see where you're coming from, but I frankly believe that the internet as we know it is fundamentally unworkable without the ability for some platforms to restrict the voices of some users. It's also fundamentally unworkable if we make the platforms liable for their users' words.

I'm up for solutions like breaking up the larger platforms so that users have more choice, but I consider 230 to be integral in mass user interaction.

To put it another way: I'm uninterested in an internet where websites are legally forced to either be sterile and userless or act like 4chan.


4chan is largely 4chan because:

- its reputation attracts trolls

(BTW, does 4chan even have the minimum 50M monthly users to qualify?)

- it has that anonymous (rather than pseudonymous) social convention

I see a lot of "the sky is falling" reactions here, but even if large websites were prevented from banning content, nothing forbids them from allowing users themselves to hide the (pseudonym-linked) content that they deem undesirable, a feature that is already widespread.

(Also, large platforms are not the Internet, and not even the Web, and I would be happy to see them gone.)


The intent of the first amendment was specifically to keep the government from prohibiting your speech.

Social media sites are not the government -- they have their TOS like everybody else does. Love it or leave it.

It should also be noted that conservatives absolutely believe in banning speech they don't like. So it's not the principle of it, it's the power.


> which only applies to the government

Which only applies to the Federal government. States have had rules about speech since forever.

Texas wants platforms with > 50M users to treat political speech equally. That's well within a State's right to secure that protection for its citizens.

Other states can choose different rules, or to leave their citizens with fewer rights; it's up to them.



