Now Reddit is being unfair to the President

Page 3 - AnandTech Forums

Amused

Elite Member
Apr 14, 2001
That's not the case at all. Again - Do YOU as a website that allows for any public content want to be LIABLE for your content... or not liable? It's very simple.

Anyone with half a brain should say "Not liable" - and thus 4chan, Reddit, Facebook, etc... would be fine to continue - just without selectively policing content that random employees disagree with.

They answer to their advertisers.

This is the very important part you're missing. This is a social pressure business decision, not a moral nor 1st amendment issue.

It is also a private property issue.

Your argument is moot, which is why it is not the basis of any successful legal case. Liability is not the issue. Full stop.

So stop making it, ok?
 

Paratus

Lifer
Jun 4, 2004
That's not the case at all. Again - Do YOU as a website that allows for any public content want to be LIABLE for your content... or not liable? It's very simple.

Anyone with half a brain should say "Not liable" - and thus 4chan, Reddit, Facebook, etc... would be fine to continue - just without selectively policing content that random employees disagree with.

I don’t think you understand what the impacts actually are. It’s not that I’m worried about 4chan disappearing. It’s that removing Section 230 protections means all forums and social media become 4chan or get shut down.

Check out this article on the origins of Section 230.
https://arstechnica.com/tech-policy...net-law-politicians-love-to-hate-explained/3/
To understand Section 230, you have to understand how the law worked before Congress enacted it in 1996. At the time, the market for consumer online services was dominated by three companies: Prodigy, CompuServe, and AOL. Along with access to the Internet, the companies also offered proprietary services such as realtime chats and online message boards.

Prodigy distinguished itself from rivals by advertising a moderated, family-friendly experience. Employees would monitor its message boards and delete posts that didn't meet the company's standards. And this difference proved to have an immense—and rather perverse—legal consequence.

In 1994, an anonymous user made a series of potentially defamatory statements about a securities firm called Stratton Oakmont, claiming on a Prodigy message board that a pending stock offering was fraudulent and its president was a "major criminal." The company sued Prodigy for defamation in New York state court.

Prodigy argued that it shouldn't be liable for user content. To support that view, the company pointed back to a 1991 ruling that shielded CompuServe from liability for a potentially defamatory article. The judge in that case analogized CompuServe to a bookstore. The courts had long held that a bookstore isn't liable for the contents of a book it sells—under defamation, obscenity, or other laws—if it isn't aware of the book's contents.

But in his 1995 ruling in the Prodigy case, Judge Stuart Ain refused to apply that rule to Prodigy.

"Prodigy held itself out as an online service that exercised editorial control over the content of messages posted on its computer bulletin boards, thereby expressly differentiating itself from its competition and expressly likening itself to a newspaper," Ain wrote. Unlike bookstores, newspapers exercise editorial control and can be sued any time they print defamatory content.

The CompuServe and Prodigy decisions each made some sense in isolation. But taken together, they had a perverse result: the more effort a service made to remove objectionable content, the more likely it was to be liable for content that slipped through the cracks. If these precedents had remained the law of the land, website owners would have had a powerful incentive not to moderate their services at all. If they tried to filter out defamation, hate speech, pornography, or other objectionable content, they would have increased their legal exposure for illegal content they didn't take down.

You are suggesting we go back to the bad old days. Take AT for example. They want to use the technical forums as a draw for eyeballs to the site. Under your paradigm, if AT exercised any control over what’s posted in the forums, they would be liable for any defamatory or illegal postings.

So AnandTech’s choice would be to allow anyone to post whatever they want. That would absolutely include 4chan-style posts, with hardware fanboys fake-linking goatse, porn, Pepe, etc. during flamewars. That behavior would drive away viewers and business. The other option, which most businesses would take, is shutting down forums and article comments.
 

ElFenix

Elite Member
Super Moderator
Mar 20, 2000
It's actually pretty simple -

Are these websites acting as a utility/platform - where they are free and clear of being liable for the content posted. Similar to telephone companies, they aren't liable if someone uses their services to post a video of a killing, etc. Likewise, they aren't able to police their services.

The opposite of that is a publisher. They are free to police their content as they see fit - but they are also liable for the information that they so choose to publish.


So which is it? You can't have it both ways.


You know that we had an act of Congress about 24 years ago that actually changed all of those things with regard to internet sites, right?
 

zinfamous

No Lifer
Jul 12, 2006
"They posted things that I don't agree with! Therefore it's evil!"

All this selective censorship is going to come and bite social media's ass one of these days.

or it was really just a breeding ground of fucking Nazis.

that Hitler guy: "You just disagree with what he said! Waaaaaaaaa!"

Unread, uneducated simpletons like you continually refuse to learn the actual lessons from history. Trumpland is in the late-1920s/early-'30s era of fascism rising, point by observable point. No fucking question. But you refuse to see. You refuse to listen. You, like other dumb, low-IQ humans, think that history is erased if you aren't living in it. All that matters in war is that the good guys win! Not how we got there. Not whether there are any rules about who should or has to win. You don't fucking care, because you haven't shed a single ounce of flesh for the fucking freedoms you enjoy, none of which you understand at all.
 

Jhhnn

IN MEMORIAM
Nov 11, 1999
That's not the case at all. Again - Do YOU as a website that allows for any public content want to be LIABLE for your content... or not liable? It's very simple.

Anyone with half a brain should say "Not liable" - and thus 4chan, Reddit, Facebook, etc... would be fine to continue - just without selectively policing content that random employees disagree with.

I'm pretty sure the decision was made at the highest levels of management. It's not an issue of liability anyway, but one of common decency, mutual respect, and platform-provider image. It's their forum, and they can do as they like with it. There are plenty of other places for the wackos to spread their poison.
 

McLovin

Golden Member
Jul 8, 2007
Strawman debates/arguments are fun.

Just because you have the freedom to say something doesn't mean there are no consequences for what you say.

Your Yankees analogy? Holy f***ing balls, Batman. Comparing that to the strife of minorities and oppressed people is quite the stretch.


Also, just so I am clear here, f*** the Yankees.
 

fskimospy

Elite Member
Mar 10, 2006
That's not the case at all. Again - Do YOU as a website that allows for any public content want to be LIABLE for your content... or not liable? It's very simple.

Anyone with half a brain should say "Not liable" - and thus 4chan, Reddit, Facebook, etc... would be fine to continue - just without selectively policing content that random employees disagree with.
Section 230 absolutely permits websites to remove content they find objectionable, just as it should be.

Banning the_donald is only notable in how overdue it was.
 