Elon Musk now owns 9.2% of Twitter... update: will soon be the sole owner as the Board of Directors accepts his purchase offer


Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
Disagree. Imagine what the moderation would have to be like here if Future could be sued by everyone who thought they were wronged by a post.
Future does not actively promote any posts, so your example completely misses the point. Besides, offending someone isn't a crime.

However, it's bullshit that Twitter or YouTube could take a post that told a false story with the takeaway being "kill all the Jews," push it to millions of people, including people they know are predisposed to anti-Semitic material, and then hide behind "we didn't create the post, no liability, hahaha."
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
The answer is that there would be no moderation, as they would shut the site down.
It's not black and white; there is something between no user-created content and a complete free-for-all of radicalization, promotion of violence, and hate.
 

MrSquished

Lifer
Jan 14, 2013
23,155
21,281
136
Who here called Elon the piece of shit he is before it became fashionable?

I think it's something squishy.
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
What is that in between?
Umm, if you promote material that leads to harm, harm that a reasonable person could foresee, you can be held liable. That's pretty much the way it works everywhere except on the internet. See the current lawsuits against Fox News.

However, under Section 230, Facebook could've pushed all the same BS to far more people, even more systematically, with absolutely zero liability.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,340
5,464
136
It's not black and white; there is something between no user-created content and a complete free-for-all of radicalization, promotion of violence, and hate.

That something in between is Section 230.

It allows sites to operate without fear that being held responsible for something their users do could bankrupt them.
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
That something in between is Section 230.

It allows sites to operate without fear that being held responsible for something their users do could bankrupt them.
No, Section 230 allows sites to push whatever harmful content they want with zero fear of liability. That is not an in-between; it allows internet companies to cause extreme harm with absolutely no accountability.

This is basically the same as saying "regulating guns is hard, therefore we can't do anything," the same move people make on any number of other issues they act like are too complicated to deal with.
 

MrSquished

Lifer
Jan 14, 2013
23,155
21,281
136
No, Section 230 allows sites to push whatever harmful content they want with zero fear of liability. That is not an in-between; it allows internet companies to cause extreme harm with absolutely no accountability.

This is basically the same as saying "regulating guns is hard, therefore we can't do anything," the same move people make on any number of other issues they act like are too complicated to deal with.

Without Section 230, conservatives could easily use the law to have the government shut down any content posted by users about abortion, about being gay, about anything. It would be insane. It would open the door to the worst right-wing censorship we have ever seen. Of course, they would let all their own hateful speech go unchecked.
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
Without Section 230, conservatives could easily use the law to have the government shut down any content posted by users about abortion, about being gay, about anything. It would be insane. It would open the door to the worst right-wing censorship we have ever seen. Of course, they would let all their own hateful speech go unchecked.
Section 230 doesn't apply to anything except internet companies. I see all of those subjects covered by traditional media all the time. Putting reasonable limits on Section 230 has literally nothing to do with government censorship.

Right now: Fox News hosts lied about Dominion on air, caused significant damage to Dominion's business, and will likely be held liable and have to pay damages. If instead Fox had just taken posts written by others saying the exact same things and blasted them all over its website, it would have had no liability. What is the logic for that?

A girl convinced her boyfriend to commit suicide, and she was charged and convicted. Various internet companies push pro-suicide messages to at-risk teens all the time; in fact, they seek those teens out to push the content to, and they have zero liability.

Why should internet companies have no liability when they purposely, through their own actions, promote material they know will cause harm? I'm not talking about making companies liable for content posted by others; I am talking about holding them liable for their business choice to push that content regardless of the potential harm. It's like a random guy being handed loaded guns, turning around and giving them to kids, and saying "I have no liability, I didn't load the gun."
 
  • Like
Reactions: VirtualLarry

MrSquished

Lifer
Jan 14, 2013
23,155
21,281
136
Section 230 doesn't apply to anything except internet companies. I see all of those subjects covered by traditional media all the time. Putting reasonable limits on Section 230 has literally nothing to do with government censorship.

Right now: Fox News hosts lied about Dominion on air, caused significant damage to Dominion's business, and will likely be held liable and have to pay damages. If instead Fox had just taken posts written by others saying the exact same things and blasted them all over its website, it would have had no liability. What is the logic for that?

A girl convinced her boyfriend to commit suicide, and she was charged and convicted. Various internet companies push pro-suicide messages to at-risk teens all the time; in fact, they seek those teens out to push the content to, and they have zero liability.

Why should internet companies have no liability when they purposely, through their own actions, promote material they know will cause harm? I'm not talking about making companies liable for content posted by others; I am talking about holding them liable for their business choice to push that content regardless of the potential harm. It's like a random guy being handed loaded guns, turning around and giving them to kids, and saying "I have no liability, I didn't load the gun."

Why the dismissal of internet companies? That is where so much discourse is held today, and where people can exchange thoughts. It's massive. What you are proposing opens the door for conservatives to censor all non-right-wing content published by any user on an internet company's website. That's dangerous.
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
Why the dismissal of internet companies? That is where so much discourse is held today, and where people can exchange thoughts. It's massive. What you are proposing opens the door for conservatives to censor all non-right-wing content published by any user on an internet company's website. That's dangerous.
Section 230 only applies to internet companies; that is why I am talking about its effect on them. I am not dismissing them. I am against waiving an entire industry's liability just to help its profits, though.

Can you explain how Section 230 currently prevents government censorship? Or how allowing internet companies to be held liable for their business decisions and actions, just like every other type of company in the country, would lead to censorship, when that has not happened to any of those other companies?

Right now Section 230 is the equivalent of saying "No shooter or user of a firearm shall be treated as the shooter of any gun provided by another gun owner." When internet companies decide what content to publish, where, and how to very specifically target it, they are no longer a neutral party; they are acting as a publisher or editor. Just as a person handed a loaded gun makes a personal choice to squeeze the trigger.

But I know none of the three of you would support unlimited immunity for corporate harm in any other industry; I'm not sure why social media is such a special case.
 

Paratus

Lifer
Jun 4, 2004
17,131
14,506
146
Section 230 only applies to internet companies; that is why I am talking about its effect on them. I am not dismissing them. I am against waiving an entire industry's liability just to help its profits, though.

Can you explain how Section 230 currently prevents government censorship? Or how allowing internet companies to be held liable for their business decisions and actions, just like every other type of company in the country, would lead to censorship, when that has not happened to any of those other companies?

Right now Section 230 is the equivalent of saying "No shooter or user of a firearm shall be treated as the shooter of any gun provided by another gun owner." When internet companies decide what content to publish, where, and how to very specifically target it, they are no longer a neutral party; they are acting as a publisher or editor. Just as a person handed a loaded gun makes a personal choice to squeeze the trigger.

But I know none of the three of you would support unlimited immunity for corporate harm in any other industry; I'm not sure why social media is such a special case.
I suggest reading up on the formative cases from the '90s that caused Section 230 to be created in the first place.


Without Section 230, @Perknose moderating these forums would open Future up to being sued as a publisher.

Without Perk moderating, these forums would be overwhelmed with spam and porn and other nastiness.

I understand your concern, but there's not a lot of maneuvering room to make changes without sliding down one side or the other.
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
I suggest reading up on the formative cases from the '90s that caused Section 230 to be created in the first place.


Without Section 230, @Perknose moderating these forums would open Future up to being sued as a publisher.

Without Perk moderating, these forums would be overwhelmed with spam and porn and other nastiness.

I understand your concern, but there's not a lot of maneuvering room to make changes without sliding down one side or the other.
I know about the case and I understand why the law was written. It was written at a time when there were fewer monthly internet users in the US than Facebook has daily users today. It was a simple, one-sentence law that has had major ramifications, for good and for bad. There is absolutely no reason it cannot be modified and adjusted to be a bit more sophisticated than it currently is. With every other technology, we modify the laws regulating it as we go. Commercial aviation isn't held to the same laws that were passed in 1920; in fact, new regulations on commercial aviation come out every day. Saying that Section 230 should never be adjusted for modern tech and use cases is right up there with saying the Second Amendment should guarantee all arms: it doesn't matter that there were no machine guns or nuclear bombs when it was written, anyone can have any arm anytime they want.

The options are not just "no moderation and no liability" or "some moderation and infinite liability." There are adjustments that can be made, especially when we start talking about companies selecting third-party content and promoting it. No other industry has anywhere near this liability shield. I have never said to eliminate Section 230; I have said it needs to be adjusted so that when companies make their own business decision to act as a promoter and publisher, they are treated as such. Again, if Facebook had pushed the same Dominion conspiracy as Fox News using nothing but third-party posts, it would have had zero liability, no matter how many damning e-mails were uncovered. If you give me a loaded gun and I shoot up a mall with it, should I have no liability because the gun wasn't mine?

Again, my issue is allowing companies unlimited use of third-party content with no liability. That is not content moderation; it is content promotion and publishing. It is completely different from the Prodigy case and completely different from how AnandTech moderation works.
 

A///

Diamond Member
Feb 24, 2017
4,351
3,158
136
Who here called Elon the piece of shit he is before it became fashionable?

I think it's something squishy.
Being from across the pond originally, and having gone to school with the kids of families who immigrated to the UK long before anyone considered ending apartheid, all I can say is that I never liked white South Africans and still don't. They're all racist cunts deep down. They don't change, and they keep the racism inside their families.

On the positive side, they got their shit kicked in daily until the end of term. IDK what happened to them, but they possibly moved. Ironically, I was constantly annoyed, toyed with, and mildly bullied for my English accent when my family moved to America. I quickly learned to unlearn that accent and take on an American one. It still comes out sometimes for a peek-a-boo.

My parents wouldn't be as kind as I am being here to white South Africans. Their generation, regardless of social standing, doesn't like them. Their generation had its own racist quips and concepts, but apartheid was terrible beyond anything anyone could imagine in those days. It makes cross burning look like a fucking joke.
 

Heartbreaker

Diamond Member
Apr 3, 2006
4,340
5,464
136
Right now Section 230 is the equivalent of saying "No shooter or user of a firearm shall be treated as the shooter of any gun provided by another gun owner." When internet companies decide what content to publish, where, and how to very specifically target it, they are no longer a neutral party; they are acting as a publisher or editor. Just as a person handed a loaded gun makes a personal choice to squeeze the trigger.

But I know none of the three of you would support unlimited immunity for corporate harm in any other industry; I'm not sure why social media is such a special case.

No, right now it's the equivalent of not holding gun manufacturers responsible for the murders their customers commit.

If there is anything actionable on a website, people can still go after the actual person who created the actionable content, not the platform.

I'm not sure why you don't get the concept of not holding a platform liable for what its users do.
 

fskimospy

Elite Member
Mar 10, 2006
85,710
51,000
136
I know about the case and I understand why the law was written. It was written at a time when there were fewer monthly internet users in the US than Facebook has daily users today. It was a simple, one-sentence law that has had major ramifications, for good and for bad. There is absolutely no reason it cannot be modified and adjusted to be a bit more sophisticated than it currently is. With every other technology, we modify the laws regulating it as we go. Commercial aviation isn't held to the same laws that were passed in 1920; in fact, new regulations on commercial aviation come out every day. Saying that Section 230 should never be adjusted for modern tech and use cases is right up there with saying the Second Amendment should guarantee all arms: it doesn't matter that there were no machine guns or nuclear bombs when it was written, anyone can have any arm anytime they want.

The options are not just "no moderation and no liability" or "some moderation and infinite liability." There are adjustments that can be made, especially when we start talking about companies selecting third-party content and promoting it. No other industry has anywhere near this liability shield. I have never said to eliminate Section 230; I have said it needs to be adjusted so that when companies make their own business decision to act as a promoter and publisher, they are treated as such. Again, if Facebook had pushed the same Dominion conspiracy as Fox News using nothing but third-party posts, it would have had zero liability, no matter how many damning e-mails were uncovered. If you give me a loaded gun and I shoot up a mall with it, should I have no liability because the gun wasn't mine?

Again, my issue is allowing companies unlimited use of third-party content with no liability. That is not content moderation; it is content promotion and publishing. It is completely different from the Prodigy case and completely different from how AnandTech moderation works.
Do I understand you correctly, then, that if a company uses any algorithm to curate or promote certain content to users, it would then be liable for that content? That would turn YouTube into an FTP server.
 

Perknose

Forum Director & Omnipotent Overlord
Forum Director
Oct 9, 1999
46,574
9,956
146
Do I understand you correctly, then, that if a company uses any algorithm to curate or promote certain content to users, it would then be liable for that content? That would turn YouTube into an FTP server.
^^^ Reductio ad absurdum. Wouldn't you agree that the algorithms Facebook et al. seem to use have led naive users down dangerous rabbit holes? Shouldn't something be done?
 

fskimospy

Elite Member
Mar 10, 2006
85,710
51,000
136
^^^ Reductio ad absurdum. Wouldn't you agree that the algorithms Facebook et al. seem to use have led naive users down dangerous rabbit holes? Shouldn't something be done?
I don’t think that’s a reduction to absurdity.

There are only so many hours in a day, so Fox News has humans making the decisions for everything that goes on there. On the internet, the volume of information is so vast that curation can only be done algorithmically. Algorithms by definition promote some content over other content, and since no algorithm is perfect, any attempt to curate content would open companies up to massive liability. They would just remove all curation and become dumb pipes, shut down, or move overseas.

I think people draw too thin a line between "this is bad" and "the government should punish people for this." I don't see any prospect for weakening Section 230 that wouldn't lead to bad consequences. In fact, I think it would be a lot like Brexit, where everyone wakes up and is suddenly very sad they got their wish.
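
To make "algorithms by definition promote some content" concrete, here is a minimal sketch of engagement-based ranking, with invented posts and arbitrary weights (not any real platform's system). Whatever scoring function you pick, some content gets surfaced and some gets buried; there is no "neutral" ordering, which is why "just don't curate" effectively means becoming a dumb pipe:

```python
# Toy engagement-ranking sketch. The Post fields, feed data, and weights
# are all hypothetical, made up for illustration only.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    clicks: int
    shares: int
    watch_minutes: float

def engagement_score(post: Post) -> float:
    # Arbitrary weights; changing any of them changes which posts rise
    # to the top. Choosing the weights is itself an editorial decision.
    return 1.0 * post.clicks + 5.0 * post.shares + 0.5 * post.watch_minutes

feed = [
    Post("calm news recap", clicks=900, shares=10, watch_minutes=400.0),
    Post("outrage bait", clicks=700, shares=300, watch_minutes=900.0),
    Post("cat video", clicks=1200, shares=50, watch_minutes=300.0),
]

# The "curation" step: rank by score and surface the top items first.
# With these weights, the heavily shared outrage post wins the feed.
for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):8.1f}  {post.title}")
```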
 
  • Like
Reactions: Ajay and Fenixgoon

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
No, right now it's the equivalent of not holding gun manufacturers responsible for the murders their customers commit.

If there is anything actionable on a website, people can still go after the actual person who created the actionable content, not the platform.

I'm not sure why you don't get the concept of not holding a platform liable for what its users do.
I'm all for not holding the platform liable when it's actually neutral or providing simple moderation. Facebook, YouTube, etc. are not doing that; they are taking that content and actively promoting it. They've designed their websites to promote harmful material for profit.

Just like a company dumping chemicals into the river and then saying "we didn't make the chemicals, not our fault."

How do you not understand the difference between simple moderation and actively taking known harmful content and pushing it to known susceptible individuals? Like actively promoting suicide posts to depressed teens?
 

Zorba

Lifer
Oct 22, 1999
15,282
10,879
136
Do I understand you correctly, then, that if a company uses any algorithm to curate or promote certain content to users, it would then be liable for that content? That would turn YouTube into an FTP server.
I believe there is something between FTP and serving up ISIS recruitment videos with no liability, especially when the content isn't related to anything that person follows. Why should Fox News be liable when it produces the content, but not when it purposefully selects, curates, and amplifies someone else's content to achieve the same objective?

But these companies know what they are doing: they know they are promoting violence to people who like violence, promoting suicide to depressed teens, and promoting guns to angry teens. They don't care, because it gets them more eyeballs and they have zero liability for their corporate actions.
 
Feb 4, 2009
35,284
16,766
136
Future does not actively promote any posts, so your example completely misses the point. Besides, offending someone isn't a crime.

However, it's bullshit that Twitter or YouTube could take a post that told a false story with the takeaway being "kill all the Jews," push it to millions of people, including people they know are predisposed to anti-Semitic material, and then hide behind "we didn't create the post, no liability, hahaha."
In principle I agree. Too many sites have no skin in the game and allow outrageous content. I'm not sure what the proper solution is, and I agree it needs to be discussed.
 
  • Like
Reactions: Zorba

fskimospy

Elite Member
Mar 10, 2006
85,710
51,000
136
I believe there is something between FTP and serving up ISIS recruitment videos with no liability, especially when the content isn't related to anything that person follows. Why should Fox News be liable when it produces the content, but not when it purposefully selects, curates, and amplifies someone else's content to achieve the same objective?

But these companies know what they are doing: they know they are promoting violence to people who like violence, promoting suicide to depressed teens, and promoting guns to angry teens. They don't care, because it gets them more eyeballs and they have zero liability for their corporate actions.
Because Fox is speaking there, not someone else.

I hear your point; I just don't know how you would implement it without disastrous effects.
 

fskimospy

Elite Member
Mar 10, 2006
85,710
51,000
136
I'm all for not holding the platform liable when it's actually neutral or providing simple moderation. Facebook, YouTube, etc. are not doing that; they are taking that content and actively promoting it. They've designed their websites to promote harmful material for profit.

Just like a company dumping chemicals into the river and then saying "we didn't make the chemicals, not our fault."

How do you not understand the difference between simple moderation and actively taking known harmful content and pushing it to known susceptible individuals? Like actively promoting suicide posts to depressed teens?
How would you define neutrality in a way that wouldn’t be abused?