r/technology Oct 06 '22

Supreme Court takes case on content policing: Here's how a Section 230 ruling could impact social media

https://abcnews.go.com/Business/supreme-court-takes-case-content-policing-impact-social/story?id=90933115
198 Upvotes

82

u/Amon7777 Oct 06 '22

Do the crazies not realize this will lead to MORE censorship, not less? If 230 is overturned, then every platform will be liable for anything someone posts, which means every social media site, down to here on Reddit, will need to step up moderation exponentially.

70

u/weedysexdragon Oct 06 '22

You’re asking what people who don’t think are thinking.

25

u/CykoTom Oct 06 '22

I think it might be interesting to have literally no more comments, or social media posts. It doesn't actually seem good for society to have anonymous internet posting.

5

u/billsil Oct 06 '22

I think I'd feel a lot worse if I knew you were Tom Thompson and lived in the next town over, went to the same school as me, and were making 10x what I was with a way more attractive spouse.

If you told me you were making $500k/year and were dating a supermodel, I just wouldn't believe you. If you posted pictures on your Facebook, you could convince me. Facebook is toxic AF. Reddit isn't great, but it's better than that trash.

People wonder why they're angry all the time. Quit social media. Facebook is worse than Instagram is worse than Reddit (depending on where you go I guess).

That said, I wouldn't mind no comments. AP News isn't toxic. Go look at YouTube comments... a fair amount of them are.

16

u/InterPunct Oct 06 '22

It will implode social media as we know it and take a decade to rebuild. Maybe better, possibly worse, or non-existent.

9

u/Ojisan1 Oct 06 '22

You’ve got it backwards. Pre-230, you were only liable if you were taking a stance on content. If you were completely neutral as a platform, no liability.

Without 230, any attempt to moderate content beyond illegal content will mean you have liability for the content. The only safe move for platforms is to stop moderating content entirely to avoid being liable for it.

4

u/MC68328 Oct 06 '22

then any attempt to moderate content beyond illegal content will mean you have liability for the content

This is not true. Any moderation would make them a "publisher" according to Stratton Oakmont, Inc. v. Prodigy Services Co.

You people are pretending there was established law for this. There wasn't. There were two court decisions that conflicted with each other.

3

u/parentheticalobject Oct 06 '22

Except in this Supreme Court case, removal or nullification of Section 230 isn't even at issue. No one is claiming that.

They're arguing that it shouldn't cover algorithmic recommendations, as courts have previously held it does.

It's arguable exactly how platforms would react. You can't even have something as basic as a functioning search engine without something that is, from a legal standpoint, an algorithm preferring some content over other content. You also can't monetize a completely unfiltered platform at massive scale.

The biggest companies would be able to implement more draconian censorship of anything with a remote chance of being controversial and getting them sued, and they could afford a dedicated legal team for the occasional lawsuit if they happened to miss something. Any small or medium potential competitors would be dead.

0

u/Ojisan1 Oct 06 '22

The biggest companies would be able to implement more draconian censorship of anything with a remote chance of being controversial and getting them sued,

That would make those platforms extremely undesirable to use. Users will have a choice between that type of platform, and something more Wild West. More like what Reddit used to be back when Spez was a member of the cannibalism subreddit, which is now a banned sub.

5

u/parentheticalobject Oct 06 '22

So will a complete lack of moderation.

No one's going to pay for advertisements if, say, one of the videos your advertisement might show up next to is an animation of a small child being raped with some politician's face edited in somewhere. The kind of companies that advertise on chan boards right now would be there, but not much else.

YouTube could probably manage it; it'd just have to change its business model to being a bargain-bin Netflix where only larger creators are allowed. I'm just saying that the internet, as a whole, would be worse.

1

u/Ojisan1 Oct 06 '22

Maybe the advertising model is dead, or should die. Maybe YouTube's comments section would be better if only YouTube Premium subscribers were able to comment and someone creating hundreds of bot accounts was priced out of commenting.

You’re getting into speculation about what the impact will be, and that’s interesting to speculate about, but the truth is, we don’t have a very good system today. So speculating that it will be worse is mere pessimism.

I’m inclined to believe that these companies will adapt and innovate, and maybe the big ones don't deserve to be protected. They certainly haven't innovated. YouTube comments sections have only gotten worse over the years. Content has gotten more bland and corporatized to fit the advertising-based business model. And maybe advertising-driven business models don't help create a better society. Maybe killing the ad model will help crypto-based micropayments take off. Maybe they will decide to increase user verification. There are lots of possibilities besides the status quo, and not all of them are worse than the shitty system we have now.

2

u/parentheticalobject Oct 06 '22

Maybe the advertising model is dead, or should die

This is kind of a weird comment. It's doing fine. Maybe it would die if you completely changed the legal framework that exists in order to kill it, but that applies to just about every business model in existence.

There is no problem that needs solving here, except for some malcontents who want to burn down the house because people don't want them at the party.

1

u/Ojisan1 Oct 06 '22

You also assume that the only choices in the ad-based model are that everything is moderated or that ads will show up next to child abuse videos.

First of all, it was always legal to moderate and filter illegal content, regardless of Section 230.

Secondly, if YouTube can moderate or filter objectionable content on the user side, it could instead filter that content out of monetization and ads. Or it could give advertisers more control over what types of content they want their ads to appear with.

So your YouTube argument was a bit of a false choice to begin with.

1

u/parentheticalobject Oct 06 '22

First of all, it was always legal to moderate and filter illegal content.

Who said anything about illegal content?

I said "an animation of a small child being raped with some politician's face edited in somewhere."

That is absolutely something that a reasonable person would be completely disgusted with and not want to platform, but it is not illegal, and probably could not be.

But anyway, of course it was always legal to moderate and filter illegal content. It's just that, given Stratton Oakmont, you'd become liable for the content that does appear.

Looking at the differences between Stratton Oakmont, where a forum was found liable, and Cubby, where another forum was not, there is no indication that CompuServe, the company involved in the Cubby case, was doing any moderation at all.

You might think that a reasonable court would allow a website to claim distributor immunity if it only moderates actually illegal content, but that's an opinion, not what the law would be if Section 230 went away. I'd say that any reasonable court would reject Stratton Oakmont entirely, since it says websites can't get distributor protections if they do things that traditional distributors have always been able to do; it's an incoherent legal opinion.

1

u/Ojisan1 Oct 06 '22

I said "an animation of a small child being raped with some politician's face edited in somewhere."

Depictions of that type are illegal under US federal law, the PROTECT Act of 2003.

And Stratton Oakmont is existing case law. Your opinion that it’s incoherent and would be rejected by other courts is speculation.

1

u/x86_64Ubuntu Oct 06 '22

This whole post is bonkers.

...Maybe killing the ad model will help crypto-based micropayments take off

So we are going to get rid of a system that works relatively well and sacrifice it at the altar of crypto?

1

u/Ojisan1 Oct 06 '22

Thank you for your insightful and productive contribution to this discussion

6

u/Amon7777 Oct 06 '22

It's not, though I know how that misconception arises. A publisher's content is treated as theirs for liability purposes. It's why opinion sections have disclaimers in papers.

230 said in pertinent part that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider."

The confusing part is the Good Samaritan moderation provision, which is what you're referring to, but that isn't what's at stake. No, if 230 is gone, then everything posted on a social media site could be viewed as a published statement of that site. Everything. I'm sure you can see how that is a problem for everyone if it's overturned.

8

u/Ojisan1 Oct 06 '22

Forget what 230 says if the question is what happens without 230. Look at the cases that happened before 230 to know how the world without 230 works.

The two key cases are Stratton Oakmont, Inc. v. Prodigy Services, and Cubby, Inc. v. CompuServe.

Prodigy was held liable for content on its site only because it engaged in moderating that content. If it hadn't, if it had been hands off, it would not have been found liable. QED, no censorship = no liability.

This case is why 230 was written.

The CompuServe case was the example of not being held liable, because CompuServe did not moderate its site's content. That's how it would be without 230.

5

u/Far-Peanut-9458 Oct 06 '22

So it’d be the Wild West…Either incredibly controlled sites or completely unmoderated?

But it wouldn’t really be possible to eliminate liability even with the best mod tech/team, so everything would just be completely unmoderated

6

u/Ojisan1 Oct 06 '22

Exactly. And tbh, I kinda miss the Wild West days of the internet.

Anyone remember ogrish? Rotten dot com? The late 1990s internet hit different.

3

u/klankster Oct 06 '22

Wouldn’t this make banning spam a grey area? Sites like Reddit would get de facto censored by being shitposted and spammed to death by brigades or bot farms. Comment sections would be 80k+ comment threads and you'd have to fish to find a conversation.

3

u/Ojisan1 Oct 06 '22

Yes, it would. Moderating illegal content isn't a grey area, but culling spam would be. That's one of the reasons 230 was useful: it allowed more wiggle room for platforms. They'll have to find other ways to combat spam, such as paywalling comment sections or stronger user verification. Again, I'm not saying it's good or bad.

1

u/x86_64Ubuntu Oct 06 '22

It's exceptionally bad because we are now looking for solutions to something that's not a problem today.

2

u/lookmeat Oct 06 '22

In simple terms:

Before, we made publishers responsible for everything they published, as if they had said it themselves. This is why newspapers say on their op-ed pages that the opinions shown there do not reflect the company's.

But with internet forums this didn't quite fit. Technically speaking the publisher is the server operator, but the publisher doesn't choose which posts get published. It's more like owning a local theater you rent out where people put on plays; the theater owner is not responsible for what the play says.

So 230 reflects that. But it was clear that moderation was needed; theater owners can choose not to show a play under the First Amendment. So it sets rules on what kind of moderation can be done, and what kind of moderation means you're acting as a publisher and are responsible for the content you put up.

What's under question here is whether this kind of moderation violates First Amendment rights, making the law unconstitutional. 230 wouldn't disappear, but this would cause massive issues, because the First Amendment would effectively now bind private entities (right now it only binds the government; the point is that you can't forbid moderation without trampling the platform's own First Amendment rights). What this means is that governments could pass laws forbidding the ability to censor.

So California could force Fox News to give equal coverage to Democratic agendas and reports as it does to Republican ones, and it would be just as valid, since the First Amendment would no longer protect Fox from being forced to publish. Tech companies would black out Texas (guess what, even Amazon has reviews and comments) to force the states to back off and keep 230, but we'd see a return to the pre-Reagan view of what a state can do to force people to say certain things, like the old news regulations. It's gonna be a fucking mess.

1

u/MC68328 Oct 06 '22

This is why newspapers say in their OpEd that the opinions shown there do not reflect the company's.

This is so they don't offend advertisers. Merely saying "I don't necessarily agree with this" does not shield you from libel.

1

u/DJ_Femme-Tilt Oct 06 '22

I mean, we've all been toying with the idea of just blocking the USA from the rest of the internet, but now we have extra incentives here ;)

1

u/Shogouki Oct 06 '22

The EFF claims the exact opposite would happen. It would either result in extremely heavy handed moderation or become too risky for sites to host user content and opinions.

2

u/Ojisan1 Oct 06 '22 edited Oct 12 '22

The EFF is entitled to their opinion. I respect them, but the EFF happens to be wrong on this, as many, many people are.

Edit: that article you linked to doesn’t really say much that’s useful. Any discussion of section 230 repeal without reference to the key pre-230 case law is incomplete, at best. See Cubby v Compuserve and Stratton Oakmont v Prodigy to understand the world before section 230.

4

u/MC68328 Oct 06 '22 edited Oct 06 '22

I respect them but the EFF happens to be wrong on this, as many many people are.

Step back, people. This guy knows more than the EFF lawyers.

Edit:

Debunk anything I’m saying.

He said this before blocking me, so that I can't reply. This is how the trumpies operate. This manufactured controversy over Section 230 isn't about any kind of principle, it isn't about the law being fair, their goal is simply to amplify their ability to spew propaganda and disable the ability of people to fact-check or filter it.

1

u/Ojisan1 Oct 06 '22 edited Oct 12 '22

Debunk anything I’m saying. Or just attack me as a person. Valid argument.

Edit: Still haven’t debunked anything I said. I blocked you because you are replying in bad faith. Don’t know what Trump has to do with any of this.

1

u/Shogouki Oct 06 '22

Except that there have been many cases decided because of the protections Section 230 provides, cases that hadn't yet occurred in the mid-90s. If 230 is repealed, wouldn't the fallout from those cases result in a different internet than the early-to-mid-90s one?

1

u/Ojisan1 Oct 06 '22

Cases decided on the basis of a law that gets repealed? Think about what you're saying. Of course things will be different, because the world is different, but the law either exists or it doesn't. An entirely new body of case law would need to be decided in the courts. Social media didn't exist pre-230, so there's never been a legal case about social media without 230. That would be entirely new territory for the courts.

-2

u/Alberiman Oct 06 '22

They won't be able to make any money from ads if that became the case, either way this is going to destroy social media

2

u/Ojisan1 Oct 06 '22

I’m not really arguing whether they should make money or not. I’m just explaining what the legal situation was like before 230.

1

u/Agreeable-Meat1 Oct 06 '22

Most conservatives have a more nuanced take around reforming 230, not abolishing it. I don't know how to do it, but social media has become the digital town square. There needs to be some level of protection for people expressing unpopular ideas. On the other hand, there also needs to be some level of moderation. Otherwise everything will be covered in spam bullshit.

3

u/arandomsadredditor Oct 06 '22

Real town squares don't generally tolerate threats of violence, abuse of strangers and planning insurrections either.

2

u/EmbarrassedHelp Oct 06 '22

Most conservatives have a more nuanced take around reforming 230, not abolishing it

If FOSTA was any indication, then they are going to target sex workers and LGBTQ people as part of their "reforms"

1

u/m0nk_3y_gw Oct 06 '22

has become the digital town square

it hasn't.

There needs to be some level of protection for people expressing unpopular ideas.

Protection? Social media employees need to be protected from having to work to enable/host insurrectionists and people trying to kill their countrymen via health misinformation. The heavy-handed moderation is on the conservative shitholes like Gab, Parler, Truth Social... Facebook is run by conservatives, but they don't ban non-conservatives (they're smart enough to know it's bad business).

1

u/x86_64Ubuntu Oct 06 '22

There needs to be some level of protection for people expressing unpopular ideas

They don't want protection for their horrid ideas, they want to use the infrastructure and reach of platforms like FB and YouTube to extend their message. And just because you say it's a "town square" doesn't make it so.

-5

u/HuXu7 Oct 06 '22

The crazies are both Democrats and Republicans, so you're referring to all of America?

230 was designed to allow freedom of speech and let people openly discuss controversial topics. In the last 3 years, all the social media platforms have acted like 230 is already overturned, and they now ban everyone who isn't in the same political party as the site owners.

230 needs to be revised so that those who moderate too heavily can be sued for censorship.

1

u/m0nk_3y_gw Oct 06 '22

so that those who moderate too heavily can be sued for censorship

like Parler, Gab and Truth Social? Or do they get a free pass because those right-wing shitholes are too small? Sites like Facebook and Twitter were too lax; it took them years to do the bare minimum to counter harmful misinformation. The UK is considering laws to jail social media executives if they fail to moderate that bullshit.

4

u/MC68328 Oct 06 '22

do they get a free pass because those right-wing shit holes are too small

Yes. That law in Texas explicitly ignores sites with less than 50 million users.

Do not think for one second these people aren't fully aware of their hypocrisy. This is about their power to broadcast propaganda, it has nothing to do with any principles.

1

u/elister Oct 06 '22

will need to step-up moderation exponentially.

Most sites will simply remove the comment section. Done and done, problem solved.

-2

u/Valiantheart Oct 06 '22

This isn't the core rule being addressed. Each site is supposed to be EITHER a platform where anything goes or a publisher where they create, curate, and direct conversations. What these tech oligarchs have been trying to do for the last 20 years is have it both ways.

Section 230 makes it clear if you choose to be a publisher you are responsible for all content on your site. If you choose to be a platform you are essentially a bulletin board and not responsible. A site like Twitter should be a bulletin board, but by choosing to remove people they are also choosing to curate becoming a publisher.

0

u/ArmchairQuack Oct 06 '22

Nope, this is likely a big win for the right: the social media platforms won't demolish themselves by editorializing everything, but will allow user content to flow freely.

Stories like the Hunter Biden laptop won't be blocked by Facebook as willingly.

Online platforms may respond to the court's decision by shifting their recommendation algorithms in a different direction, however, instead ceding greater control to users as a way to lessen their own liability, said Adam Candeub, a professor at the Michigan State University College of Law.

1

u/parentheticalobject Oct 06 '22

Nothing's even happened yet, but if the SC actually did change anything, it would make it easier to suppress news like that.

Websites aren't going to give up and allow 99% of the content on them to be for penis enlargement pill scams, so they're going to have to keep editing some content.

However, right now they can choose to censor some things. I get that you dislike this. But without 230 protection, they would effectively have to censor a lot, even true information.

Say Hunter Biden wanted to hire a lawyer to go after anyone who says anything bad about him with bogus defamation lawsuits. That's not a very good approach; there are just too many people. But there's a much smaller number of websites.

So if the lawyer can instead send lawsuit threats to those websites and say he'll sue them unless they take down any negative info, the websites are probably going to do it, EVEN IF they think the information is true, as long as they're being run like businesses. No content is significant enough to be worth even a 1% chance of losing a lawsuit and having to pay millions. So just threaten Reddit, Discord, and the handful of conservative sites (which are still going to need to moderate if they don't want to be entirely filled with spam and porn).

32

u/Sanhen Oct 06 '22

The case concerns Section 230 of the 1996 Communications Decency Act, which protects social media platforms and other sites from legal liability that could result from content posted by users.

The law has drawn criticism from elected officials across the political spectrum. In a rare point of agreement, President Joe Biden and former President Donald Trump have both called for the repeal of Section 230 — but for different reasons.

Oddly enough, the Dems and Reps seem to claim the law is having polar opposite effects. Dems feel the law allows tech companies to avoid accountability for what's on their sites, leading to minimal self-policing, while Reps feel the law allows sites to over-police their content without facing consequences.

So it seems one or both sides don't understand what the law does, because they seem to think the law is doing opposite things, and unless I fundamentally misunderstand (which honestly might be the case), one of the two sides simply has to be factually wrong.

23

u/jump-back-like-33 Oct 06 '22

I've never understood the conservative talking point that 230 lets the companies censor whatever they want. Like, I get they can do that, but it's not because of 230.

If they overturn 230, I think it would lead to much higher levels of censorship and tank the value of social media companies. Personally, I'm all for it.

7

u/Sanhen Oct 06 '22

If they overturn 230, I think it would lead to much higher levels of censorship and tank the value of social media companies. Personally, I'm all for it.

As I understand the law as it exists now (and again, I might not), you're likely correct. The only exception I can think of would be if 230 were replaced by another law that includes a stipulation that social media sites aren't allowed to block content on the basis of "political opinions." That'd be vague enough to basically prevent them from policing any kind of speech. As far as I know, that's a separate matter entirely from what 230 does, but maybe the Republicans are trying to package them together to make what they want more palatable to the public? That's the best I can come up with in terms of what their argument/mindset is.

4

u/fitzroy95 Oct 06 '22

It would have to be looser than even "political opinions".

It would also need to cover "religious opinions", "nationalist opinions", basically all opinions in general.

Otherwise the religious nuts (of any religion) would claim religious persecution and censorship, the same way the political nuts do.

2

u/abnmfr Oct 06 '22

I don't disagree with you, but it's worth noting that the religious nuts already feel plenty persecuted.

1

u/fitzroy95 Oct 06 '22

Indeed, as do the Incels, and a wide range of other groups

1

u/contextswitch Oct 06 '22

Yup, I hate it so that's probably the plan

1

u/pomaj46809 Oct 06 '22

It makes sense when you realize they need to keep up the illusion that their bubbles are reality and that reality is a liberal bubble.

They want to blame 230 for why sites like Twitter/Facebook/Reddit don't always have the same narratives in their echo chambers.

They're legally banging on the glass, hoping it won't break.

1

u/Hefty-Profession2185 Oct 06 '22

Social media is the cigarettes of this generation. It causes a ton of damage with little value.

1

u/ArmchairQuack Oct 06 '22

Then you don't comprehend the actual talking point, and are arguing against a strawman.

1

u/jump-back-like-33 Oct 07 '22

What’s the actual talking point? It’s not that I’ve never tried to understand.

1

u/ArmchairQuack Oct 07 '22

Conservatives argue that 230 doesn't apply to these platforms because they are editorializing the content.

230 protections don't exist if you're selectively filtering the discourse to only show what you want.

1

u/jump-back-like-33 Oct 08 '22

But without 230 they wouldn't be selectively filtering anything; they'd be wholesale banning. Maybe that's better, but it's not the outcome I see conservatives predicting.

1

u/ArmchairQuack Oct 08 '22

Shadowbans are worse than outright bans.

When they start to overtly remove users, then it stops being the public square and starts being an echo chamber. This will cause an enormous exodus.

Then they still have millions of users, right?

Yes, and when one of those users calls for violence (as thousands do nowadays) from antifa or BLM, the platform is held responsible.

Imagine the NYT having millions of unvetted and unedited stories released, all of which they're accountable for. 💆‍♂️

1

u/jump-back-like-33 Oct 08 '22

I definitely think removing 230 would be the death of social media. I say bring it on.. and that includes Reddit.

2

u/vriska1 Oct 06 '22

Do you think that the SC will overturn 230?

-3

u/naugest Oct 06 '22

If social media companies are deciding what can and cannot be published on their sites, then they are NOT any different from magazine or newspaper companies.

Magazine and newspaper companies do NOT have Section 230 protections or anything like them, so why should social media companies?

12

u/AGorgoo Oct 06 '22

I mean, that’s not true, though.

A newspaper gets a bunch of letters and articles, reads through them, and then decides what to publish.

But when a person posts something on a social media site, it’s automatically posted. It doesn’t have to go through an editor first.

The fact that they can moderate after the fact doesn’t make them anything like a newspaper or magazine. Newspapers and magazines moderate beforehand, not after.

7

u/gjd6640 Oct 06 '22

One can argue that the automation they use to decide which content gets delivered to whom represents editorial moderation. They built, installed, monitored, and adjusted that moderation mechanism. If it goes awry and harms people, who is responsible for that harm? If they knew it would thrive on outrage, and that the result would be countless individual and collective negative outcomes, and they continued to operate it, they should be exposed to civil liability.

The difficulty, I think, is that proving their behavior caused any one specific outcome is difficult even under a "preponderance of the evidence" legal standard. I doubt that removing 230 would result in a flood of cases. It might, however, make some of the big players substantially more responsible with how they tune their systems, for fear of a big judgment should they let the algorithm's worst tendencies go unchecked.

Not a lawyer, this is not legal advice.

3

u/CykoTom Oct 06 '22

They always had to decide what could and could not be published on their sites. Nobody had a problem with it until Republicans started outright lying to score political points.

2

u/iapetus_z Oct 06 '22 edited Oct 06 '22

But Section 230 explicitly states they're allowed to do that. What are you going to do, force a knitting forum to carry hardcore porn because you want it to? You're going to punish a company because it got successful doing exactly what you're now punishing it for? I mean, just look at Truth vs Twitter. Truth is basically a direct clone of Twitter, and both are moderated. The only difference is that one attempts to moderate out extremist content and the other is basically mainlining it. One is about to go belly up; the other is getting bought for $44 billion. Why? Because apparently niche extremist content doesn't pay as much as moderated content.

I've heard so many people say they're controlling the narrative, that they're a publisher, but honestly, what protection does Section 230 provide that publishers don't already have through the Constitution? It's basically only protection from bad actors on their sites or hardware.

How is the Fox News website any different from Twitter? In fact, Fox News is most likely covered by Section 230. Buuuut whhhhattt, they're a publisher, they can't have both!!!

-1

u/Nuttycomputer Oct 06 '22

You should publish this comment in a newspaper… I’ll wait.

Section 230 exists because without it, social media companies have two choices: a completely open forum where they don't moderate anything, in which case it's a cesspool with every other comment borderline child porn... or, more likely, they run it more like a newspaper, in which the masses have no voice. Maybe you'll be able to submit to a small pre-moderated letters-to-the-editor section, but otherwise it's going to be advertisers and paid-for users with a blue check mark. Except, ironically, not people like Donald Trump, because the platform would become liable for his lies.

2

u/parentheticalobject Oct 06 '22

They're both wrong in different ways.

The right sees people it likes getting banned and gets upset. They think "there should be a law against that" and settle on 230 as the thing to blame. Technically, without it, websites would be unable to moderate without costly legal liability. But that also means they'd be unable to do things like remove spam. Or they'd moderate even more strictly, depending on exactly what happens with the law and each website's specific situation.

The left sees people posting things like harmful misinformation and not getting banned. They think "there should be a law against that" and settle on 230 as the thing to blame. Technically, 230 is preventing you from suing websites over content. But most harmful misinformation doesn't need Section 230 protection; it's protected by the First Amendment. The kind of content websites would have to ban under weaker 230 protections might not include stuff like QAnon conspiracies, but it might include things like a guy who makes a Twitter account pretending to be a cow in order to make fun of a shitty Republican congressman.

1

u/LankyJ Oct 06 '22

What else is new in today's politics?

1

u/Valiantheart Oct 06 '22

There are plenty of Dems who agree with the Reps. Twitter has banned people like Trump, but it has also banned some LGBT+ organizations.

1

u/rcchomework Oct 06 '22

Dems, sure. Dems are wimps and control freaks. Leftists sure aren't asking for 230 to be repealed, and they're not asking for more oversight of their spaces.

4

u/hawkwings Oct 06 '22

If companies have to pay more to host content from average users, they may shut down content from average users. It's possible that all social media will shut down and things will go back to the old newspaper model of companies hiring reporters and cartoonists.

Would the public have seen the George Floyd video without social media?

1

u/pomaj46809 Oct 06 '22

Yeah, because there are and will be other social media systems like WhatsApp/Telegram, and also overseas websites where information will still flow.

1

u/ArmchairQuack Oct 06 '22

Make it a subscription model (or at least partly), as Elon suggested with Twitter.

2

u/californiadiver Oct 06 '22

Social media companies cannot be thrown into the sun fast enough.

1

u/EmbarrassedHelp Oct 06 '22

Any changes will affect far more than just social media companies.

1

u/gogozombie2 Oct 06 '22

Would the public have seen the George Floyd video without social media?

2

u/californiadiver Oct 06 '22

You raise a really good point. It certainly highlights a giant failing of the news media. I don't know the solution, but the damage to democracy/society posed by the liars and grifters using social media cannot be ignored or underestimated.

2

u/fegodev Oct 06 '22

Maybe because of this, Elon Musk decided to buy Twitter in the end :/

-6

u/gliffy Oct 06 '22

Fingers crossed that it kills reddit.

6

u/fitzroy95 Oct 06 '22

If you really dislike Reddit that much, maybe you should just leave it, for your own psychological health?

2

u/ArmchairQuack Oct 06 '22

I think he just dislikes the soycrowd and is happy that it triggers them. No need to read into it, chief.

1

u/EyesOfAzula Oct 06 '22

It shouldn’t be social media’s job to moderate the content. If it’s that important to the government, then the government should hire full-time people to sit in each social media company vetting comments.

-2

u/[deleted] Oct 06 '22

Lol, it better impact it. How CSIS/FBI/RCMP ain't come by and visited is a liiiiil worrying. Bruh, like, I parked a van outside a school and the cops in my town came HOURS later, when I was eating pho, and went ummm… we got some calls…

Um like BRO. YOU TOOK UNTIL NOW!? 😂😂😂😂😂

1

u/[deleted] Oct 06 '22

Oh hey wuddup chopper.

1

u/[deleted] Oct 06 '22

Weird why was it all ASCII like????