This post was submitted on 17 Aug 2023
816 points (95.9% liked)

Technology


YouTube and Reddit are sued for allegedly enabling the racist mass shooting in Buffalo that left 10 dead

The complementary lawsuits claim that the massacre in 2022 was made possible by tech giants, a local gun shop, and the gunman’s parents.

top 50 comments
[–] Otkaz@lemmy.world 148 points 1 year ago (65 children)

It used to be video games and movies taking the blame. Now it's websites. When are we going to decide that people are just batshit crazy and guns need some form of regulation?

[–] SocialMediaRefugee@lemmy.world 71 points 1 year ago (7 children)

Because every gun owner thinks they are "the good guys"

[–] some_guy@lemmy.sdf.org 28 points 1 year ago

Because every gun owner thinks they are “the good guys”

Just wait till I use my gun to save a bunch of lives. Then you'll see that I'm a hero. /s

[–] ZombieZookeeper@lemmy.world 11 points 1 year ago (3 children)

And most of them fantasize about killing liberals and BLM protesters.

[–] DarkWasp@lemmy.world 22 points 1 year ago

I can see the nuance in the argument that an unmoderated online community could be using an algorithm to group these violent people together and amplify their views. The same can't really be said for most other platforms. Threats of violence written on the internet should still be taken seriously, especially if they were later acted upon. I don't disagree with you that there's a lot of batshit crazy out there, though.

[–] TacoButtPlug@sh.itjust.works 102 points 1 year ago (1 children)

Idk about this suit, but let's not forget that Facebook did in fact help get a fascist elected president.

https://www.wired.com/2016/11/facebook-won-trump-election-not-just-fake-news/

[–] NuPNuA@lemm.ee 85 points 1 year ago (23 children)

It's bizarre looking at this from the outside and seeing Americans trying to blame everything but the availability of guns for shootings.

[–] GiddyGap@lemm.ee 31 points 1 year ago (1 children)

Many Americans will sacrifice a lot for their guns, including school children and the ability to live in a safe society.

[–] NuPNuA@lemm.ee 25 points 1 year ago (1 children)

Coming from a country that had a couple of school shootings, decided it wasn't worth the risk, and had everyone hand in their guns with little complaint, I find it hard to comprehend.

[–] Bop@lemmy.film 19 points 1 year ago

It's hard to comprehend from the inside. This country is full of traumatizing shit that's really hard to face.

[–] ArbitraryValue@sh.itjust.works 19 points 1 year ago* (last edited 1 year ago) (1 children)

Well, even Americans without guns are much more violent than people in other first-world countries. Our non-gun homicide rate is higher than the total homicide rate in (for example) France or Germany.

There's an interesting discussion of the statistics here.

So my interpretation is that gun control is likely to reduce the murder rate, but the change will not be nearly as dramatic as many gun-control supporters seem to expect. Guns aren't most of the problem.

[–] SocialMediaRefugee@lemmy.world 10 points 1 year ago

It is a quasi-religious thing. They would rather risk their kids dying than even accept the most basic regulations.

[–] JustZ@lemmy.world 79 points 1 year ago (4 children)

Fantastic. I've been waiting to see these cases.

Start with a normal person, get them all jacked up on far-right propaganda, and then they go kill someone. If the website knows people are being radicalized into violent ideologies and does nothing to stop it, that's a viable claim for wrongful death. It's about foreseeability and causation, not about who did the shooting. There are really a lot of people coming into this thread who obviously have no legal experience.

[–] sturmblast@lemmy.world 38 points 1 year ago (8 children)

I just don't understand how hosting a platform to allow people to talk would make you liable since you're not the one responsible for the speech itself.

[–] theluddite@lemmy.ml 45 points 1 year ago (6 children)

Is that really all they do, though? That's what they've convinced us they do, but everyone on these platforms knows how crucial it is to tweak your content to please the algorithm. They also do everything they can to become monopolies, without which it wouldn't even be possible to start on DIY videos and end on white supremacy or whatever.

I wrote a longer version of this argument here, if you're curious.

[–] Pyr_Pressure@lemmy.ca 14 points 1 year ago

I agree to a point, but think that depending on how things are structured on the platform side they can have some responsibility.

Think of Facebook. They have algorithms which make sure you see what they think you want to see. It doesn't matter if that content is hateful and dangerous; they will push more of it onto a damaged person and stoke the fires simply because they think it will make them more advertising revenue.

They should be screening that content and making it less likely for anyone to see it, let alone damaged people. And I guarantee you they know which of their users are damaged people just from comment and search histories.

I'm not sure if Reddit works this way; because of the upvote and downvote system, it may be more the users who decide the content you see. But Reddit has communities it can keep a closer eye on to prevent hateful and dangerous content from being shared.

[–] CaptainAniki@lemmy.flight-crew.org 11 points 1 year ago (2 children)

Because you are responsible for hiring psychologists to tailor a platform to boost negative engagement, and now there will be a court case to determine culpability.

[–] SCB@lemmy.world 16 points 1 year ago (6 children)

a viable claim for wrongful death

Something tells me you're not a lawyer.


Fuck reddit, but that's bs.

[–] SCB@lemmy.world 37 points 1 year ago

YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.

Yeah this is going nowhere.

[–] foggy@lemmy.world 33 points 1 year ago

The algorithm feeds on fear and breeds anger. This much is true.

[–] SocialMediaRefugee@lemmy.world 30 points 1 year ago (2 children)

Say what you want about YouTube and Reddit, but if you want them to censor more and more, you are creating a sword that can be used against you too. I also don't like the idea of shooting the messenger, no matter how much we may dislike the messages. When I hear lawsuits like this, I always think it's greedy lawyers pushing people to sue because they see deep pockets.

[–] TyrionsNose@lemmy.world 12 points 1 year ago (6 children)

Right, so then they should be operated as a public telecom and regulated under Title II. This would allow them to be free from such lawsuits.

However, they want to remain private, for-profit companies, so they should be held responsible for not acting responsibly.

[–] some_guy@lemmy.sdf.org 25 points 1 year ago (2 children)

The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

This seems like the only part of the suits that might have traction. All the other bits seem easy to dismiss. That's not a statement on whether others share responsibility, only on what seems legally actionable in the US.

[–] honey_im_meat_grinding@lemmy.blahaj.zone 25 points 1 year ago* (last edited 1 year ago) (7 children)

The article doesn't really expand on the Reddit point: apart from the weapon-trading forum, it's about the shooter being a participant in PoliticalCompassMemes, a right-wing subreddit. After the shooting, the Reddit admins made a weak threat towards the mods of PCM, prompting the mods to sticky a "stop being so racist or we'll get deleted" post that itself contained loads of examples of the kinds of racist dog whistles users needed to stop using.

I don't imagine they'll have much success against Reddit in this lawsuit, but Reddit is aware of PCM and its role, and the subreddit continues to thrive to this day.

[–] mister_monster@monero.town 24 points 1 year ago (2 children)

They're just throwing shit at the wall to see what sticks, hoping to get some money. Suing Google for delivering search results? It shows how ridiculous blaming tools is. The only person liable here is the shooter.

[–] joe@lemmy.world 30 points 1 year ago* (last edited 1 year ago)

Well, maybe. I want to be up-front that I haven't read the actual lawsuit, but it seems from the article that the claim is that YouTube and Reddit both have an algorithm that helped radicalize him:

YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack. Similarly, the lawsuits claim Reddit promoted extreme content and offered a specialized forum relating to tactical gear.

I'd say that case is worth pursuing. It's long been known that social media companies tune their algorithms to increase engagement, and that pissed off people are more likely to engage. This results in algorithms that output content that makes people angry, by design, and that's a choice these companies make, not "delivering search results".
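
To make that concrete, here is a minimal, purely hypothetical sketch of what "ranking for engagement" can look like. The Post fields, weights, and function names are invented for illustration; this is not any platform's actual code. Note that nothing in it ever asks what the content says, only how strongly people are predicted to react to it:

```python
# Hypothetical sketch of an engagement-optimized feed ranker.
# Field names and weights are invented for illustration only.
from dataclasses import dataclass

@dataclass
class Post:
    post_id: str
    predicted_click_rate: float    # model's estimate the user will click
    predicted_comment_rate: float  # model's estimate the user will comment
    predicted_share_rate: float    # model's estimate the user will share

def engagement_score(post: Post) -> float:
    # Comments and shares are weighted more heavily here because they
    # keep users on the site longer; the weights are made up.
    return (1.0 * post.predicted_click_rate
            + 3.0 * post.predicted_comment_rate
            + 5.0 * post.predicted_share_rate)

def rank_feed(posts: list[Post]) -> list[Post]:
    # Nothing in this ranking checks what a post says, only how strongly
    # people are predicted to react to it; if outrage drives reactions,
    # outrage rises to the top by construction.
    return sorted(posts, key=engagement_score, reverse=True)
```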

[–] dublet@lemmy.world 15 points 1 year ago (13 children)

The only person liable here is the shooter.

On the very specific point of liability, while the shooter is the one who pulled the trigger, is there no liability for those who radicalised the person into becoming a shooter? If I were selling foodstuffs that poisoned people, I'd be held to account by various regulatory bodies, yet pushing out material that poisons people's minds goes for the most part unpunished. If a preacher at a local religious centre were advocating terrorism, they'd face charges.

The UK government has a whole ream of context about this: https://assets.publishing.service.gov.uk/government/uploads/system/uploads/attachment_data/file/97976/prevent-strategy-review.pdf

Google's "common carrier" type of defence only takes you so far, as it's not a purely neutral party: it "recommends", rather than merely "delivers results", as @joe points out. That recommendation should come with some editorial responsibility.

[–] autotldr@lemmings.world 16 points 1 year ago

This is the best summary I could come up with:


YouTube, Reddit and a body armor manufacturer were among the businesses that helped enable the gunman who killed 10 Black people in a racist attack at a Buffalo, New York, supermarket, according to a pair of lawsuits announced Wednesday.

The complementary lawsuits filed by Everytown Law in state court in Buffalo claim that the massacre at Tops supermarket in May 2022 was made possible by a host of companies and individuals, from tech giants to a local gun shop to the gunman’s parents.

The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.

“We aim to change the corporate and individual calculus so that every company and every parent recognizes they have a role to play in preventing future gun violence,” said Eric Tirschwell, executive director of Everytown Law.

Last month, victims’ relatives filed a lawsuit claiming tech and social media giants such as Facebook, Amazon and Google bear responsibility for radicalizing Gendron.


I'm a bot and I'm open source!

[–] TIEPilot@lemmy.world 15 points 1 year ago* (last edited 1 year ago) (6 children)
  • RMA Armament is named for providing the body armor Gendron wore during the shooting.

No, he bought it.

  • Vintage Firearms of Endicott, New York, is singled out for selling the shooter the weapon used in the attack.

Not their issue; he passed the background check.

  • The lawsuit claims Mean LLC manufactured an easily removable gun lock, offering a way to circumvent New York laws prohibiting assault weapons and large-capacity magazines.

Any knob w/ a Dremel can make a gun full auto, let alone defeat a mag lock. And he broke NY law doing this.

  • YouTube, named with parent companies Alphabet Inc. and Google, is accused of contributing to the gunman’s radicalization and helping him acquire information to plan the attack.

This is just absurd.

My guess is they are hoping for settlements rather than going to trial, where they'd lose.

[–] elscallr@lemmy.world 10 points 1 year ago

This is really, really stupid.

[–] iHUNTcriminals@lemm.ee 9 points 1 year ago* (last edited 1 year ago) (4 children)

The village (community/lack of community) makes the villains. Everyone's a problem. We are all to blame.
