this post was submitted on 02 Feb 2024
216 points (98.2% liked)

[–] FiskFisk33@startrek.website 43 points 9 months ago* (last edited 9 months ago) (3 children)

Body camera video equivalent to 25 million copies of “Barbie”

Literally anything but the metric system
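
For anyone who'd rather have it in plain old hours: a back-of-the-envelope conversion, assuming the headline is comparing total runtime and that Barbie (2023) runs roughly 114 minutes (both assumptions are mine, not the article's):

```python
# Rough conversion of "25 million copies of Barbie" into ordinary units of time.
# Assumes the comparison is by runtime; if the article meant storage size, this doesn't apply.
BARBIE_RUNTIME_MIN = 114          # approximate runtime of Barbie (2023), in minutes
copies = 25_000_000

total_minutes = copies * BARBIE_RUNTIME_MIN
total_hours = total_minutes / 60
total_years = total_hours / 24 / 365.25

print(f"{total_hours:,.0f} hours (~{total_years:,.0f} years of footage)")
# -> 47,500,000 hours (~5,418 years of footage)
```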

[–] rickyrigatoni@lemm.ee 13 points 9 months ago (1 children)

it's called SEO and it's an art

[–] FiskFisk33@startrek.website 5 points 9 months ago (1 children)

SEO is the bane of the internet. SEO is why I have to scroll through a novella every time I want to check out a recipe.

[–] maness300@lemmy.world 1 points 9 months ago (1 children)

Oh that's so fucking annoying, but I also think it's part of the culture among those who typically submit recipes online.

[–] FiskFisk33@startrek.website 1 points 9 months ago

it may have started that way, but now it's all SEO

[–] a1studmuffin@aussie.zone 5 points 9 months ago

We don't even need to choose! Just use hours, months, years, decades! But no, Barbie movies.

[–] Everythingispenguins@lemmy.world 4 points 9 months ago* (last edited 9 months ago) (2 children)

There is no common metric measure of time.

Edit -common

[–] SkybreakerEngineer@lemmy.world 5 points 9 months ago

I'm sorry, can you restate that in terms of the number of hyperfine transitions of the ground state of a Cs-133 atom?

[–] TonyTonyChopper@mander.xyz 3 points 9 months ago (1 children)
[–] Everythingispenguins@lemmy.world 2 points 9 months ago (2 children)

Yes it is, but SI is not all metric. Metric is fundamentally a base-10 system; time is base 60. You can probably thank the ancient Sumerians for that, though there's some debate.

At one point the French tried to make metric time a thing but it didn't stick.

https://en.m.wikipedia.org/wiki/Decimal_time
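
For reference, decimal time split the day into 10 hours of 100 minutes of 100 seconds, i.e. 100,000 decimal seconds per day. A minimal sketch of the conversion (the function name and example are made up for illustration):

```python
from datetime import datetime

def to_decimal_time(dt: datetime) -> tuple[int, int, int]:
    """Convert an ordinary clock time to French decimal time:
    10 decimal hours/day, 100 decimal minutes/hour, 100 decimal seconds/minute."""
    seconds_into_day = dt.hour * 3600 + dt.minute * 60 + dt.second
    # Scale the elapsed fraction of the day to 100,000 decimal seconds.
    decimal_seconds = seconds_into_day / 86_400 * 100_000
    dec_h, rest = divmod(int(decimal_seconds), 10_000)  # 10,000 decimal seconds per decimal hour
    dec_m, dec_s = divmod(rest, 100)
    return dec_h, dec_m, dec_s

print(to_decimal_time(datetime(2024, 2, 2, 12, 0, 0)))  # noon -> (5, 0, 0)
```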

[–] Evkob@lemmy.ca 34 points 9 months ago (3 children)

Body camera video equivalent of 25 million copies of "Barbie"

Is this a typical unit of measurement in journalism? Like what even is this? Crappy in-article advertising? Some weird SEO shit? An odd attempt to be cool and hip?

[–] Darkassassin07@lemmy.ca 20 points 9 months ago (2 children)

It's America; anything but metric.

[–] AtmaJnana@lemmy.world 2 points 9 months ago (1 children)

And which "metric" unit of time measurement do you prefer?

[–] Darkassassin07@lemmy.ca 3 points 9 months ago (1 children)
[–] AtmaJnana@lemmy.world 3 points 9 months ago* (last edited 9 months ago)

I prefer seconds since 00:00:00 on Jan 1st, 1970
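
That's just the Unix epoch, and pretty much every language will hand it to you directly; a quick Python example, purely for illustration:

```python
import time
from datetime import datetime, timezone

now = time.time()  # seconds since 1970-01-01 00:00:00 UTC
print(int(now))
print(datetime.fromtimestamp(now, tz=timezone.utc))  # the same instant as a calendar date
```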

[–] Misconduct@lemmy.world 1 points 9 months ago (1 children)

Ah yes, the metric measurement of time. My favorite.

[–] drbluefall@toast.ooo 5 points 9 months ago
[–] HawlSera@lemm.ee 1 points 9 months ago

It's the kind of phrasing you use when you're paid for how long an article is, but not how good it is.

[–] octopus_ink@lemmy.ml 31 points 9 months ago

That sounds like a big investment to find no wrongdoing by officers.

[–] OmnislashIsACloudApp@lemmy.world 24 points 9 months ago (1 children)

oh great I'm sure the training for this will not result in a bunch of things getting "reviewed" and no one being responsible for mistakes at all...

[–] 1984@lemmy.today 1 points 9 months ago

Sounds like humans, so I guess it's AI progress? :p

[–] DontMakeMoreBabies@kbin.social 21 points 9 months ago (5 children)

Would you rather these things never be reviewed? Isn't something better than nothing?

You'll literally never be able to afford (or hire) enough people to review the data they are taking in...

I mean unless we start killing billionaires and taking their shit.

[–] otter@lemmy.ca 9 points 9 months ago

Yea I share the same concerns about the "AI", but this sounds like a good thing. It's going through footage that wasn't going to be looked at (because there wasn't a complaint / investigation), and it's flagging things that should be reviewed. It's a positive step

What we should look into for this program is:

  • how the flags are being set, and what kind of interaction will warrant a flag
  • what changes are made to training as a result of this data
  • how privacy is being handled, and where the data is going (e.g. don't use this footage to train some model, especially because not every interaction happens out in public)
[–] Darkassassin07@lemmy.ca 6 points 9 months ago (1 children)

Make it publicly accessible. It'll most certainly get watched and problems will be reported to be investigated further.

[–] Hawk@lemmy.dbzer0.com 3 points 9 months ago

Corporations would be delighted to analyze all this footage.

[–] Rivalarrival@lemmy.today 2 points 9 months ago

File a complaint, and you get to view the video. If nobody files a complaint, there is no need to view the video.

Indeed, nobody should be looking at the video unless a complaint is filed.

[–] PhlubbaDubba@lemm.ee 1 points 9 months ago

Well I mean you could rig the cameras to turn on when the cop gets out of their car, breaking the footage into specific encounters where the cop had to interact with someone. Identify the files by the date, time, and badge number of the cop the camera is assigned to, and now you've got an easy-to-search database of footage for whenever an incident is reported, either by the cop (because they had to file paperwork for it) or by whoever they were interacting with (because they want to lodge a complaint).

While randomly selecting files not involved in an ongoing investigation as potential training material could be helpful, we don't actually HAVE to have a dedicated reviewer scanning for bad behaviour or material relevant to investigations, since in both cases someone is already incentivized to start the process that pulls the relevant footage anyway.
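
A minimal sketch of the kind of index that comment describes, with every badge number, timestamp, and path invented for illustration:

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Clip:
    badge: str        # officer the camera is assigned to
    start: datetime   # when the camera switched on (e.g. officer left the vehicle)
    path: str         # where the raw footage lives

# Toy index: nothing gets reviewed proactively; clips are only pulled when a
# report or complaint names an officer and a date.
index = [
    Clip("4521", datetime(2024, 2, 2, 14, 30), "footage/2024-02-02_1430_4521.mp4"),
    Clip("4521", datetime(2024, 2, 3, 9, 10), "footage/2024-02-03_0910_4521.mp4"),
]

def clips_for_complaint(badge: str, day: datetime) -> list:
    return [c for c in index if c.badge == badge and c.start.date() == day.date()]

print(clips_for_complaint("4521", datetime(2024, 2, 2)))
```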

[–] terminhell@lemmy.world 20 points 9 months ago (3 children)

What if all the cam footage was just uploaded to something like YouTube? Publicly visible to, ya know, the very citizens that pay for it and that it's supposed to work for...

[–] Hawk@lemmy.dbzer0.com 4 points 9 months ago (1 children)

Wouldn't that be a huge privacy issue?

[–] lir@lemmy.world 1 points 9 months ago

The police are already a huge rights issue when they're acting without oversight

[–] ech@lemm.ee 12 points 9 months ago

Ah, good. I had "racist profiling ~~AI~~LLM" on my 2024 bingo card

[–] Darkassassin07@lemmy.ca 10 points 9 months ago

Yes, because AI has a firm grasp on nuanced topics like law enforcement and civilian/human rights...

You may as well play the video to an empty room.

[–] themurphy@lemmy.world 7 points 9 months ago* (last edited 9 months ago) (3 children)

ITT: People who are scared of things they don't understand, which in this case is AI.

In this case, the "AI" program is nothing more than pattern recognition software setting a timestamp where it believes there's something to be looked at. Then an officer can take a look.

It saves so much time, and it filters out anything irrelevant. But be careful, because it's labelled "AI". Scary.

EDIT: The replies to this comment confirm that you don't understand AI, because if you did, you'd know that this system scanning video is not an LLM (large language model). It's not even the same kind of system at its core.
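
As described, the workflow amounts to a classifier scoring segments of video and handing the timestamps that cross a threshold to a human. A minimal sketch of that flag-for-review loop (the scores, threshold, and segment length are placeholders, not the actual system):

```python
# Dummy per-segment scores: the model's estimate that a segment needs a human look.
SEGMENT_SECONDS = 30
THRESHOLD = 0.8

def flag_segments(scores, threshold=THRESHOLD):
    """Return the start time (in seconds) of every segment whose score crosses the threshold."""
    return [i * SEGMENT_SECONDS for i, s in enumerate(scores) if s >= threshold]

scores = [0.10, 0.05, 0.92, 0.30, 0.85]
print(flag_segments(scores))  # -> [60, 120]  (an officer reviews only these two spots)
```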

[–] Voroxpete@sh.itjust.works 11 points 9 months ago* (last edited 9 months ago) (1 children)

This is an astonishingly bad take.

Almost every AI system is a black box. Even if you open source the code and the training data, it's almost impossible to know anything about the current state of a machine learning model.

So the entire premise here is that a completely unaccountable system - whose decisions are basically impossible to understand or scrutinize - gets to decide what data is or isn't relevant.

When an AI says "No crime spotted here", who even gets to know that it did that? If a human is reviewing all of the footage, then why have the AI? You're doing the same amount of human work anyway. So as soon as you introduce this system, you remove a huge amount of human oversight and replace it with decisions made by a completely unaccountable system - decisions that dramatically affect human lives, potentially life-or-death ones if it's the difference between a bad cop being taken off the street or not.

Who's to say whether the training data fed into this system results in it, say, becoming effectively blind to police violence against black people?

And if that doesn't scare you, it absolutely should.

[–] Killing_Spark@feddit.de 10 points 9 months ago

It's also potentially skipping some of the parts that should be looked at. It depends on the training set.

[–] fluxion@lemmy.world 4 points 9 months ago

It's not that AI is scary, it's that AI is dumb as fuck.

[–] badbytes@lemmy.world 6 points 9 months ago

And I'm sure the criminal acts by police will get filtered out.

[–] HawlSera@lemm.ee 3 points 9 months ago

I wonder if it's one of those AIs that can't see darker skin colors...
