this post was submitted on 23 Oct 2023
1272 points (96.0% liked)

4chan

Greentexts, memes, everything 4chan.

[–] Kase@lemmy.world 1 points 1 year ago (1 children)

Jesus christ that's dystopian

[–] SCB@lemmy.world 2 points 1 year ago (2 children)

It's not so much dystopian as it is just buggy software

[–] Kase@lemmy.world 2 points 1 year ago (3 children)

Ah ok. I don't know much about it, but I've heard that AI can sometimes be biased against commonly discriminated-against groups because the data it's trained on is. (Side note: is that true? Someone please correct me if it's not.) I jumped to the conclusion that this was the same thing. My bad

[–] adrian783@lemmy.world 3 points 1 year ago

What it did was expose just how much inherent bias there is in hiring, even from name and gender alone.

[–] SCB@lemmy.world 3 points 1 year ago

That is both true and pivotal to this story

It's a major hurdle in some uses of AI

[–] TAG@lemmy.world 1 points 1 year ago

An AI is only as good as its training data. If the data is biased, then the AI will have the same bias. The fact that attending a women's college was treated as a negative (and not simply marked down as an education of unknown quality) is evidence against an idea many in STEM hold (myself included): that there is a lack of qualified female candidates rather than an active bias against them.
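
To illustrate the point (a minimal sketch with made-up data, not the actual system): if the historical hiring decisions a model is trained on were biased against a proxy feature like "attended a women's college", the model learns to penalize that feature even though gender never appears in the input.

```python
# Minimal sketch (synthetic data, hypothetical feature names): a classifier
# trained on biased historical hiring decisions reproduces that bias.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Synthetic candidates: years of experience, a skill score, and a flag
# for "attended a women's college" (a proxy for gender in resume text).
experience = rng.normal(5, 2, n)
skill = rng.normal(0, 1, n)
womens_college = rng.integers(0, 2, n)

# Biased historical labels: equally skilled candidates were hired less
# often when the women's-college flag was set.
logit = 0.5 * skill + 0.3 * (experience - 5) - 1.0 * womens_college
hired = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([experience, skill, womens_college])
model = LogisticRegression().fit(X, hired)

# The learned coefficient on the proxy feature comes out strongly negative:
# the model inherits the bias baked into its training data.
print(dict(zip(["experience", "skill", "womens_college"],
               model.coef_[0].round(2))))
```

Running it prints a clearly negative weight on the womens_college flag, which is exactly the "garbage in, garbage out" problem: nothing in the code is malicious, the bias comes entirely from the labels.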

[–] matter@lemmy.world 1 points 1 year ago (1 children)

When buggy software is used by unreasonably powerful entities to practise (and defend) discrimination, that's dystopian...

[–] SCB@lemmy.world 2 points 1 year ago

Except it wasn't actually launched, and they didn't defend its discrimination but rather ended the project.