this post was submitted on 18 Dec 2023
15 points (100.0% liked)

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


... while at the same time not really worth worrying about, so we should concentrate instead on unnamed alleged mid-term risks.

EY tweets are probably the lowest-effort sneerclub content possible, but the birdsite threw this in my face this morning, so it's only fair you suffer too. Transcript follows:

Andrew Ng wrote:

In AI, the ratio of attention on hypothetical, future, forms of harm to actual, current, realized forms of harm seems out of whack.

Many of the hypothetical forms of harm, like AI "taking over", are based on highly questionable hypotheses about what technology that does not currently exist might do.

Every field should examine both future and current problems. But is there any other engineering discipline where this much attention is on hypothetical problems rather than actual problems?

EY replied:

I think when the near-term harm is massive numbers of young men and women dropping out of the human dating market, and the mid-term harm is the utter extermination of humanity, it makes sense to focus on policies motivated by preventing mid-term harm, if there's even a trade-off.

[–] dgerard@awful.systems 7 points 11 months ago (1 children)

Way back in the Sequences days, Yudkowsky talked about memetic hazards wiping out Western Civilization; this is an old theme of his.

[–] gerikson@awful.systems 9 points 11 months ago* (last edited 11 months ago) (1 children)

So he wrote that in 2007. Since then, games have only gotten more immersive by his definition, so people dying of too much gaming should be a massive issue. As far as I know, it is not. People can fuck their lives up in other ways, but straight-up gambling is arguably worse, as it siphons off far more real money that could have gone to education, housing, etc.

Yud likes to argue from first principles (obviously) but doesn't reckon with social dynamics. If games were as bad as he describes, there would be regulation around them. Presumably, if AI girlfriends become a threat to future pension payments, they will be regulated too.

[–] dgerard@awful.systems 8 points 11 months ago* (last edited 11 months ago) (1 children)

See also the similar deleterious social effects of chess addiction throughout history (mostly addressed as part of bans on gambling).

[–] gerikson@awful.systems 11 points 11 months ago (1 children)

Their crippling addiction

My worthy pastime

[–] locallynonlinear@awful.systems 7 points 11 months ago (2 children)

Completely unrelated, but every time I see your avatar in its tiny minimized form I see Squidward's face, and then your comments get 20% more amusing.

[–] 200fifty@awful.systems 4 points 11 months ago (1 children)

Oh man, I won't be able to unsee this, lol

[–] gerikson@awful.systems 4 points 11 months ago (1 children)

Me neither. Poor Elden Ring jellyfish!

[–] Soyweiser@awful.systems 4 points 11 months ago

Do you feel attacked? Because now is the time to switch to the red jellyfish.

[–] BernieDoesIt@kbin.social 4 points 11 months ago

Today I learned that gerikson's avatar isn't Squidward.