this post was submitted on 08 Sep 2023

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]

 

Choice quote:

Putting “ACAB” on my Tinder profile was an effective signaling move that dramatically improved my chances of matching with the tattooed and pierced cuties I was chasing.

top 5 comments
[–] outer_spec@lemmy.blahaj.zone 3 points 10 months ago
[–] swlabr@awful.systems 1 point 1 year ago

What I’m hearing is: guy gains wealth and cultural capital, realises he is steadily approaching the echelon of society that the police actually protect, decides they are not so bad after all, and now wants privatisation of police so that he can build a personal army.

Like sure it can be argued that policing as it is right now has some benefit to social order, and it definitely can be argued that the situation can be improved. On the latter point, calls to defund/abolish the police are a valid means to that end. Yet OP for whatever reason has decided that the only real solution is literally that libertarian cop copypasta.

[–] Gradually_Adjusting@lemmy.world 0 points 1 year ago (1 children)

Less Wrong introduced me to a lot of interesting ideas, like applying Bayesian reasoning to beliefs and making your beliefs "pay rent", but I'm not in love with the Sam Bankman-Fried of it all.

[–] zogwarg@awful.systems 0 points 1 year ago (1 children)

It’s not Bayesian reasoning without actual math, and many beliefs are not so easily quantified under any statistical framework.

All it really offers is unwarranted confidence in one’s own rationality, often used in these circles to cloak nauseating positions.

Making ideas « pay rent » is, in these circles, also used for black-pilling people into rejecting common sense and humanity. It’s good to be skeptical of new ideas or new claims, and it’s even good to analyze and synthesize your own beliefs, but it’s a bit dangerous to say that every belief is negotiable, and to give others the tools to mould them (here be cult dragons).

Despite its flaws, the opening of the US Declaration of Independence is a good example (not American myself): « we hold these truths to be self-evident, that all [people] are […] equal […] with certain unalienable rights »

You can’t get morals from stats, and some core ideals ought to live in your mind rent-free.
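(The « actual math » in question is just Bayes’ rule. A minimal sketch of what a genuinely quantified belief update looks like — the scenario and all the numbers here are invented purely for illustration:)

```python
# Bayes' rule: P(H | E) = P(E | H) * P(H) / P(E)
# Hypothetical example: updating the belief "this coin is biased
# towards heads" after seeing a single heads.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    # Marginal likelihood P(E) = P(E|H)P(H) + P(E|~H)P(~H)
    marginal = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / marginal

# Prior belief the coin is biased: 10%.
# A biased coin shows heads 90% of the time; a fair one, 50%.
posterior = bayes_update(0.10, 0.90, 0.50)
print(round(posterior, 3))  # 0.167 — one heads barely moves the needle
```

Note how the update requires three explicit numbers; without them, "I updated my priors" is just vibes dressed up in statistical vocabulary — which is the point being made above.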

I haven't talked to these people on any regular basis, other than once attending one of their weird little seminars at someone's house, so I've had little to no experience with how they apply these ideas. I just read the blog posts and mulled them over on my own for a while, and my main takeaway was something like "change your mind incrementally when you get new evidence".

That said, everything you're saying does ring true, and I've been changing my mind about Yudkowsky and his ilk pretty gradually for a number of years. Hearing that he's in with dudes like SBF has made me ready to fully distance myself from their stuff now that I know what they get up to.

I never heard of applying the ad hoc Bayesian thing to moral stances. I'd only ever applied it to questions of fact. Creepy to think where that leads.

Thanks for chatting with me about this, it's been helpful.