this post was submitted on 28 Feb 2024

SneerClub


Hurling ordure at the TREACLES, especially those closely related to LessWrong.

AI-Industrial-Complex grift is fine as long as it sufficiently relates to the AI doom from the TREACLES. (Though TechTakes may be more suitable.)

This is sneer club, not debate club. Unless it's amusing debate.

[Especially don't debate the race scientists, if any sneak in - we ban and delete them as unsuitable for the server.]


In her sentencing submission to the judge in the FTX trial, Barbara Fried argues that her son is just a misunderstood altruist, who doesn't deserve to go to prison for very long.

Excerpt:

One day, when he was about twelve, he popped out of his room to ask me a question about an argument made by Derek Parfit, a well-known moral philosopher. As it happens, I am quite familiar with the academic literature Parfit's article is a part of, having written extensively on related questions myself. His question revealed a depth of understanding and critical thinking that is not all that common even among people who think about these issues for a living. "What on earth are you reading?" I asked. The answer, it turned out, was that he was working his way through the vast literature on utilitarianism, a strain of moral philosophy that argues that each of us has a strong ethical obligation to live so as to alleviate the suffering of those less fortunate than ourselves. The premises of utilitarianism obviously resonated strongly with what Sam had already come to believe on his own, but gave him a more systematic way to think about the problem and connected him to an online community of like-minded people deeply engaged in the same intellectual and moral journey.

Yeah, that "online community" we all know and love.

200fifty@awful.systems:

each of us has a strong ethical obligation to live so as to alleviate the suffering of those less fortunate than ourselves

Sounds like he did a bad job at living up to those principles then, huh?

Also is it just me or is this not actually a very good description of utilitarian beliefs lol

skillissuer@discuss.tchncs.de:

nuh uh, you see, he was scamming the poors because he would use their money in most efficientest way to prevent robot overlords. if scamming the poors is wrong then why did invisible hand of free market made it so easy?

It doesn’t do a bad job of cashing out a fairly strong corollary of utilitarianism, one generally taken to be characteristic of any utilitarian theory worth its salt: since each of us is only one person, and the utilitarian calculus calls for us to maximise happiness (or similar), each of us bears moral weight equal to only one (presumably equal-sized) fraction of that whole, so our obligations to others (insofar as the happiness of others obliges us) swamp our own personal preferences. Furthermore, since suffering is very bad (even without being a negative utilitarian), the alleviation of suffering becomes a particularly powerful such obligation once our responsibilities to each individual sufferer are counted up.

This is generally taken to be sufficiently characteristic of utilitarianism that objections against utilitarianism frequently cite this “demandingness” as an implausible consequence of any moral theory worth having.

So in isolation it makes some sense as shorthand for a profound consequence of utilitarianism-the-theory, one which utilitarians themselves frequently stand up as a major advantage of their position, even as opponents of utilitarianism stand it up as being “too good” to be a practical theory of action.

In reality it’s a poor description of utilitarian beliefs, as you say, because the theory is not the person, and utilitarians are, on average, slightly more petty and dishonest than the average person who just gives away something to Oxfam here and there.