this post was submitted on 17 Feb 2025

TechTakes


Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community


Need to let loose a primal scream without collecting footnotes first? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned soo many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be)

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

[–] dgerard@awful.systems 24 points 3 days ago

https://mastodon.gamedev.place/@lritter/114001505488538547

master: welcome to my Smart Home

student: wow. how is the light controlled?

master: with this on-off switch

student: i don't see a motor to close the blinds

master: there is none

student: where is the server located?

master: it is not needed

student: excuse me but what is "Smart" about all of this?

master: everything.

in this moment, the student was enlightened

[–] HotGarbage@awful.systems 9 points 3 days ago* (last edited 3 days ago) (1 children)
[–] swlabr@awful.systems 8 points 3 days ago

116 million

There’s no way that what they’re buying is worth that much.

[–] BlueMonday1984@awful.systems 7 points 3 days ago (2 children)

In other news, Brian Merchant's going full-time on Blood in the Machine.

Did notice a passage in the announcement which caught my eye:

Meanwhile, the Valley has doubled down on a grow-at-all-costs approach to AI, sinking hundreds of billions into a technology that will automate millions of jobs if it works, might kneecap the economy if it doesn’t, and will coat the internet in slop and misinformation either way.

I'm not sure if it's just me, but it strikes me as telling about how AI's changed the cultural zeitgeist that Merchant's happily presenting automation as a bad thing without getting backlash (at least in this context).

[–] YourNetworkIsHaunted@awful.systems 10 points 3 days ago (4 children)

I mean, I love the idea of automation at a high level. Being able to do more stuff with less human time and energy spent is objectively great! But under our current economic system, where most people rely on selling their time and energy in order to buy things like food and housing, any decrease in demand for that labor is going to have massive negative impacts on the quality of life for a massive share of humanity. I think the one upside of the current crop of generative AI is that it ~~threatens~~ claims to threaten actual white-collar workers in the developed world rather than further immiserating factory workers in whichever poor country has the most permissive labor laws. It's been too easy to push the human costs of our modern technology-driven economy under the proverbial rug, but the middle management graphic design Chadleys of the US and EU are finding it harder to pretend they don't exist because now it's coming for them too.

[–] BlueMonday1984@awful.systems 5 points 2 days ago

On a semi-related note, I suspect we're gonna see a pushback against automation in general at some point, especially in places where "shitty automation" is rampant.

[–] jonhendry@awful.systems 7 points 3 days ago

will automate millions of jobs if it works, might kneecap the economy

will kneecap the economy if it works, too. Because companies certainly aren't going to keep people employed in those millions of jobs.

[–] froztbyte@awful.systems 9 points 3 days ago (1 children)

this article came to mind for something I was looking into, and then on rereading it I just stumbled across this again:

Late one afternoon, as they looked out the window, two airplanes flew past from opposite directions, leaving contrails that crossed in the sky like a giant X right above a set of mountain peaks. Punchy with excitement, they mused about what this might mean, before remembering that Google was headquartered in a place called Mountain View. “Does that mean we should join Google?” Hinton asked. “Or does it mean we shouldn’t?”

[–] froztbyte@awful.systems 8 points 3 days ago (1 children)

But Hinton didn’t want Yu to see his personal humidifying chamber, so every time Yu dropped in for a chat, Hinton turned to his two students, the only other people in his three-person company, and asked them to disassemble and hide the mattress and the ironing board and the wet towels. “This is what vice presidents do,” he told them.

so insanely fucking unserious

[–] jonhendry@awful.systems 7 points 3 days ago (1 children)

Has he never heard of a humidifier? Good lord.

[–] istewart@awful.systems 4 points 2 days ago

I am willing to bet the upshot here is that he has certain very specific ideas about how humidifiers can be improved, and of course will accept nothing less

[–] Soyweiser@awful.systems 8 points 3 days ago* (last edited 3 days ago)

Amazon Prime pulling some AI bullshit with a hint of transphobia, considering the bank robbery in the movie was to pay for surgery for a trans woman (or more likely, not a hint, just the full reason).

[–] o7___o7@awful.systems 6 points 3 days ago* (last edited 3 days ago) (1 children)

What if HPMOR but Harry is Charlie Manson?

--2025, apparently

[–] swlabr@awful.systems 5 points 3 days ago

FR that’s basically this anime in which MC is isekai’d and starts a cult

[–] skillissuer@discuss.tchncs.de 9 points 4 days ago (1 children)
[–] skillissuer@discuss.tchncs.de 8 points 3 days ago* (last edited 3 days ago)

Deep Research is the AI slop of academia — low-quality research-slop built for people that don't really care about quality or substance, and it’s not immediately obvious who it’s for.

it's weird that Ed stops there, since the answer almost writes itself. ludic had a bit about how in companies bigger than three guys in a shed, the people who sign software contracts don't use that software in any normal way;

The idea of going into something knowing about it well enough to make sure the researcher didn't fuck something up is kind of counter to the point of research itself.

conversely, if you have no idea what you are doing, you won't be able to tell if machine-generated noise is in any way relevant or true

The whole point of hiring a researcher is that you can rely on their research, that they're doing work for you that would otherwise take you hours.

but but, this lying machine can output something in minutes, so this bullshit generator obviously makes human researchers obsolete. this is not for academia, because it's utterly unsuitable and google scholar beats it badly anyway; this is not for wide adoption, because it's nowhere near free tier; this is for idea guys who have enough money to shell out for a $whatever monthly subscription and prefer to set a couple hundred dollars on fire instead of hiring a researcher/scientist/contractor. especially keeping in mind that a contractor might tell them something they don't want to hear, but this lmgtfy x lying box (but worse, because it pulls lots of seo spam) won't

OpenAI's next big thing is the ability to generate a report that you would likely not be able to use in any meaningful way anywhere, because while it can browse the web and find things and write a report, it sources things based on what it thinks can confirm its arguments rather than making sure the source material is valid or respectable.

e: this is also an insidious and potent ~~attack surface~~ marketing opportunity against clueless monied people who trust these slop machines for some reason. and it might be exploitable by tuning seo just right

[–] froztbyte@awful.systems 11 points 4 days ago* (last edited 4 days ago) (2 children)

found in the wild, The Tech Barons have a blueprint drawn in crayon

speaking of shillrinivan, anyone heard anything more about cult school after the news that no-one liked bryan's shitty food packs?

[–] skillissuer@discuss.tchncs.de 8 points 3 days ago (1 children)

wait that's it? he wants to "replace" states with (vr) groupchats on blockchain? it can't be this stupid, you must be explaining this wrong (i know, i know, saying it's just that makes it look way more sane than it is)

The basic problem here is that Balaji is remarkably incurious about what states actually do and what they are for.

libertarians are like house cats etc etc

In practice, it's a formula for letting all the wealthy elites within your territorial borders opt out of paying taxes and obeying laws. And he expects governments will be just fine with this because… innovation.

this is some sovereign citizen type shit

[–] froztbyte@awful.systems 7 points 3 days ago (1 children)

yeah shillrinivan's ideas are extremely Statism: Sims Edition

I've also seen essentially ~0 thinking from any of them on how to treat corner cases and all that weird messy human conflict shit. but code is law! rah!

(pretty sure that if his unearned timing-fortunes ever got threatened by some coin contract gap or whatever, he'd instantly be all over getting that shit blocked)

[–] skillissuer@discuss.tchncs.de 9 points 3 days ago (2 children)

code is law, as in, whoever controls the code controls the law. the obvious thing would be that monied founders would control the entire thing, like in urbit. i still want to see how well cyber hornets defend against tank rounds, or who gets to be inside the tank for that matter, or how you put a tank on a blockchain. or how real states make it so that you can have citizenship of only one state, maybe two. there's nothing about any of it there

[–] o7___o7@awful.systems 7 points 3 days ago (1 children)

Some kind of Civ4-ass tech tree lets you get the Internet before replaceable parts or economics.

[–] froztbyte@awful.systems 4 points 3 days ago

or how real states make it so that you can have citizenship of only one state, maybe two. there's nothing about it there

come on we both know it’ll be github badges or something like that

[–] YourNetworkIsHaunted@awful.systems 7 points 4 days ago* (last edited 4 days ago) (3 children)

Having read the whole book, I am now convinced that this omission is not because Srinivasan has a secret plan that the public would object to. The omission, rather, is because Balaji just isn't bright enough to notice.

That's basically the entire problem in a nutshell. We've seen what people will fill that void with and it's "okay but I have power here now and I dare you to tell me I don't" and you know who happens to have lots of power? That's right, it's Balaji's billionaire bros! But this isn't a sinister plan to take over society - that would at least entail some amount of doing what states are for.

Ed:

"Who is really powerful? The billionaire philanthropist, or the journalist who attacks him over his tweets?"

I'm not going to bother looking up which essay or what terrible point it was in service to, but Scooter Skeeter of all people made a much better version of this argument by acknowledging that the other axis of power wasn't "can make someone feel bad through mean tweets" but was instead "can inflict grievous personal violence on the aged billionaires who pay them for protection". I can buy some of these guys actually shooting someone, but the majority of these wannabe digital lordlings are going to end up following one of the many Roman Emperors of the 3rd century and get killed and replaced by their Praetorians.

[–] sc_griffith@awful.systems 7 points 3 days ago* (last edited 3 days ago)

the majority of these wannabe digital lordlings are going to end up following one of the many Roman Emperors of the 3rd century and get killed and replaced by their Praetorians.

this is a possibility lots of the prepper ultra rich are concerned with, yet I don't recall that I've ever heard the tech scummies mention it. they don't realize that their fantasized outcome is essentially identical to the prepper societal breakdown, because they don't think of it primarily as a collapse.

more generally, they seem to consider every event in the most narcissistic terms: outcomes are either extensions of their power and luxury to ever more limitless forms or vicious and unjustified leash jerking. there's a comedy of the idle rich aspect to the complacency and laziness of their dream making. imagine a boot stamping on a face, forever, between rounds at the 9th hole

[–] skillissuer@discuss.tchncs.de 5 points 3 days ago (1 children)

I can buy some of these guys actually shooting someone, but the majority of these wannabe digital lordlings are going to end up following one of the many Roman Emperors of the 3rd century and get killed and replaced by their Praetorians.

i think it'll turn out muchhh less dramatic. look up cryptobros: how many of them died at all, let alone this way? i only recall one, ruja ignatova, a bulgarian scammer whose disappearance might be connected to the local mafia. but everyone else? mcafee committed suicide, but that might be after he did his brain's own weight in bath salts. for some of them their motherfuckery caught up with them and they are in prison (sbf, do kwon), but most of them walk freely and probably don't want to attract too much attention. what might happen, i guess, is that some of them will cheat one another out of money, status, influence, what have you, and the scammed ones will just slide into irrelevance. you know, get a normal job, among normal people, and not raise suspicion

I'm probably being a bit hyperbolic, but I do want to clarify that the descent into violence and musical knife-chairs is what happens if they succeed at replacing or disempowering the State. The worst offenders going to prison and the rest quietly desisting is what happens when the State does something (literally anything, in fact. Tepid and halfhearted enforcement of existing laws was enough to meaningfully slow the rise of crypto) and they fail, but if they were to directly undermine that monopoly on violence I fully expect to see violence turned against them, probably at the hands of whatever agent they expected to use it on their behalf. In my mind this is the most dramatic possible conclusion of their complete lack of understanding of what they're actually trying to do, though it is certainly less likely than my earlier comment implied.

[–] Soyweiser@awful.systems 6 points 4 days ago* (last edited 4 days ago) (2 children)

That’s basically the entire problem in a nutshell.

I think a lot of these people are cunning, aka good at somewhat sociopathic short-term plans and thinking, and they confuse this ability (and their survivorship-biased success) for being good at actual planning (or they just think planning is worthless, after all, move fast and break things (and never think about what you just said)). You don't have to actually have good plans if people think you have charisma/a magical money-making ability (which needs more and more rigging of the casino to keep the "make a lot of risky bets and hope one big win pays for it all" machine running).

Doesn't help that some of them seem to either be on a lot of drugs, or have undiagnosed adhd. Unrelated, Musk wants to go into Fort Knox all of a sudden, because he saw a post on twitter which has convinced him 'they' stole the gold (my point here is that there is no way he was thinking about Knox at all before he randomly came across the tweet, the plan is crayons).

[–] skillissuer@discuss.tchncs.de 8 points 3 days ago* (last edited 3 days ago)

Unrelated, Musk wants to go into Fort Knox all of a sudden

you know, one of better models of schizophrenia we have looks like this: take a rat and put them on a schedule of heroic doses of PCP. after some time, a pattern of symptoms that looks a lot like schizophrenia develops even when off PCP. unlike with amphetamine, this is not only positive symptoms (like delusions and hallucinations) but also negative and cognitive symptoms (like flat affect, lack of motivation, asociality, problems with memory and attention). PCP touches a lot of things, but ketamine touches at least some of the same things that matter in this case (NMDA receptor). this residual effect is easy to notice even by, and among, recreational users of this class of compounds

richest man in the world grows schizo brain as a hobby, pillages government, threatens to destroy Lithuania

[–] YourNetworkIsHaunted@awful.systems 8 points 3 days ago (1 children)

I'm sorry 'they' did what? Everyone knows you can't rob Fort Knox. You have to buy up a significant fraction of the rest of the gold and then detonate a dirty bomb in Fort Knox to reduce the supply and- oh my God bitcoiners learned economics from Goldfinger.

[–] Soyweiser@awful.systems 8 points 3 days ago (3 children)

oh my God

Welcome to the horrible realization of the truth. Everything the right understands comes from entertainment media. That is also why satire doesn't work: you need to have a deeper understanding of the world to understand the themes, otherwise Starship Troopers is just about some hot people shooting bugs.

[–] aio@awful.systems 10 points 3 days ago (2 children)

ok i watched Starship Troopers for the first time this year and i gotta say a whole lot of that movie is in fact hot people shooting bugs

[–] Soyweiser@awful.systems 4 points 2 days ago

Yeah, I reread the book last year (due to all the hot takes about the book in relation to Helldivers), and the movie is a lot better propaganda than the book. (The middle, where they try to justify their world, drags on and on and is filled with strawmen and really weird moments, esp the part where the main character, who isn't the sharpest tool in the shed, is told that he is smart enough to join the officers. You must be this short to enter.)

[–] BigMuffin69@awful.systems 5 points 3 days ago (1 children)

My life for super Earth 🫡

[–] Soyweiser@awful.systems 4 points 2 days ago* (last edited 2 days ago) (2 children)

Prff, like you would be part of the 20% that survives basic training. I know I wouldn't.

(So many people miss this little detail, or the detail that it is cheaper to send a human with a gun down to a planet to arm the nukes (sorry, Hellbombs) than to put a remote detonator on the nukes. I assume you were not one of those people btw, it is just me gushing positively about the satire in the game (it is a good game) and sort of despairing about media literacy/attention spans.)

[–] dgerard@awful.systems 10 points 3 days ago

see also: Yudkowsky has never consumed fiction targeted above middle school

[–] Illuminatus@mstdn.social 7 points 3 days ago (1 children)

@Soyweiser @YourNetworkIsHaunted
@cstross But it's not only that: satire can't reach the fash because they internalise and accept the monstrous. What is satirised to expose its (and their) monstrosity and aberrant values in ridicule to average, still decent and sane people is, for them, "Yes, this is what we want." It's never "if you can't tell it's satire it's bad satire": you can be as "subtle" as a kick in the face and they won't get it, because it <is> what they want.

[–] Soyweiser@awful.systems 5 points 3 days ago

Certainly, for a lot of them it is even worse. See how the neo-nazis love American History X. (How do we stop John Connor from becoming a nazi, seems oddly relevant).

[–] maol@awful.systems 11 points 4 days ago* (last edited 4 days ago) (3 children)

An extremely scary survey: A poster advertising a survey on "Assisted Dying and the Emerging Role of AI"

Edit: have a féach at the survey here

[–] YourNetworkIsHaunted@awful.systems 7 points 3 days ago (1 children)

That was both horrible and also not what I expected. Like, they at least avoid the AI simulacra nonsense where you train an LLM on someone's Facebook page and ask it if they want to die when they end up in a coma or something, but they do ask about what are effectively the suicide booths from Futurama. Can't wait to see what kind of bullshit they try to make from the results!

[–] maol@awful.systems 5 points 3 days ago

They need an option for "very uncomfortable".

[–] zogwarg@awful.systems 9 points 4 days ago* (last edited 4 days ago)

Thank you for completing our survey! Your answer to question 3-b indicates that you are tired of Ursa Minor Beta, which, according to our infallible model, indicates that you must be tired of life. Please enjoy our complimentary lemon-soaked paper napkins while we proceed to bring you to the other side!
