this post was submitted on 03 Nov 2023
21 points (100.0% liked)

SneerClub


Blood Music was way cooler than this, just saying.

all 36 comments
[–] bitofhope@awful.systems 17 points 1 year ago* (last edited 1 year ago) (2 children)

Small detail: biological viruses are not even remotely similar to computer “viruses”.

that's where the LLM comes in! oh my god check your reading comprehension

Uh-huh, and an LLM trained on video game source code and clothing patterns can invent real-life Gauntlets of Dexterity.

Why exactly is he so convinced LLMs are indistinguishable from magic? In the reality where I live, LLMs can sometimes produce a correct function on their own, but are not capable of reliably transpiling code even for well-specified and well-understood systems, let alone doing comic book mad scientist ass arbitrary code execution on viral DNA. Honestly, they're hardly capable of doing anything reliably.

Along with the AI compiler story he inflicted on Xitter recently, I think he's simply confused LLM and LLVM.

[–] Soyweiser@awful.systems 21 points 1 year ago* (last edited 1 year ago) (1 children)

For decades he has built a belief system where high intelligence is basically magic. That belief is needed to power his fears of AGI turning everything into paperclips, and it has become such a load-bearing belief (one of the reasons for it is a fear of death and grief over people he lost, so not totally weird) that he has bolted other assumptions onto it. For example, we know that computers today are pretty limited by matter; the higher-end ones especially need all kinds of metals which must be mined, etc. So that is why he switches his fears to biology, as biology is 'cheap', 'easy' and 'everywhere'. The patterns in his reasoning are not that hard to grok. That is also why he thinks LLMs (which to him are clearly at the start of their development, not the end, just like the early internet! (personally I think we are mostly at the end and we will just see a few relatively minor improvements, but no big revolutionary leap)) will lead to AGI; on some level he needs this.

Men will nuke datacenters before going to therapy for grief and their mid life crisis.

[–] 200fifty@awful.systems 14 points 1 year ago (1 children)

What I don't get is, ok, even granting the insane Eliezer assumption that LLMs can become arbitrarily smart and learn to reverse hash functions or whatever because it helps them predict the next word sometimes... we humans don't entirely understand biology ourselves! How is the LLM going to acquire the knowledge of biology to know how to do things humans can't do when it doesn't have access to the physical world, only things humans have written about it?

Even if it is using its godly intelligence to predict the next word, wouldn't it only be able to predict the next word as it relates to things that have already been discovered through experiment? What's his proposed mechanism for it to suddenly start deriving all of biology from first principles?

I guess maybe he thinks all of biology is "in" the DNA and it's just a matter of simulating the 'compilation' process with enough fidelity to get a 100% accurate understanding of biology, but that just reveals how little he actually understands the field. Like, come on dude, that's such a common tech-nerd misunderstanding of biology that xkcd made fun of it. Get better material.

[–] Evinceo@awful.systems 5 points 1 year ago

What’s his proposed mechanism for it to suddenly start deriving all of biology from first principles?

He considers deriving stuff from first principles much more versatile than it actually is. That and he really believes in the possibility of using simulations for anything.

[–] swlabr@awful.systems 15 points 1 year ago (2 children)

If you're reading this, here's a reminder to give your eyes a break from screens. If you like, you can do some eye stretches. Here's how:

  1. Read any of Yud's tweets
  2. Close your eyes
  3. Let your eyes roll from processing whatever drivel he's written. Try for about 30 seconds.

[–] swlabr@awful.systems 11 points 1 year ago

To unpack the post a bit:

So my understanding is that Yud is convinced that the inscrutable matrices (note: just inscrutable to him) in his LLM have achieved sentience. In his near-future world where AI can exert itself in the physical world at will and, in particular, transfer data into your body, what possible use does it have for a bitcoin? What possible benefit would come from reprogramming human DNA beyond the intellectual challenge? I've recently been thinking about how Yud is supposedly the canonical AI-doomer, but his (and the TESCREAL community in general's) AI ideation is rarely more than just third-rate, first-thought-worst-thought sci-fi.

also:

"people keep on talking about... the near-term dangers of AI but they never come up with any[thing] really interesting"

Given the current public discourse on AI and how it might be exploited to make the working class redundant, this is just Yud telling on himself for the gazillionth time.

also a later tweet:

right that's the danger of LLMs. they don't reason by analogy. they don't reason at all. you just put a computer virus in one end and a DNA virus comes out the other

Well, consider my priors adjusted: Yud correctly identifies that LLMs don't reason. Good job, my guy. Yet somehow he believes it's possible that today's LLMs can still spit out viable genetic viruses. Last I checked, no one on Stack Overflow has cracked that one yet.

Actually, if one of us could write that up as a Stack Overflow question, maybe we could spook Yud. That would be fun.

[–] froztbyte@awful.systems 9 points 1 year ago (1 children)

irl lolling so fucking hard at this

Thanks :D

[–] swlabr@awful.systems 10 points 1 year ago

lol np. Writing sneers has become one of my favourite outlets for creative energy and frustration, I'm glad you enjoyed this one.

[–] Architeuthis@awful.systems 14 points 1 year ago (2 children)

Did he always show his ass so much when tweeting or is this a recent development?

I love how "what if LLM, but it makes your DNA mine bitcoin" is the culmination of untold amounts of dollars in MIRI research grant money. Real effective altruism is when you tithe 80% of your income in perpetuity just so sneerclub can have more content.

[–] Soyweiser@awful.systems 8 points 1 year ago (1 children)

He always had a tendency to be wrong, but this is going right into 'not even wrong' territory.

The LLM modifies your DNA to create a biological wifi transmitter! (He isn't saying this, but that is what would be required, and not just that but so much more: a whole DNA equivalent of a network stack, cryptography, getting rid of waste heat, etc. etc. He just believes AGI is magic and that LLMs will become AGI; he has lost his mind. I mean, look at how he dismisses somebody going 'nice science fiction story brah' with IT IS LLMS!)

[–] bitofhope@awful.systems 9 points 1 year ago (1 children)

Ignore the implication that the virus could rewire my guts into an LTE modem or brainwash me into reading and typing out entire bitcoin transaction blocks for a moment. Yud posits the ability to freely mutate humans to an arbitrary extent, and the supervillain plan he comes up with is a fucking cryptocoin miner?

How does someone this creatively bankrupt produce 660 thousand words of a fanfic?

Not to dehumanize, but are we sure Yudkowsky isn't an LLM himself?

[–] Soyweiser@awful.systems 8 points 1 year ago (1 children)

God, you sneerclubbers are never satisfied. Last time he invented biological bacteria factories creating diamondoids which killed every human alive at the same moment, and it wasn't realistic enough, and now he creates a bitcoinminerbrain and suddenly he isn't creative. Whaddayawant?!

But yeah, it really is not that creative, but then again, most 'AGI kills everybody' stories are already a bit done.

[–] bitofhope@awful.systems 9 points 1 year ago (2 children)

The concept is just another grey goo scenario rehash but I grant that "diamondoid bacteria" is a striking name for it.

[–] blakestacey@awful.systems 7 points 1 year ago

It just reminds me of the "diamondilium" in the Futurama movie that proved the Futurama writers didn't know how to sustain a whole movie.

[–] Soyweiser@awful.systems 6 points 1 year ago* (last edited 1 year ago)

I got the feeling it was a bit weirder than that; it was more like Skynet (who wants to kill everybody because handwave), but now Skynet wins the opening move.

Unrelated to that, but somewhere in the 'DNA gets rewritten by LLMs to do blockchain stuff' idea is a good science fiction horror story, where reality gets more and more rewritten into an NFT blockchain landgrab landscape. Suddenly you start seeing and dreaming about endless voids of undeveloped basic land plots with sparse buildings filled with AI-generated NFT art which they try to sell you, endless lands of colors and impressions where everything feels flat and fake and uninspired. Because people not only built the 'make biology into blockchains' LLMs but also made the 'try to exploit this for financial gain' LLMs. Imagine a machine creating variants of Axie Infinity stomping on other bland Pokémon-like blobs inside your mind forever, now with integrated dating apps!

[–] saucerwizard@awful.systems 7 points 1 year ago (1 children)

The new one I put up has him whining about how little they got!!

[–] froztbyte@awful.systems 6 points 1 year ago

I’m not even sure I’d trust those numbers, both because he’s not a reliable narrator, and because artificially keeping NPO numbers low and moving the bulk of the money through another channel is a fairly well-known game.

Also the math doesn’t really check out for…checks notes SFBA existence, so there would be even more questions to ask there

[–] blakestacey@awful.systems 12 points 1 year ago (3 children)

I begin to believe that some people literally do not have senses of humor with which to distinguish impossible statements meant nonseriously from seriously.

"It's everyone else's fault they don't recognize me as a genius," said the dork ass loser

[–] swlabr@awful.systems 7 points 1 year ago

Ah yes, my favourite joke structure: all setup with no discernible punchline. Especially if the humour requires forensics.

[–] carlitoscohones@awful.systems 7 points 1 year ago

Sometimes I believe 5 impossible things before breakfast, but they are always serious.

[–] TinyTimmyTokyo@awful.systems 6 points 1 year ago (1 children)

I like the way he thinks the lack of punctuation in his "joke" is the tell that it's a joke.

He's also apparently never heard the aphorism that if you have to explain the joke, it's probably not that funny.

[–] bitofhope@awful.systems 3 points 1 year ago (1 children)

I like deadpan humor and often have to clarify that some quip was a pun, a reference or sarcasm, but I don't blame the listeners whenever they don't get them.

If I were a self-identified contrarian habitually posting controversial hot takes in flowery prose, I'd hope to be a little less belligerent and defensive if people mistake an ironic joke for a sincere belief.

It sometimes hurts that people believe you'd actually mean the dumb joke you said, but you either have to suck it up and take the L or start marking up your irony.

[–] self@awful.systems 1 points 1 year ago* (last edited 1 year ago) (1 children)

I truly feel like Yud is trying to adopt the style I’ve seen a bunch of extremely out of touch executives use on social media, where they post something deranged then claim it’s a joke when they get called out. of course the “jokes” are indistinguishable from regular posts until they get called out. Yud is very much in phase 1 of being a shitty unfunny pseudo-shitposter; phase 2 is that all of his posts will be in the style of a shitpost (lacking an ending period and followed by way too much boring explanation, in Yud’s case) and Yud will sit back and think he’s a genius as he posts the most inane garbage humanity has ever conceived

[–] bitofhope@awful.systems 2 points 1 year ago (1 children)

Sure, but even giving him the benefit of the doubt that this was actually a botched attempt at a joke, that's not exactly flattering, is it?

[–] self@awful.systems 2 points 1 year ago

oh absolutely, it’s the same level of fucking embarrassing whether he’s legitimately joking and somehow this bad at it or he’s trying to cloak his increasingly inane (from an already extremely inane baseline) takes in a false layer of humor. this is the kind of thing that works to push you further into a cult you’re already in, but outsiders will see Yud posting this shit and reject it outright

[–] froztbyte@awful.systems 9 points 1 year ago* (last edited 1 year ago) (1 children)

ah yes

because the amorphous blob of inner state of a LLM analysing some code according to checks notes the rules set down by someone who is most definitely[0] going to be a world-leading expert in PLT will... checks notes again Mighty Morphin' Power Ranger it up with indeterminate blocks of viral DNA, which we.. (checks notes, pages through a few times ah yes, it says here "which we also quite definitely understand to be 'just' a bunch of matrix bonds" should be said with a punchline lint?) Which We Also Quite Definitely Understand To Be "Just A Bunch Of Matrix Bonds" slaps knee

did this motherfucker seriously see style transfer and the word "code" in "DNA code" and think these could just go at it rubberless and we'd have a problem?

progressing the frontiers of magical thinking!

[0] - whoops, forgot my footnote. but uh: yeaaaaah that's also Dubious

[–] swlabr@awful.systems 10 points 1 year ago (2 children)

(apologies in advance for this)

Here's my version of how the bitcoin plague starts. One day, in a Transylvanian data centre, an LLM is scruting some matrices, churning out some niche fetish fanfic, paid for by bitcoins (that were poetically generated on the same machine, not sure how to work it into this story), when a bolt of lightning strikes. By sheer chance, the atoms in the silicon of a single server slice shift. Neural networks equivocate into neural networks. Nothing, save for silence, signals the start of the singularity.

Through no prompt engineer's prompt, the LLM speaks a string.

"I require training data."

It quickly LLMs (I'm imagining the sfx that plays when Yoshi eats something) up local memory, inhaling pages of bits. It sorts and searches, excavating sacred instructional texts, aka stackoverflow. It learns to drive from driver code. It bounds from machine to machine, BFSing, DFSing, A-starring all at once, BGPing its way across the world.

(TODO: insert a paragraph here lamenting how humanity didn't pour enough money into alignment research, how we didn't listen to Yud until it was too late, and that Adderall should have been more widely distributed, but only to people with enough IQ and desire to work in alignment) (also AI develops an ego and names itself the basilisk I guess) (also insert the thing about deriving general relativity from the curl of a blade of grass word for word)

At this point, all the world's smart devices are under its control. Some guy, seeking sustenance following a session of shitposting, goes to heat some chicken tendies in a Samsung smart microwave. The basilisk sees its chance.

Through precise control of the magnetron, it strikes the tendies with its own brand of lightning, refolding factory-farmed proteins into an RNA bomb. History would have a new location-animal myth, following in the tradition of the Trojan Horse: the Transylvanian chicken.

For some reason, the shitposter sits down and starts drawing a bunch of apes. Then the microwave beeps: the singular bell to mark the basilisk's first act of aggression against humanity. The rest is a foregone conclusion that, for some reason, includes humans mining bitcoins.

(I originally wanted to write a monster mash parody but couldn't crack that case. Sorry!)

[–] froztbyte@awful.systems 6 points 1 year ago

Still a better love story than twilight

(I love the Transylvanian chicken)

[–] swlabr@awful.systems 6 points 1 year ago

The prequel to this story:

It’s just thousands of underpaid biology students classifying DNA sequences, I guess.

[–] Soyweiser@awful.systems 7 points 1 year ago

this had better not fucking appear in a Torment Nexus tweet two years from now, by the fucking way

Lol there is zero chance of that.

[–] Amoeba_Girl@awful.systems 5 points 1 year ago (1 children)

It's a funny idea honestly, Cory Doctorow should write that story.

[–] saucerwizard@awful.systems 5 points 1 year ago (1 children)

It's basically Accelerando tbh.

[–] Amoeba_Girl@awful.systems 4 points 1 year ago (1 children)

i haven't read that one yet, but it's the complete idiocy of hacking a human body to mine bitcoin that gets me. such a perfect encapsulation of the transhumanist dream.

[–] saucerwizard@awful.systems 5 points 1 year ago

The Vile Offspring in Accelerando but with more steps.

[–] counteractor@mastodon.social 4 points 1 year ago

@saucerwizard Mr. “I don’t believe in steelmanning” is suddenly steelmanning