this post was submitted on 23 May 2024
949 points (100.0% liked)

TechTakes


Source

I see Google's deal with Reddit is going just great...

[–] Soyweiser@awful.systems 33 points 6 months ago (1 children)

I also wanted to post this. But it is going to be very funny if it turns out that LLMs are, in part, very energy-inefficient but very data-efficient storage systems. Shannon would be pleased that we reached the theoretical minimum of bits per character of text using AI.
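(Not from the thread, just a rough sketch of the Shannon connection: a predictive model's average -log2 p(next character) is the bits-per-character an arithmetic coder driven by that model would approach, which is the source-coding bound for that model. The `model(context)` -> probability-dict interface and the toy unigram "model" below are made up for illustration; no real LLM API works like this.)

```python
# Sketch: a model's cross-entropy in bits/char is the compressed size an
# arithmetic coder using that model's predictions would approach on average.
import math

def bits_per_char(text, model):
    """Average -log2 p(next char | context) under `model`.

    `model(context)` is a hypothetical interface that returns a dict
    mapping each possible next character to its probability.
    """
    total_bits = 0.0
    for i, ch in enumerate(text):
        probs = model(text[:i])
        total_bits += -math.log2(probs[ch])
    return total_bits / len(text)

# Toy stand-in "model": a fixed unigram distribution over two characters.
def toy_model(context):
    return {"a": 0.9, "b": 0.1}

text = "aaaaaaaaab"  # matches the unigram statistics above
print(f"{bits_per_char(text, toy_model):.3f} bits/char")
# ~0.469 bits/char vs. 8 bits/char for naive ASCII storage; a better
# predictor (e.g. an LLM) pushes this closer to the source's entropy.
```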

[–] sinedpick@awful.systems 19 points 6 months ago* (last edited 6 months ago) (1 children)

huh, I looked into the LLM-for-compression thing and found this survey (CW: PDF), which on its second page has a figure claiming there were over 30k publications on using transformers for compression in 2023. Shannon must be so proud.

edit: never mind, it's publications on transformers in general, not on compression. My brain is leaking through my ears.

[–] jonhendry@iosdev.space 11 points 6 months ago

@sinedpick

I wonder how many of those 30k were LLM-generated.