this post was submitted on 05 Apr 2024
29 points (100.0% liked)

TechTakes

[–] Architeuthis@awful.systems 6 points 7 months ago* (last edited 7 months ago)

Yet AI researcher Pablo Villalobos told the Journal that he believes that GPT-5 (OpenAI's next model) will require at least five times the training data of GPT-4.

I tried finding the non-layman's version of the reasoning for this assertion, and it appears to be a very black-box assessment based on historical trends and other similarly abstracted attempts at modelling dataset size vs. model size.
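
For what it's worth, the kind of arithmetic these projections rest on can be sketched in a few lines. This is a minimal, purely illustrative sketch and not EpochAI's actual methodology: the GPT-4 token count below is a rumoured, unconfirmed figure, and the tokens-per-parameter ratio is just the Chinchilla (Hoffmann et al. 2022) rule of thumb.

```python
# Minimal sketch of black-box scaling arithmetic -- illustrative only,
# not EpochAI's actual model. Numbers marked "rumoured" are unconfirmed.

TOKENS_PER_PARAM = 20            # Chinchilla rule of thumb: ~20 compute-optimal tokens per parameter

gpt4_tokens = 13e12              # rumoured GPT-4 training set size, in tokens
gpt5_tokens = 5 * gpt4_tokens    # "at least five times the training data"

# What model size would that much data be compute-optimal for?
implied_params = gpt5_tokens / TOKENS_PER_PARAM

print(f"Projected GPT-5 training tokens: {gpt5_tokens:.2e}")
print(f"Chinchilla-optimal parameter count for that much data: {implied_params:.2e}")
```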

This is apparently EpochAI's whole thing, not that there's necessarily anything wrong with that. I was just hoping for some insight into dataset size vs. architecture, and maybe the gossip on what's going on with the next batch of LLMs, like how it eventually came out that gpt4.x is mostly several gpt3.xs in a trench coat.
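
For anyone who hasn't followed the trench-coat rumour: the claimed arrangement is a sparse mixture-of-experts layer, where a router sends each token to a few of several independently weighted sub-networks. A generic sketch of the top-k routing idea (the standard MoE mechanism, not OpenAI's undisclosed architecture) looks like this:

```python
# Generic sparse mixture-of-experts sketch ("several models in a trench coat").
# Illustrative only; sizes and weights are arbitrary random values.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_experts, top_k = 16, 4, 2

# Each "expert" is just an independent weight matrix here.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector to its top-k experts and mix their outputs."""
    logits = x @ router                       # score each expert for this token
    top = np.argsort(logits)[-top_k:]         # indices of the k highest-scoring experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                  # softmax over the chosen experts only
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.normal(size=d_model)
print(moe_forward(token).shape)               # (16,) -- same shape as the input
```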