[–] pennomi@lemmy.world 3 points 1 month ago (2 children)

A lot of the smaller LLMs don’t require a GPU at all; they run just fine on a normal consumer CPU.
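
For context, a minimal sketch of what CPU-only inference can look like, assuming the llama-cpp-python bindings and a small quantized GGUF model (the model path and thread count are placeholders, not recommendations):

```python
# Sketch: run a small quantized LLM entirely on the CPU with llama-cpp-python.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/small-model-q4_k_m.gguf",  # placeholder path to a small quantized model
    n_gpu_layers=0,  # offload zero layers: every layer stays on the CPU
    n_threads=8,     # tune to your physical core count
)

out = llm("Explain what a token is in one sentence.", max_tokens=64)
print(out["choices"][0]["text"])
```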

[–] copygirl@lemmy.blahaj.zone 3 points 1 month ago (1 children)

Wouldn't running on a CPU (while possible) make it less energy efficient, though?

[–] pennomi@lemmy.world 3 points 1 month ago

It depends. A lot of LLM inference is memory-bandwidth-constrained rather than compute-constrained. If the model doesn’t fit in VRAM and you’re constantly thrashing GPU memory, it can end up both slower and less efficient than just staying on the CPU.
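
A rough back-of-the-envelope way to see the memory bound, assuming each generated token has to stream roughly the whole model through memory once (the bandwidth and model-size numbers below are illustrative, not measurements):

```python
# Illustrative only: decode speed is roughly capped at memory bandwidth / model size.
def max_tokens_per_second(model_size_gb: float, bandwidth_gb_s: float) -> float:
    return bandwidth_gb_s / model_size_gb

model_gb = 4.0  # e.g. a ~7B-parameter model quantized to ~4 bits (assumed)
print(f"CPU, dual-channel DDR5 (~80 GB/s): ~{max_tokens_per_second(model_gb, 80):.0f} tok/s")
print(f"GPU, consumer card (~500 GB/s):    ~{max_tokens_per_second(model_gb, 500):.0f} tok/s")
```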

[–] DavidGarcia@feddit.nl 1 point 1 month ago

Yeah, but around 10x slower, at speeds that just don’t work for many use cases. And when you compare energy consumption per token, there isn’t much difference.
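
Treating that claim as simple arithmetic, energy per token is power draw divided by token rate; with assumed (not measured) figures it’s easy to see why the gap can be small even when the CPU is much slower:

```python
# Hypothetical numbers to illustrate the energy-per-token comparison; not benchmarks.
def joules_per_token(power_watts: float, tokens_per_second: float) -> float:
    return power_watts / tokens_per_second

print(f"CPU (~65 W, ~10 tok/s):  {joules_per_token(65, 10):.1f} J/token")
print(f"GPU (~300 W, ~50 tok/s): {joules_per_token(300, 50):.1f} J/token")
```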