this post was submitted on 30 Jan 2024

ChatGPT

amzd@kbin.social · 4 points · 9 months ago

If you have a GPU in your PC, it's almost always faster to just run your own LLM locally, and you won't have this issue. Search for Ollama.
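
For anyone who wants to try it: Ollama also exposes a local HTTP API once the server is running, so you can script against it. Here's a minimal sketch in Python (this assumes Ollama is installed and serving on its default port 11434, and that you've already pulled a model, e.g. with `ollama pull llama3` — swap in whichever model you actually use):

```python
# Minimal sketch: query a locally running Ollama server.
# Assumes the default endpoint http://localhost:11434 and a
# pulled model named "llama3" (replace with your own model).
import requests

resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "Explain in one sentence why local inference avoids rate limits.",
        "stream": False,  # return the full reply as a single JSON object
    },
)
resp.raise_for_status()
print(resp.json()["response"])
```

Or skip the scripting entirely and just run `ollama run llama3` in a terminal for an interactive chat.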