GrindingGears@lemmy.ca 1 points 1 month ago (last edited 1 month ago)

Not very many people had a dedicated GPU in the 90s and 2000s. And there's no way the failure rate was higher; not even LimeWire could melt down the family PC back then. It sure gave it the college try, but it was usually fixable. The biggest failures, bar none, were hard drives or media drives.

Jimmycakes@lemmy.world 2 points 1 month ago

We all did; they used to cost like 60 bucks.

TacoSocks@infosec.pub 2 points 1 month ago

Dedicated GPUs were pretty common in the 2000s; they were required for most games, unlike the 90s, which were an unstandardized wild west. The failure rate had to be higher; I know I had three cards die with less than two years' use on each in the 2000s. Cases back then had terrible airflow, and graphics demands jumped quickly.

Rekall_Incorporated@lemm.ee 1 points 1 month ago

I was referring to PC components in general.