
[–] hoghammertroll@lemm.ee 3 points 2 months ago* (last edited 2 months ago)

> You could also buy cheaper ‘lesser’ parts if you’re not interested in playing the top of the top new games.

This is a great point as well. I'm not hugely into gaming myself; the 3070 was mostly a move to future-proof my build for the few games I do play or may want to play. I actually use it more for transcoding my media collection and AI-upscaling some of the older stuff that isn't available anywhere above 480p. But for gaming, this thing will probably get me by for another 7-10 years.
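(For anyone curious what the transcoding side looks like in practice, it's basically scripted ffmpeg with the card's NVENC encoder. A minimal sketch along those lines is below; the folder paths and quality settings are placeholders, not my actual setup.)

```python
# Minimal sketch: batch re-encode videos to HEVC on an NVENC-capable GPU.
# Paths and quality settings are illustrative placeholders.
import subprocess
from pathlib import Path

SRC = Path("D:/media/incoming")    # hypothetical source folder
DST = Path("D:/media/transcoded")  # hypothetical output folder

def transcode(src: Path, dst: Path) -> None:
    """Re-encode one video to HEVC on the GPU, copying audio untouched."""
    cmd = [
        "ffmpeg", "-y",
        "-hwaccel", "cuda",        # decode on the GPU where possible
        "-i", str(src),
        "-c:v", "hevc_nvenc",      # NVENC hardware HEVC encoder
        "-preset", "p5",           # balanced speed/quality preset
        "-cq", "23",               # constant-quality target
        "-c:a", "copy",            # leave audio streams as-is
        str(dst),
    ]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    DST.mkdir(parents=True, exist_ok=True)
    for f in SRC.glob("*.mkv"):
        transcode(f, DST / f.name)
```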

> My 12 year old laptop can run everything I play still with no problem

That's exactly why I don't understand the general pushback against the idea of "future-proofing" builds in the PC gaming community.

Like, I get it - even the best computer today isn't going to run the latest and greatest triple-A titles at 8K (or whatever tomorrow's gold-standard resolution ends up being) on ultra settings at 240fps 5-10 years from now. I also understand that it isn't wise to drop thousands on today's top-of-the-line hardware under the premise that it'll be the last system you'll ever need.

But unless there's some major breakthrough that renders today's hardware completely obsolete and pushes the whole market toward designing for far more powerful (or fundamentally different) machines - which, granted, is technically possible - or your goal is to run the latest and greatest at max settings, ridiculous frame rates, and ridiculous resolutions all the time, a computer built with decent parts today is still going to run decently for years to come. And you can typically upgrade piecemeal if necessary, at least with desktops, especially if you're starting with 'lesser' components.

I've been sitting on a new build (7800X3D/6700XT/32GB RAM) for a few months now that's set to replace my current HTPC, but I haven't gotten around to putting it together because I've been working on some software to 1-click export all my software settings (Windows debloat tweaks + all the program settings I've manually configured over the years) so I can do a fresh install of Windows instead of just cloning the boot drive like last time. Plus I'm lazy/distracted/busy with other shit.
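(The export tool is still a work in progress, but the core idea is roughly what's sketched below: dump the installed-package list with winget and copy a hand-picked set of config paths into one folder I can restore from after the reinstall. The app names and paths here are placeholders, not my actual list.)

```python
# Rough sketch of the settings-export idea: write the winget package list to
# a manifest and copy known config files/folders into one export directory.
# App names and paths below are placeholders, not a real setup.
import shutil
import subprocess
from pathlib import Path

EXPORT_DIR = Path("C:/backup/settings-export")  # hypothetical destination

# Hand-maintained map of app configs worth carrying to a fresh install.
CONFIG_PATHS = {
    "mpv": Path.home() / "AppData/Roaming/mpv",
    "vscode": Path.home() / "AppData/Roaming/Code/User/settings.json",
}

def export_packages() -> None:
    """Write the list of winget-installed packages to a JSON manifest."""
    subprocess.run(
        ["winget", "export", "-o", str(EXPORT_DIR / "packages.json")],
        check=True,
    )

def copy_configs() -> None:
    """Copy each known config file or folder into the export directory."""
    for name, src in CONFIG_PATHS.items():
        dst = EXPORT_DIR / name
        if src.is_dir():
            shutil.copytree(src, dst, dirs_exist_ok=True)
        elif src.is_file():
            dst.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, dst / src.name)

if __name__ == "__main__":
    EXPORT_DIR.mkdir(parents=True, exist_ok=True)
    export_packages()
    copy_configs()
```

Restoring after the fresh install would basically be the reverse: `winget import` on the manifest, then copying the configs back into place.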

But the HTPC it's replacing? A 10-year-old Optiplex 9020 with a 4th-gen Intel CPU, a GTX 1650, and 16GB of RAM. It runs well enough for what my family plays that the upgrade hasn't felt urgent (thankfully), and that's with my kids using it as (one of) their main gaming machines. If we were more hardcore about gaming, or just snobbish about graphics settings and frame rates, maybe the upgrade would have become necessary sooner, but saying "there's no such thing as futureproofing a PC" is just the flip side of "spend a small fortune and you'll never have to upgrade again!!!1".