this post was submitted on 07 Jan 2025
157 points (95.9% liked)

[–] KamikazeRusher@lemm.ee 68 points 2 days ago (5 children)

Maybe I’m stuck in the last decade, but these prices seem insane. I know we’ve yet to see what a 5050 (lol) or 5060 will be capable of, or at what price. But launching at $549 for your lowest card means a significant share of the consumer base won’t be able to afford any of these.

[–] Strider@lemmy.world 1 points 11 hours ago

Don't forget to mention the huge wattage.

To me, more performance means more (or at least the same) fps at the same power draw.

[–] MDCCCLV@lemmy.ca 3 points 1 day ago (1 children)

You have to keep inflation in mind. $550 today would be about $450 in 2019 dollars.
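
For the curious, here's a rough sketch of that adjustment (the CPI values are approximate annual averages I'm assuming, not exact BLS figures):

```python
# Rough sketch: deflating a current price into 2019 dollars using CPI.
# The CPI values below are approximate annual averages (assumed, not exact).

CPI_2019 = 255.7  # ~2019 annual average CPI-U
CPI_2024 = 313.7  # ~2024 annual average CPI-U

def to_2019_dollars(price_now: float) -> float:
    """Deflate a current price to 2019 purchasing power."""
    return price_now * CPI_2019 / CPI_2024

print(round(to_2019_dollars(550)))  # -> 448, i.e. roughly $450
```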

[–] KamikazeRusher@lemm.ee 1 points 21 hours ago

Yeah, I keep forgetting how much time has passed.

Bought my first GPU, an R9 Fury X, for MSRP when it launched. The R9 300 series and GTX 900 series seemed fairly priced then (aside from the Titan X). Bought another for Crossfire and mining, holding on until I upgraded to a 7800 XT.

Comparing prices, everything but the 5090 lands within about $150 of its inflation-adjusted predecessor. The 5090 is stupid expensive. A $150 increase over a 10-year period probably isn’t that bad.

I’m still gonna complain about it and embrace my inner “old man yells at prices” though.

[–] Stovetop@lemmy.world 28 points 2 days ago* (last edited 2 days ago) (2 children)

Sadly I think this is the new normal. You could buy a decent GPU, or you could buy an entire game console. Unless you have some other reason to need a strong PC, it just doesn't seem worth the investment.

At least Intel is trying to keep prices low, until they either catch on (in which case they'll raise prices to match) or fade out and leave everyone with unsupported hardware.

[–] MDCCCLV@lemmy.ca 2 points 1 day ago

As always, buying a used previous gen flagship is the best value.

[–] GoodEye8@lemm.ee 19 points 2 days ago (1 children)

Actually AMD has said they're ditching their high end options and will focus on budget and midrange cards instead. AMD has also promised better ray-tracing performance (compared to their older cards), so I don't think this will be the new norm if AMD prices its cards competitively with Intel. The high-end cards will stay overpriced, since the target audience apparently doesn't care that they're paying a shitton of money. But budget and midrange options might slip away from Nvidia and get cheaper, especially if the upscaler crutch breaks and devs have to start actually optimizing their games.

[–] moody@lemmings.world 12 points 2 days ago (1 children)

Actually AMD has said they’re ditching their high end options

Which means there's no more competition in the high-end range. AMD was lagging behind Nvidia in terms of pure performance, but the price/performance ratio was better. Now they've given up a segment of the market, and consumers lose out in the process.

[–] GoodEye8@lemm.ee 19 points 2 days ago (1 children)

The high end crowd showed there's no price competition, only performance competition, and they're willing to pay whatever it takes to get the latest and greatest. Nvidia isn't putting a $2k price tag on the top-of-the-line card because it's worth that much; they're putting it there because they know the high end crowd will buy it anyway. The high end crowd has caused this situation.

You call that a loss for consumers; I'd say it's a positive. High end cards make up maybe 15% of the market (and I'm probably being generous here). AMD dropping the high end to focus on mid-range and budget cards is much more beneficial for most users, since budget and mid-range cards are what the majority of PC users actually buy. If mid-range and budget cards are affordable, that's worth far more to most people than high end cards being "affordable".

[–] moody@lemmings.world 7 points 2 days ago (1 children)

But they've been selling mid-range and budget GPUs all this time. They're not adding to the existing competition there, because they already have a share of that market. What they're doing is pulling out of a segment where there was (a bit of) competition, leaving a monopoly behind. If they do that, we can only hope that Intel puts out high-end GPUs to compete in that market, otherwise it's Nvidia or nothing.

Nvidia already had the biggest share of the high-end market, but now they're the only player.

[–] GoodEye8@lemm.ee 8 points 2 days ago (1 children)

It's already Nvidia or nothing. There's no point fighting Nvidia in the high end, because unless you can beat them on performance, there's no winning there. People who buy high end cards don't care about a slightly worse, slightly cheaper card; they've already chosen to pay a premium price for a premium product. They want the best performance, not the best bang for the buck. The people who want the most bang for the buck at the high end are a minority of a minority.

On the other hand, by dropping high end cards AMD can put all of its focus into making its budget and mid-range cards better, instead of diverting some of it to high end cards that won't sell anyway. That increases competition in the budget and mid-range segments, and mid-range absolutely needs stronger competition from AMD, because Nvidia is slowly killing mid-range cards as well.

[–] Naz@sh.itjust.works 1 points 1 day ago (1 children)

TIL, I'm a minority of a minority.

Overclocked an $800 AMD 7900 XTX to 3.4 GHz core with a +15% overvolt (1.35V), for a total power draw of 470W at 86°C hotspot temp under 100% fan duty cycle.

Matches the 3DMark Time Spy score of an RTX 4090D almost to the number.

63 FPS @ 1440p Ray Tracing: Ultra (Path Tracing On) in CP2077

[–] GoodEye8@lemm.ee 1 points 1 day ago

The Steam hardware survey puts the 4090 at 1.16% of cards and the 7900 XTX at 0.54%. Looking at just those two cards, the 7900 XTX makes up about a third of them (quick arithmetic below). So yeah, you are a minority of a minority.

As for the number jargon: I'm not exactly sure what you're trying to prove, but you're comparing an overclocked card to a stock card, and if it's matching the 4090D then it's not actually matching the 4090. The 4090D is weaker than the 4090; depending on the benchmark, it ranges from 5% to 30% weaker. If you were trying to prove that AMD cards can be as good as Nvidia cards, you've instead shown that even with overclocking, AMD's top-of-the-line card can't beat Nvidia's stock top-of-the-line card.
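
For what it's worth, a quick back-of-the-envelope check of that "about a third" figure, using the survey percentages quoted above:

```python
# Share of the 7900 XTX among just the 4090 and 7900 XTX,
# using the Steam hardware survey percentages quoted above.
share_4090 = 1.16     # % of surveyed cards
share_7900xtx = 0.54  # % of surveyed cards

ratio = share_7900xtx / (share_4090 + share_7900xtx)
print(f"{ratio:.1%}")  # -> 31.8%, i.e. about a third
```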

[–] simple@lemm.ee 21 points 2 days ago (2 children)

They'll sell out anyways due to lack of good competition. Intel is getting there but still has driver issues; AMD hasn't announced their GPU prices yet, but their entire strategy is just to follow Nvidia and undercut by 10% or something.

[–] sturmblast@lemmy.world 2 points 1 day ago (1 children)
[–] SaltySalamander@fedia.io 1 points 20 hours ago

AMD hasn't been truly competitive with nVidia in quite a long time.

[–] TonyOstrich@lemmy.world 1 points 2 days ago (4 children)

Weird completely unrelated question. Do you have any idea why you write "Anyway" as "Anyways"?

It's not just you, it's a lot of people, but unlike most grammar/word modifications it doesn't really make sense to me. Most of the time the modification shortens the word rather than lengthening it. I could be wrong, but I don't remember people writing or saying "anyway" with an added "s" in any way but ironically 10-15 years ago, and I'm curious where it may be coming from.

[–] emeralddawn45@discuss.tchncs.de 1 points 20 hours ago (1 children)

https://grammarist.com/usage/anyways/

Although considered informal, anyways is not wrong. In fact, there is much precedent in English for the adverbial -s suffix, which was common in Old and Middle English and survives today in words such as towards, once, always, and unawares. But while these words survive from a period of English in which the adverbial -s was common, anyways is a modern construction (though it is now several centuries old).

[–] TonyOstrich@lemmy.world 1 points 19 hours ago

Schrödinger's word. Both new and old, lol

[–] Blisterexe@lemmy.zip 4 points 1 day ago

I also write it as "anyways", and so does everyone I know. I think it's a regional thing.

[–] simple@lemm.ee 3 points 2 days ago (1 children)

I guess I'm used to saying it since I spent a long time not knowing it was the wrong pronunciation.

[–] TonyOstrich@lemmy.world 1 points 2 days ago

Interesting. Thanks.

[–] Mac@mander.xyz 2 points 2 days ago

Don't pick on the parseltongue.

[–] tburkhol@lemmy.world 8 points 2 days ago

So much of Nvidia's revenue now comes from datacenters that I wonder if they even care about consumer sales. Their consumer-level cards feel more like an advertising afterthought than actual products.