First post, by NamelessPlayer
I can't help but ask the title question in the wake of today's egregiously priced GeForce RTX 20x0 cards ($1,200 for an RTX 2080 Ti, where a Titan X used to sit, with the Titan V going all the way to $3,000), the sudden spike in resale prices on old 3dfx cards, and people paying $600+ for Picasso IV RTG cards that only work at their fullest in Amiga 3000/4000 systems.
Professional cards are one thing (that's where we'd start looking at 3DLabs and SGI and all that, well beyond what was practical for home computing), but I still think back to a decade ago for consumer cards, when it seemed like $500 was all you needed to be top dog in gaming performance - first with the Radeon 9800 XT, then with the GeForce 6800 Ultra when NVIDIA came back swinging hard after the FX fiasco. My memory's too faint on cards before that point, though I'm pretty sure 3dfx never charged anywhere near $500. (Well, the Voodoo5 6000 might have been a different story had it made it to market...)
But now? $500 feels like mid-range at best.
Maybe it's just that the overall standard of performance has come a long way, kind of like with cars over the decades. Still, I can't help but feel miffed that the generation of cards I was hoping to replace my GTX 980 with is so much more expensive than its predecessors that I'll have to hold off upgrading even longer than planned - all for the sake of VR performance that barely benefits from SLI/CrossFire.
So I wonder: are we actually paying more for GPUs now than we were a decade or two ago, even accounting for inflation? Anyone got any old PC parts ads/catalogues/etc. to compare against?