First post, by infiniteclouds
It seems many of the games I play from 2016 onward have some of the worst, most horrific aliasing imaginable. Sure, older games benefited significantly from AA, but nowadays a scene without AA isn't just 'jaggies' -- it's a shimmering, vomit-inducing, shim, shim, shimmery eye massacre. With deferred shading/rendering it seems like the AA options aren't as good, either. They boil down to either 1) blur the crap out of everything or 2) render at higher resolutions and downsample/scale.
I'm still using my 4GB GTX 760 from 2014, which was $310 at the time. Before that it was a GTX 275 CO-OP PhysX edition for $350 in 2009, and a $320 7900GT in 2006, which had to be replaced by an 8800GTS when it died just over a year later -- the most I ever spent on a GPU, at $380. That was the general price range I felt comfortable investing in a graphics card, and it always felt like a huge leap when I upgraded 3-4 years later. Is it my imagination, or are they expecting significantly more money for the same leaps in performance, or even smaller ones? ATI's reveal at CES was hugely disappointing to me because I've been wanting to jump ship from NVIDIA for a while, but I don't feel like they're giving me a better alternative, either.
The asking prices seem ridiculous -- both for cards that are already several years old and for newer cards that draw comparisons to them. Am I supposed to be excited when a 2019 card touts 20% better performance over a card from 2017?
Would like to hear other people's thoughts, since I admit I'm pretty ignorant -- I've been out of the loop, having not bought hardware in 5 years now. And with DRM the way it is, I feel like moving back to console, at least until those go all-digital as well.