blank001 wrote:I know we like to be nostalgic about older cards, but I haven't seen any real evidence outside of the talked-to-death splinter cell games of real graphical incompatibility in the XP era. There are plenty of "modern" graphics cards that work perfectly fine in WXP 32bit.
Agreed, with some qualifications. I'll add on Morrowind specifically - it's far more CPU-dependent than GPU-dependent. My FX 5800 Ultra can run it at full maximum settings, as long as it's paired with an appropriate CPU. Various benchmarks over the years have shown that it largely does not care what resolution it runs at, and most of the other settings likewise have minimal impact on performance aside from draw distance - it just wants a powerful CPU. 😊
As for other games with issues: GOG maintains a compatibility alert list for games broken by newer drivers; many carry this tag for nVidia, some for AMD. That isn't to say "new GPUs can't work with old games" (technically speaking there's no reason they can't), but if a new GPU requires a driver that's on the wrong side of such a breakage, it's effectively shut out of that old game. Generally speaking I would say (and have observed) that DX9+ titles shouldn't be a problem, but with games older than that (which can still fit into "XP-era" depending on how broad a brush you'd like to use - e.g. does a game have to have been released after October 25, 2001, or does anything that runs on XP count?) it may not be quite so clear-cut.
swaaye wrote:With older games you should use SSAA or MSAA+TAA/AAA. The quality will be better. VSR/DSR are brute force methods to AA modern games that can't have those forced upon them.
Even with newer games I'd say this, excepting those that break without some brute-force method (and BOOO to developers for that). I admittedly haven't gotten around to trying VSR with Halo, but I'd assume it could be an AA solution there. Also remember: nVidia generally won't expose SSAA in their drivers without a hack, whereas AMD has opened up SSAA modes on GCN cards in their newer drivers (prior to that, they required a hack too).
Evert wrote:
I completely agree with this. Most XP-era games were designed to run at 1280x1024 and DSR/VSR are not really optimised or designed for those resolutions. If you have an over-powered DX10 card, you could pretty much run all your games at that resolution with SSAA (which is the best form of anti-aliasing).
I would disagree with "designed to run at 1280x1024", primarily because "XP-era" is far too broad and many games won't actually support 5:4. Generally speaking, 5:4 is only an advantage in Vert- games that support it (since it actually enlarges the viewport). It's better to determine whether the game is Hor+, Vert-, or pixel-based, and what resolutions it supports relative to what your monitor and system support, and go from there. With Hor+, switching to 5:4 is the worst choice, as it produces the smallest possible viewport. With pixel-based games it largely shouldn't matter, but since 5:4 is not all that common on modern displays, it will likely mean a non-native aspect ratio on the monitor (and while I can hear someone saying "just pillarbox it!" - if the game is pixel-based, set it to the proper AR for the monitor, and ideally the native resolution too). Depending on the game, higher resolutions also may not be the best choice, as they can introduce problems of their own (e.g. the HUD renders too small or with issues, the FOV is wrong, etc).
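To illustrate why 5:4 is the worst pick for a Hor+ game: in the usual Hor+ scheme the vertical FOV is fixed and the horizontal FOV is derived from the aspect ratio, so a narrower aspect literally shows you less of the world. A minimal sketch, assuming the standard FOV scaling formula and an arbitrary 60-degree vertical FOV (not values from any particular game):

```python
import math

def hor_plus_hfov(vfov_deg, aspect):
    """Horizontal FOV for a Hor+ game: vertical FOV stays fixed,
    horizontal FOV widens or narrows with the aspect ratio."""
    vfov = math.radians(vfov_deg)
    hfov = 2 * math.atan(math.tan(vfov / 2) * aspect)
    return math.degrees(hfov)

# Hypothetical 60-degree vertical FOV, purely for illustration.
for name, aspect in [("5:4", 5 / 4), ("4:3", 4 / 3), ("16:9", 16 / 9)]:
    print(f"{name}: {hor_plus_hfov(60, aspect):.1f} deg horizontal")
```

Run that and 5:4 comes out with the narrowest horizontal FOV of the three, i.e. the smallest viewport - which is exactly the effect described above.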
swaaye wrote:Yeah with NV you can use NVidia Inspector to force sparse grid supersampling. It works with any GeForce 8 or newer chip. It can be troublesome with some games. You enable it by forcing normal 4X MSAA and 4X SGSSAA in the Transparency AA box. I don't know why they don't support it officially. It could be because there would be an expectation of some guarantee of functionality and so they would have to validate tons of games and probably set up app compatibility for many of them.
FWIR they don't support it because there's a perception in management/PR that customers would complain that enabling "only 4x AA" tanks performance so badly, and it would also compete with (and largely negate) all of their whizbang proprietary AA methods (which are aggressively marketed as reasons why everyone and their grandmother must buy the newest GeForce card every 6 months). Further, FWIR AMD enabled SSAA on GCN mostly to thumb their noses at nVidia, much like their later introduction of VSR. IME the SSAA modes are perfectly workable even on relatively newer titles - for example, I run Fallout 3 with 4x SSAA and still average 100 FPS or better, and it looks (imho) significantly sharper than 8x MSAA or any of nVidia's "enhanced" modes.
The GeForce cards also support ordered-grid SSAA modes. These are less troublesome but also less effective (like VSR/DSR). GeForce 256 through 7 support some levels of OG SSAA officially, and GF3-7 have interesting hybrid modes.
I like the xS AA "hybrid" modes on GeForce FX/6 - they tend to produce pretty good pictures, and the performance hit isn't horrible in many games. ATi offered similar modes for first-gen CrossFire as "SuperAA" - they're higher levels of AA than xS, and accordingly look somewhat better (they also run much better, since there are two GPUs behind them). Shame this kind of middle-ground functionality went away in favor of acronym-of-the-month feature bloat.
Speaking of weird AA modes on nVidia cards - some of the Quadro cards will also support 16x OG MSAA. The performance hit is *significant* and the IQ improvement over 8x is debatable, but in very old/low-resolution games it may have some utility. AFAIK this can be forced on some GeForce cards too.
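For a sense of why the hit is that significant: supersampling shades every sample, so the work scales roughly linearly with the sample count. A back-of-the-envelope sketch, assuming an arbitrary 1024x768 base resolution and ignoring bandwidth and downsample overhead:

```python
def ssaa_samples(width, height, factor):
    """Total shaded samples for ordered-grid SSAA: every pixel
    is rendered `factor` times, then filtered back down."""
    return width * height * factor

# Hypothetical 1024x768 framebuffer, for illustration only.
base = ssaa_samples(1024, 768, 1)
for factor in (4, 8, 16):
    total = ssaa_samples(1024, 768, factor)
    print(f"{factor}x SSAA: {total // base}x the shading work "
          f"({total:,} samples)")
```

At 16x the GPU is effectively rendering sixteen frames' worth of samples for every displayed frame - fine for a 1999 game on a modern card, brutal for anything contemporary.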