Reply 20 of 39, by pixel_workbench
Garrett W wrote on 2020-04-14, 19:14:

swaaye wrote on 2020-04-14, 16:31:

Unreal engine D3D might have been troublesome, but people certainly were using it with these cards. That would be UT and Unreal engine licensees though, not so much Unreal the game.
Quake 3 is an interesting OpenGL example because you are probably seeing the best ATI could muster. It was critical to have good performance with Quake games to sell cards. Other OpenGL games may not even run right because they weren't in benchmarks. Like say Bioware OpenGL games....(but that was more for 8500/9x00).
No argument there, but does it make sense to keep benchmarking it now that we know it is very troublesome in D3D and not really representative of what these cards are capable of? Both Unreal and UT are very important, but if Glide isn't used (or the software renderer for CPU benches), I think one of the alternative renderers available online (mainly UTGLR) should be used instead. Not only does it run miles better, it's also as feature-rich as the Glide renderer.
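For anyone who wants to try UTGLR: it ships as a replacement OpenGLDrv.dll that you drop into the game's System folder, then you point the engine at it in the game's ini file. A rough sketch below; the section and key names are from memory, so double-check against the UTGLR readme for your exact patch version:

```ini
; In UnrealTournament.ini (or Unreal.ini for Unreal Gold),
; after copying the UTGLR OpenGLDrv.dll into the System folder:
[Engine.Engine]
; Switch from the stock D3D renderer...
;GameRenderDevice=D3DDrv.D3DRenderDevice
; ...to the OpenGL renderer (UTGLR replaces the stock OpenGLDrv):
GameRenderDevice=OpenGLDrv.OpenGLRenderDevice
```

You can also switch renderers from the in-game video settings menu instead of editing the ini by hand.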
And yes, you are completely right about those Bioware games, it's what I was thinking as well. Neverwinter Nights actually makes sense to test; it's really stuff like KOTOR that would be a little too demanding anyway. You could perhaps throw Call of Duty in there as well, another very popular OpenGL (and idtech3-derived, even!) game, although this too is pushing it somewhat, being a late 2003 title.
Are there specific examples of what's broken in Unreal's D3D renderer? I tested Unreal Gold updated to patch 226, side by side on a Voodoo3 in Glide mode and on my Radeon R9 380 in Windows 7 using D3D, and I could not see any difference that would indicate missing or broken rendering in D3D.