VOGONS


Geforce 2 GTS vs Radeon 7500


Reply 20 of 39, by pixel_workbench

Rank: Member
Garrett W wrote on 2020-04-14, 19:14:
swaaye wrote on 2020-04-14, 16:31:

Unreal engine D3D might have been troublesome, but people certainly were using it with these cards. That would be UT and Unreal engine licensees though, not so much Unreal the game.

Quake 3 is an interesting OpenGL example because you are probably seeing the best ATI could muster. It was critical to have good performance with Quake games to sell cards. Other OpenGL games may not even run right because they weren't in benchmarks. Like, say, BioWare OpenGL games... (but that was more for the 8500/9x00).

No argument there, but does it make sense to benchmark it anymore, now that we know it is very troublesome in D3D and not really representative of what these cards are capable of? Both Unreal and UT are very important, but if Glide isn't used (or the software renderer for CPU benches), I think one of the alternative renderers available online (mainly UTGLR) should be used instead, as not only does it run miles better, it's also as feature-rich as the Glide renderer.

And yes, you are completely right about those BioWare games, it's what I was thinking as well. Neverwinter Nights actually makes sense to test; it's really stuff like KOTOR that would be a little too demanding anyway. You could perhaps throw Call of Duty in there as well, another very popular OpenGL (and idtech3-derived, even!) game, although this too is pushing it somewhat, being a late 2003 title.

Are there specific examples of what's broken in Unreal D3D renderer? I tested Unreal Gold updated to patch 226, side by side on a Voodoo3 in Glide mode and on my Radeon R9 380 in Windows 7 using D3D, and I could not see any difference that would indicate missing or broken rendering in D3D.

My Videos | Website
P2 400 unlocked / Asus P3B-F / Voodoo3 3k / MX300 + YMF718

Reply 21 of 39, by maximus

Rank: Member
pixel_workbench wrote on 2020-04-15, 17:10:

Are there specific examples of what's broken in Unreal D3D renderer? I tested Unreal Gold updated to patch 226, side by side on a Voodoo3 in Glide mode and on my Radeon R9 380 in Windows 7 using D3D, and I could not see any difference that would indicate missing or broken rendering in D3D.

I haven't noticed anything that's broken per se, but Unreal D3D is pretty slow on old cards, especially with detail textures enabled. The hit for enabling 32-bit color is also larger than in most other games and demos. Being resource-intensive is part of what makes it a good benchmark, though.

I imagine the slowdowns wouldn't be noticeable on a modern system. Also, I think the Oldunreal patch (227i) improves things quite a bit with its DX8, DX9, and OpenGL renderers.

PCGames9505

Reply 22 of 39, by leileilol

Rank: l33t++

The D3D renderer adds even more layers of detail textures that the Glide renderer doesn't, so the fillrate hit is far worse and the two aren't truly at feature parity, which makes it dead easy to make bullshit "yeh1!! 3dfx VooDoo Graphic Superior Best !" claims.
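To put rough numbers on that, here's a little back-of-the-envelope sketch in C. The resolution, target framerate, overdraw factor and layer counts are just assumptions for illustration, but it shows how each extra detail-texture layer piles more fill onto a card with two texture units per pixel pipe, like the GeForce2:

```c
/* Illustrative only: estimate how many pixels per second the card has to
 * push when every surface carries extra texture layers. With two texture
 * units per pipe, every pair of layers costs another blending pass over
 * the same pixels. All numbers below are assumptions, not measurements. */
#include <stdio.h>

int main(void) {
    const double width = 1024, height = 768;   /* assumed resolution      */
    const double fps = 60.0;                   /* assumed target framerate */
    const double overdraw = 2.5;               /* assumed scene overdraw   */

    for (int layers = 1; layers <= 4; layers++) {
        /* With 2 texture units per pipe: ceil(layers / 2) passes per pixel. */
        int passes = (layers + 1) / 2;
        double pixels_per_sec = width * height * fps * overdraw * passes;
        printf("%d layer(s): ~%.0f Mpixels/s of fill needed\n",
               layers, pixels_per_sec / 1e6);
    }
    return 0;
}
```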
I don't think comparing the two cards with old '98/'99 multiplatform games is ideal for DX7 hardware. Forsaken is terrible. MDK2 is better, but it's best benched with the latest patch, since you can disable sound (which can bottleneck hard with certain cards like an SBLive) and use display lists (slightly faster).

Also a reminder of the GeForce2's blurry VGA output, which works against its practicality at high resolutions 😀

long live PCem

Reply 24 of 39, by maximus

Rank: Member
386SX wrote on 2020-04-16, 07:19:

Why not a Kyro2 card? Back then I remember it was quite a shock to see it competing with those mentioned cards.

Kyro II is actually super solid. Its performance is not quite up to GeForce2 levels, but the latest drivers are surprisingly stable and compatible. They even support table fog! 😁

The main downside I noticed is that the VGA signal is not great, at least on the 3D Prophet 4500 I messed with. Also, those cards are pretty hard to come by.

PCGames9505

Reply 25 of 39, by maximus

Rank: Member
leileilol wrote on 2020-04-16, 00:16:

Also a reminder of the GeForce2's blurry VGA output, which works against its practicality at high resolutions 😀

There seems to be a lot of variability. I have a 3D Prophet II Ti and a couple of unbranded (probably Dell) GeForce2 GTS cards where the VGA output is perfectly bright and sharp. Lots of GeForce2 Ultras seem to have the wavy horizontal line problem, though, and I had an ASUS GeForce2 GTS at one point that I think was not great.

PCGames9505

Reply 26 of 39, by bloodem

Rank: Oldbie

The blurry output is easily fixable: http://web.tiscalinet.it/creeping_death/guide/aamodifica.htm

1 x PLCC-68 / 2 x PGA132 / 5 x Skt 3 / 9 x Skt 7 / 12 x SS7 / 1 x Skt 8 / 14 x Slot 1 / 5 x Slot A
5 x Skt 370 / 8 x Skt A / 2 x Skt 478 / 2 x Skt 754 / 3 x Skt 939 / 7 x LGA775 / 1 x LGA1155
Current PC: Ryzen 7 5800X3D
Backup PC: Core i7 7700k

Reply 27 of 39, by candle_86

Rank: l33t

It's an AMD 750 Irongate with a VIA 686A southbridge. My plans totally changed, though: I saw an 8500 LE on Facebook Marketplace for $10 and am now using that. I put the GTS in my K6-III.

Reply 28 of 39, by 386SX

Rank: l33t
maximus wrote on 2020-04-16, 18:34:
386SX wrote on 2020-04-16, 07:19:

Why not a Kyro2 card? Back then I remember it was quite a shock to see it competing with those mentioned cards.

Kyro II is actually super solid. Its performance is not quite up to GeForce2 levels, but the latest drivers are surprisingly stable and compatible. They even support table fog! 😁

The main downside I noticed is that the VGA signal is not great, at least on the 3D Prophet 4500 I messed with. Also, those cards are pretty hard to come by.

Yes, the VGA output wasn't the best; I can't say whether it was down to the PCB component choices or a problem with the GPU itself. The only issue with that card, if I remember correctly, was that it needed a fast CPU to perform better than the others. The best thing about this GPU was its 16-bit color output, which surpassed everything out there without any color problems. I remember games like Thief were the best place to see that advantage.

Reply 30 of 39, by candle_86

Rank: l33t
maximus wrote on 2020-04-16, 18:40:
leileilol wrote on 2020-04-16, 00:16:

Also a reminder of the GeForce2's blurry VGA output, which works against its practicality at high resolutions 😀

There seems to be a lot of variability. I have a 3D Prophet II Ti and a couple of unbranded (probably Dell) GeForce2 GTS cards where the VGA output is perfectly bright and sharp. Lots of GeForce2 Ultras seem to have the wavy horizontal line problem, though, and I had an ASUS GeForce2 GTS at one point that I think was not great.

My VisionTek GTS has perfect output; cranked up to 1600x1200 it still looks fine, so I guess mine is also fine.

Reply 31 of 39, by 386SX

Rank: l33t
leileilol wrote on 2020-04-29, 00:18:

Kyro II's much lauded HSR (the alleged HWTnL equivalent) won't stop the CPU from getting through all that vertex work to render nothing.

Wasn't EnT&L the T&L equivalent? The famous 4800 "mixed hardware/software" solution? (I never understood if it was a real hw/sw solution or maybe just the usual DX8 software T&L, considering the option was also enabled in the latest 4500 drivers.)

Reply 32 of 39, by auron

Rank: Oldbie

it was a bespoke software t&l for kyro that, at best, could allow certain games to run that would otherwise complain about missing t&l. but there were also reports about visual differences from using this and i don't think the reference software t&l in d3d was really slower.

more than anything it appears as marketing fluff to be able to claim to support a then popular checklist feature.
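for anyone wondering what a software t&l path actually spends its time on, here's a minimal sketch in c. it's not powervr's driver code, just the standard per-vertex transform plus a simple directional diffuse term, which is the work that stays on the cpu no matter what the driver calls itself:

```c
/* Minimal sketch of the per-vertex work a software T&L path does on the
 * CPU; not actual driver code, just the standard math. The perspective
 * divide and w component are omitted for brevity. */
#include <math.h>

typedef struct { float x, y, z; } Vec3;

typedef struct {
    Vec3  pos, normal;
    float r, g, b;        /* lit vertex color out */
} Vertex;

static float dot3(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }

/* m is a row-major 4x4 model-view-projection matrix. */
static Vec3 transform(const float m[16], Vec3 v) {
    Vec3 o;
    o.x = m[0]*v.x + m[1]*v.y + m[2]*v.z  + m[3];
    o.y = m[4]*v.x + m[5]*v.y + m[6]*v.z  + m[7];
    o.z = m[8]*v.x + m[9]*v.y + m[10]*v.z + m[11];
    return o;
}

void software_tnl(Vertex *verts, int count,
                  const float mvp[16], Vec3 light_dir)
{
    for (int i = 0; i < count; i++) {
        /* Transform happens for every submitted vertex, visible or not. */
        verts[i].pos = transform(mvp, verts[i].pos);

        /* Simple directional diffuse lighting: intensity = max(N . L, 0). */
        float n_dot_l = dot3(verts[i].normal, light_dir);
        float diffuse = n_dot_l > 0.0f ? n_dot_l : 0.0f;
        verts[i].r = verts[i].g = verts[i].b = diffuse;
    }
}
```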

Reply 33 of 39, by 386SX

Rank: l33t
auron wrote on 2020-04-29, 14:55:

it was a bespoke software t&l for kyro that, at best, could allow certain games to run that would otherwise complain about missing t&l. but there were also reports about visual differences from using this and i don't think the reference software t&l in d3d was really slower.

more than anything it appears as marketing fluff to be able to claim to support a then popular checklist feature.

Interesting. It would be nice to have some low-level info about it, but the 4800 was probably so rare that only a few were ever out there to try. It'd be nice to bench a 4500 with the latest drivers and EnT&L against a 4800 in the same config, to see whether the latter is faster not only because of the higher core and memory clocks but also because of the "hardware/software" mix. These are myths like the Savage2000 T&L unit, and why it was that bad or simply went unused.

Reply 34 of 39, by auron

Rank: Oldbie

i think you've misunderstood something, 4800 was never supposed to be more than a 4500 with 25mhz extra on core and memory, hence the kyro ii se name. the ent&l feature was just touted around the same time so they could at least present something that sounded new.

now kyro iii would have been something else with actual hw t&l...

Reply 35 of 39, by 386SX

Rank: l33t
auron wrote on 2020-04-29, 20:03:

i think you've misunderstood something, 4800 was never supposed to be more than a 4500 with 25mhz extra on core and memory, hence the kyro ii se name. the ent&l feature was just touted around the same time so they could at least present something that sounded new.

now kyro iii would have been something else with actual hw t&l...

Maybe I didn't read the articles correctly back then, but I seem to remember that EnT&L (which someone, I don't remember who, called a sort of "mixed hardware/software" solution) was part of the specifications and expectations around the 4800, however real that "mix" actually was. I remember waiting quite a while for this card, having the 4500 and considering it a great alternative solution. Maybe in the end it was just a driver trick that used software T&L in a proprietary way or whatever, but I think it was an expected feature before real benchmark numbers came out, and in the end the card never really did.

from archive.org: https://web.archive.org/web/20030203224217/ht … kpr.php3?pr=158

...3D Prophet 4800’s new 3D processor offers a remarkable assortment of the most up-to-date graphics technology features for key 3D effects in the latest applications. 3D Prophet 4800 stands out with its unmatched enhanced Transform And Lighting technology engine which has been specifically designed by PowerVR for the new generation of KYROII chipsets. It mixes intelligent 3D and CPU resources applied to Transform & Lighting and enables the 3D processing to calculate in real time how light intensity will vary on surfaces, where shadows are positioned in 3D applications. These operations guarantee more realistic scenes without increasing CPU resources overload, which maximizes the frame rates in games. The enhanced T&L engine proves to be up to 15% faster than standard engines in games where T&L causes a bottleneck. ...

But of course I imagine it was probably just a faster Kyro II chip pushed to its 200 MHz limit, and the Kyro III was the real GPU to wait for (as most of us expected, I think 😀).
What really surprised me in those reviews, I remember, was the Focus TV-out chip, which I thought was amazing for the time.

Last edited by 386SX on 2020-04-29, 20:29. Edited 1 time in total.

Reply 36 of 39, by auron

Rank: Oldbie

"mixes intelligent 3d and cpu resources" is the giveaway. they never even claimed it has hardware t&l, just that their new software solution is up to 15% faster than the reference d3d one.

if you still have any doubts left, check this: https://forum.beyond3d.com/threads/about-kyro … 00-ent-l.43195/

simon f in that thread is an imagination employee and confirms that the idea of a "mixed hw/sw t&l" on kyro is bogus.

Reply 37 of 39, by 386SX

Rank: l33t
auron wrote on 2020-04-29, 20:28:

"mixes intelligent 3d and cpu resources" is the giveaway. they never even claimed it has hardware t&l, just that their new software solution is up to 15% faster than the reference d3d one.

Maybe that "mix" word raised expectations a lot for this long-awaited feature at the time. Anyway, it looks like it was "intelligent" enough if the speedup was really 15%, which isn't bad compared to the usual D3D software T&L implementation (even if calling it an "engine" was maybe a bit much?). After twenty years it's a miracle I still remember that card! Anyway, those were great years for discussions about competing hardware. 😁

This is another preview: https://web.archive.org/web/20031210215011/ht … ease.asp?ID=172

EnT&L™
To deliver optimal cost-performance KYRO II SE supports the new EnT&L driver technology which unites the advanced features of KYRO II SE with transform and lighting to provide superior performance to cost comparable solutions. EnT&L is optimized for KYRO II SE AGP 4X SBA operation and Implicit Guard Band Clipping.
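For what it's worth, "guard band clipping" generally just means the rasterizer accepts coordinates some distance outside the viewport, so triangles that spill past the screen edges can be scissored instead of geometrically clipped; only triangles exceeding the guard band need real clipping. A rough illustrative sketch of that decision (not actual KYRO driver code, and the guard-band size would be hardware-specific):

```c
/* Illustrative sketch of the guard-band decision: a triangle that pokes
 * outside the viewport but stays inside a wider "guard band" can be
 * rasterized as-is and scissored, avoiding the cost of geometric clipping.
 * Only triangles that exceed the guard band need to be clipped for real. */
typedef struct { float x, y; } Pt2;

enum ClipAction { DRAW_DIRECT, DRAW_SCISSORED, NEEDS_CLIPPING };

enum ClipAction classify_triangle(const Pt2 tri[3],
                                  float vp_w, float vp_h,   /* viewport size */
                                  float guard)              /* band width    */
{
    int outside_viewport = 0, outside_guard = 0;
    for (int i = 0; i < 3; i++) {
        if (tri[i].x < 0 || tri[i].x > vp_w ||
            tri[i].y < 0 || tri[i].y > vp_h)
            outside_viewport = 1;
        if (tri[i].x < -guard || tri[i].x > vp_w + guard ||
            tri[i].y < -guard || tri[i].y > vp_h + guard)
            outside_guard = 1;
    }
    if (outside_guard)    return NEEDS_CLIPPING;   /* true geometric clip */
    if (outside_viewport) return DRAW_SCISSORED;   /* cheap path          */
    return DRAW_DIRECT;
}
```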

I remember I was waiting for this card so much. 🙁 😁

Reply 38 of 39, by leileilol

Rank: l33t++

It's more of a buzzword for "2002 games will stop crashing with these newer out-of-the-box drivers" to me. The GeForce2 GTS definitely has it beat in that category, and I feel it's not worth interjecting the Kyro II into a vs. thread between two specific cards that both blow the Kyro II away anyway...

long live PCem

Reply 39 of 39, by candle_86

Rank: l33t
leileilol wrote on 2020-04-30, 00:09:

It's more of a buzzword for "2002 games will stop crashing with these newer out-of-the-box drivers" to me. The GeForce2 GTS definitely has it beat in that category, and I feel it's not worth interjecting the Kyro II into a vs. thread between two specific cards that both blow the Kyro II away anyway...

Well, I'm now more intrigued than ever. I'm going to start hunting for DX7 cards to compare against each other, just because.