VOGONS


Geforce3 Ti 500 vs Geforce 6200


Reply 40 of 96, by candle_86


Ok ok, I'm going to run benchmarks on them, you have my word. I'll get the Ti 500 vs 6200 done tomorrow night, and the FX once it arrives.

Then we can decide between all 3. I do remember my FX 5900XT tearing it up in the games I played, except CSS of course.

Maybe I can get my buddy to lend me most of my old AGP cards. Maybe.

What he has

Geforce 2 MX 400
Geforce 2 Ultra
Geforce 6800GS
Radeon x800XT
Radeon 7500
Radeon 9650
Geforce 7600GS

Would make for an interesting comparison.


Reply 41 of 96, by swaaye

Skyscraper wrote:
    swaaye wrote:
        Skyscraper wrote: Well it's faster when it comes to everything that matters.
    Oh ok. Thanks for your assistance with the obvious then.
You are welcome.

I suppose I could mention that I've championed the 5900U since before you were even a member here. Mainly for its Win9x D3D 3-6 compatibility and high-resolution performance with games up to D3D7. I'm just particularly entertained by trying to run Shader Model 2 games on it and watching it puke.

Reply 42 of 96, by swaaye

candle_86 wrote:

Ok ok, I'm going to run benchmarks on them, you have my word. I'll get the Ti 500 vs 6200 done tomorrow night, and the FX once it arrives.

Then we can decide between all 3. I do remember my FX 5900XT tearing it up in the games I played, except CSS of course.

CSS must run in D3D8 mode on it. Probably runs pretty well like HL2 does in that default mode.

Reply 43 of 96, by Skyscraper

swaaye wrote:

I suppose I could mention that I've championed the 5900U since before you were even a member here. Mainly for its Win9x D3D 3-6 compatibility and high-resolution performance with games up to D3D7.

I use a Geforce FX 5900 Ultra in my Tualatin Windows 9x rig. I do not doubt that the Radeon 9600XT can be faster even when it comes to running some DX8 stuff, but the Geforce 6200 shouldn't be; at least it wasn't when I benched both cards along with most other DX7, DX8 and early DX9 AGP cards.

Let's wait and see what candle_86's benchmarks show.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 44 of 96, by candle_86

swaaye wrote:
candle_86 wrote:

Ok ok, I'm going to run benchmarks on them, you have my word. I'll get the Ti 500 vs 6200 done tomorrow night, and the FX once it arrives.

Then we can decide between all 3. I do remember my FX 5900XT tearing it up in the games I played, except CSS of course.

CSS must run in D3D8 mode on it. Probably runs pretty well like HL2 does in that default mode.

No, I used the hack to force DX9 on the GeForce FX. It was terribly slow, but at the time I wanted the DX9 🤣. But back then I also played these games and was quite happy:

Call of Duty
Medal of Honor Pacific Assault
Counter Strike Source
Half Life 2
Halo
Battlefield Vietnam
Star Trek Bridge Commander (wish I could Benchmark this)
Command and Conquer Generals
FarCry
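
(For anyone trying to reproduce the DX9-forcing mentioned above: the usual route in Source engine games like CSS and HL2 is the -dxlevel launch option or the mat_dxlevel console variable, though the exact hack candle_86 used back then may have been something else. A minimal sketch:

    -dxlevel 90      launch option: forces the DX9 / SM2.0 path on GeForce FX cards
    mat_dxlevel 81   console variable: drops back to the default D3D8.1 path

It's worth removing -dxlevel from the launch options after the first run, since it resets the video settings every time the game starts.)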

What I do recall is that in Generals, Bridge Commander, and Call of Duty I was faster than a friend running a Radeon 9800 Pro (my FX 5900XT, the Prolink PixelView, was overclocked to 450/1000, and god do I miss it; it was in reality running like a 5950 Ultra), but in Halo, MOH:PA, HL2, CSS, BF:V, and FarCry he would smoke me, and he continued to do so when I got my 6200.

Reply 45 of 96, by candle_86


Here is FarCry Very High taken from Tomshardware VGA Charts IV from 2004.

image016.gif

So yes, a 9600 will beat an FX when running SM2 code. FarCry Very High is all SM2 code, though FarCry was never well optimized, so it shows more of a worst-case scenario.

Reply 46 of 96, by Skyscraper

candle_86 wrote:

What I do recall is that in Generals, Bridge Commander, and Call of Duty I was faster than a friend running a Radeon 9800 Pro (my FX 5900XT, the Prolink PixelView, was overclocked to 450/1000, and god do I miss it; it was in reality running like a 5950 Ultra), but in Halo, MOH:PA, HL2, CSS, BF:V, and FarCry he would smoke me, and he continued to do so when I got my 6200.

With luck you will get 450+/1000 out of the Geforce FX 5900 Ultra.

Here is a screenshot showing my card's 3DMark 2001 result running at 500/1000. This is with a rather slow K7 system running Windows 9x though, so the score isn't great; your Barton will probably add a couple of thousand points.

XP2400plus_kudoz7e333a_FX5900Ultra_OC_3dmark2001.jpg
candle_86 wrote:

Here is FarCry Very High taken from Tomshardware VGA Charts IV from 2004.

So yes, a 9600 will beat an FX when running SM2 code. FarCry Very High is all SM2 code, though FarCry was never well optimized, so it shows more of a worst-case scenario.

That chart can not have made Nvidia very happy. 😀


Reply 47 of 96, by candle_86

Skyscraper wrote:
candle_86 wrote:

What I do recall is that in Generals, Bridge Commander, and Call of Duty I was faster than a friend running a Radeon 9800 Pro (my FX 5900XT, the Prolink PixelView, was overclocked to 450/1000, and god do I miss it; it was in reality running like a 5950 Ultra), but in Halo, MOH:PA, HL2, CSS, BF:V, and FarCry he would smoke me, and he continued to do so when I got my 6200.

With luck you will get 450+/1000 out of the Geforce FX 5900 Ultra.

Here is a screenshot showing my card's 3DMark 2001 result running at 500/1000. This is with a rather slow K7 system running Windows 9x though, so the score isn't great; your Barton will probably add a couple of thousand points.

XP2400plus_kudoz7e333a_FX5900Ultra_OC_3dmark2001.jpg
candle_86 wrote:

Here is FarCry Very High taken from Tomshardware VGA Charts IV from 2004.

So yes, a 9600 will beat an FX when running SM2 code. FarCry Very High is all SM2 code, though FarCry was never well optimized, so it shows more of a worst-case scenario.

That chart can not have made Nvidia very happy. 😀

Nope. I do, however, disregard the GeForce4 in that test; it is running the SM1.1 code path.

Reply 48 of 96, by swaaye


Far Cry has a lot of different rendering paths it chooses depending on card. I think it has paths specifically for just about every generation of cards up to NV4x and R4x0. That seems like a lot of optimization effort to me!

Reply 49 of 96, by candle_86

swaaye wrote:

Far Cry has a lot of different rendering paths it chooses depending on card. I think it has paths specifically for just about every generation of cards up to NV4x and R4x0. That seems like a lot of optimization effort to me!

Maybe, but then why did we have to wait for the GTX 280 to run it at 2560x1600 Very High with 8xAA and 16xAF 🤣.

Reply 50 of 96, by swaaye

candle_86 wrote:
swaaye wrote:

Far Cry has a lot of different rendering paths it chooses depending on card. I think it has paths specifically for just about every generation of cards up to NV4x and R4x0. That seems like a lot of optimization effort to me!

Maybe, but then why did we have to wait for the GTX 280 to run it at 2560x1600 Very High with 8xAA and 16xAF 🤣.

Probably because that requires a massive amount of pixel fillrate and memory bandwidth, which GTX 280 provides. 😁
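
(Rough back-of-the-envelope arithmetic, just to illustrate the point: 2560x1600 is about 4.1 million pixels per frame, and 8x MSAA means up to 8 samples per pixel, so a single pass at 60 fps already works out to roughly 4.1M x 8 x 60 ≈ 2 billion samples per second, before any overdraw, texturing or 16xAF cost. Nothing from the FX/R3x0 era has that kind of fill rate and memory bandwidth to spare.)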

Reply 52 of 96, by swaaye


If you wanna see a game that needed more work, try Deus Ex: Invisible War. I was playing around with it on AGP and PCIe setups about a year ago. If I remember right, 1600x1200 isn't very enjoyable until you have around an X800XT PE. The 6800 Ultra chokes on it for some reason. The card has lower potential fillrate than an X800XT PE, so maybe it's the stencil shadow fillrate requirements that are out of control. I think it's more demanding than Doom 3 / Far Cry / Half Life 2 and the visuals don't match up to that!

Reply 53 of 96, by PhilsComputerLab


I remember Far Cry getting a bit of a boost on GF6 cards with the SM3 patch.

But looking at old reviews you often see ATI dominating HL2 and Far Cry, whereas Nvidia dominated in Doom 3 and other OpenGL games.


Reply 54 of 96, by swaaye


NV35 and later were designed with id Tech 4 in mind to some extent, so yeah, they do well. They have stencil shadowing performance features. I think Chronicles of Riddick uses these features as well.

Far Cry and Half Life 2 might benefit from ATI's higher pixel/texture fill rates particularly on X800XT PE.
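
(Some rough numbers, going from memory on the specs: pixel fill rate is roughly core clock x pixel pipelines, so an X800XT PE at ~520 MHz with 16 pipes works out to about 8.3 Gpixels/s, a 6800 Ultra at ~400 MHz with 16 pipes to about 6.4 Gpixels/s, and an FX 5900 Ultra at 450 MHz with 4 pipes to only about 1.8 Gpixels/s, which goes some way to explaining why fill-rate heavy D3D9 games favour the R4x0 cards.)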

Reply 55 of 96, by candle_86


So I've run some early benchmark comparisons, just to test.

Test configuration
AMD Athlon XP 2800+
1GB DDR400 2.5-3-3-7
80GB 7200 RPM hard drive
Geforce 3 Ti 500, 240/500, 64MB/128-bit, 81.94 drivers
Geforce 6200, 350/400, 256MB/64-bit, 81.94 drivers
Windows XP Professional SP3

3dMark 2000 1600x1200 32bit
Geforce 3 Ti 500 6183
Geforce 6200 3155
GeforceFX 5900

3dmark 2001SE 1024x768
Geforce 3 Ti 500 8364
Geforce 6200 8079
GeforceFX 5900

3dMark 2003 1024x768
Geforce 3 Ti 500 1460
Geforce 6200
GeforceFX 5900

Quake 3
Geforce 3 Ti 500
Geforce 6200
GeforceFX 5900

FarCry
Geforce 3 Ti 500
Geforce 6200
GeforceFX 5900

Doom 3
Geforce 3 Ti 500
Geforce 6200
GeforceFX 5900

X2
Geforce 3 Ti 500
Geforce 6200
GeforceFX 5900

Call of Duty
Geforce 3 Ti 500
Geforce 6200
GeforceFX 5900

Reply 57 of 96, by Skyscraper

dirkmirk wrote:

To be fair, the 64bit 6200 is a sh!tload slower than the 128bit model.

This is true; it seems Nvidia tried to offset this with a slightly higher GPU clock, but that only helps at really low resolutions.

The 64-bit version is much more common. The 128-bit version is a cut-down Geforce 6600 and can sometimes be unlocked to a full Geforce 6600. I have both versions, and if I remember right the 64-bit version of the Geforce 6200 produced ~9000 points in 3DMark 2001, with the 128-bit version getting ~12000 points and the Geforce FX 5900 Ultra ~20000 points, all scores on a fast Athlon 64 system.
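
(To put rough numbers on the bus width, using the 400 MHz effective memory clock quoted for the 6200 earlier in the thread: a 64-bit bus gives about 400 MT/s x 8 bytes ≈ 3.2 GB/s of bandwidth, the 128-bit version doubles that to ~6.4 GB/s at the same clock, and a 256-bit FX 5900 Ultra at 850 MT/s effective has around 27 GB/s, which is why the 6200 drops off so hard as resolution increases.)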

A little disclaimer: my memory is great but not infallible. I can't find any notes, so they are probably lost or still on the HDD I used. I should build a new fast Windows XP AGP system and bench through my AGP cards. I do have an Athlon XP 2400+ AGP Windows 9x test system that I have used to bench and document many cards, but it's too slow to really show what the cards can do.


Reply 59 of 96, by Skyscraper

swaaye wrote:

Cards are sorted by ranking and the GF3 always wins? Weird. What resolution were the games tested at?

That's a 5900 Ultra?

I'm pretty sure they are placeholders, with the last actual score being the GF3 Ti 500 3DMark 2003 score. I do not think he has got the FX 5900 (Ultra?) yet.
