VOGONS


Geforce3 Ti 500 vs Geforce 6200


Reply 60 of 96, by candle_86

Rank: l33t

It's a placeholder, and it's an FX 5900 non-Ultra I'm getting on Monday.

This will likely all go into bar graphs so I can make it look better 🤣

Also, I don't want the GF3 to win every time; honestly, I'm hoping the FX 5900 smokes them both 🤣

Reply 61 of 96, by candle_86

Rank: l33t

The FX 5900 came in.

Revised benchmark list:

3dmark2000 10x7 and 12x10 32bit
3dmark2001 10x7 and 12x10
3dmark 03 10x7
FarCry Very High 10x7
Doom3 High 10x7
X2 10x7
AquaMark3

Running them all now.

Reply 63 of 96, by RacoonRider

Rank: Oldbie

I'm a little confused: what do 10x7 and 12x10 mean? 1024x768 and 1280x1024?

A little off topic, but here's a Radeon 9800 Pro/Athlon XP 3200+ at 1024x768x32; it shouldn't be much different from a 2800+, since AMD hit the megahertz wall around 3000+ anyway. https://dl.dropboxusercontent.com/u/99196890/ … Dmark001024.PNG

Reply 64 of 96, by candle_86

Rank: l33t

Full System Specifications

Athlon XP Barton 2800+ @ 2.083 GHz
2x1 GB Kingston DDR400 2.5-3-3-7
ABIT NF7-S2G
80 GB Western Digital 7200 RPM IDE
Windows XP SP3

GeForce 3 Ti 500: 240/500, 128-bit, 64 MB
GeForce FX 5900: 400/850, 256-bit, 128 MB
GeForce 6200: 350/400, 64-bit, 256 MB

All cards were tested with Nvidia ForceWare 81.94.

Far Cry was patched to 1.33.

[Attachments: 3dmar2000 10x7.jpg, 3dmar2000 12x10.jpg, 3dmar2001 10x7.jpg, 3dmar2001 12x10.jpg, 3dmark03.jpg]
Last edited by candle_86 on 2016-03-19, 15:58. Edited 1 time in total.

Reply 65 of 96, by candle_86

Rank: l33t

[Attachments: aquamark.jpg, FarCry.jpg, Doom3.jpg, x2.jpg]

And here we go

Reply 66 of 96, by candle_86

Rank: l33t

So a Ti 500 and a 64-bit 6200 are almost equal, with the Ti 500 slightly faster. The FX 5900 is faster than either, though. I do wish I had a 128-bit 6200 to test, as I think the 64-bit bus is holding it back with its anemic 3.2 GB/s of bandwidth.
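For what it's worth, the 3.2 GB/s figure falls straight out of the clocks and bus widths in the spec post above; a quick sketch (treating the second number in each clock pair as the effective memory rate):

```python
def mem_bandwidth_gbps(bus_bits, effective_mhz):
    """Peak memory bandwidth in GB/s: bus width in bytes x effective memory clock."""
    return bus_bits / 8 * effective_mhz * 1e6 / 1e9

# Clocks and bus widths as listed in the system spec post
cards = {
    "GeForce 3 Ti 500 (128-bit, 500 MHz eff.)": mem_bandwidth_gbps(128, 500),
    "GeForce FX 5900 (256-bit, 850 MHz eff.)":  mem_bandwidth_gbps(256, 850),
    "GeForce 6200 (64-bit, 400 MHz eff.)":      mem_bandwidth_gbps(64, 400),
}
for name, bw in cards.items():
    print(f"{name}: {bw:.1f} GB/s")
```

The 6200's 64-bit bus works out to 3.2 GB/s, less than half the Ti 500's 8.0 GB/s, which supports the idea that the narrow bus is what's holding it back.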

Reply 68 of 96, by RacoonRider

Rank: Oldbie

Thanks, candle_86!

Now we can once again state in this thread that the Ti 500 beats the 6200 in older stuff, and for the stuff where the 6200 starts to catch up, they are both too slow anyway.

Reply 69 of 96, by Skyscraper

Rank: l33t
RacoonRider wrote:

Thanks, candle_86!

Now we can once again state in this thread that the Ti 500 beats the 6200 in older stuff, and for the stuff where the 6200 starts to catch up, they are both too slow anyway.

You can try. 😉

I will at least keep insisting that a GeForce FX 5900 Ultra is faster than a GeForce 6200 (128-bit) in everything that matters, but I'm not sure I will get away with my statement either.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 70 of 96, by swaaye

Rank: l33t++

Remember that the 6200 is rendering more complex effects than the GF3 in some of those tests. Can't give the GF3 too much credit.

There may also be shader replacement cheating going on with the FX card in any programs from the card's heyday that use PS 2.0. Some of that stuff actually reduced image quality, but it might be a moot point because you probably won't be able to tell without a side-by-side comparison.

Reply 71 of 96, by candle_86

Rank: l33t
swaaye wrote:

Remember that the 6200 is rendering more complex effects than the GF3 in some of those tests. Can't give the GF3 too much credit.

There may also be shader replacement cheating going on with the FX card in any programs from the card's heyday that use PS2.0. Some of that stuff actually reduced image quality. But it might be a moot point because you probably won't be able to tell without side by side comparison.

Yeah, likely true, but at the same time it's a real-world test, so it's good to know.

Reply 72 of 96, by swaaye

Rank: l33t++
candle_86 wrote:

Yeah, likely true, but at the same time it's a real-world test, so it's good to know.

You could force the 6200 and 5900 to render the NV20 path in Doom 3 and compare directly. 😁 That would be interesting.
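For anyone trying this, Doom 3's render path can be forced through its `r_renderer` cvar; a sketch, with the cvar values from memory and worth double-checking in-game:

```
// In the Doom 3 console, or in DoomConfig.cfg:
seta r_renderer "nv20"   // force the GeForce 3/4-class path (default is "best")
vid_restart              // reload the renderer so the setting takes effect
```

With all three cards locked to the NV20 path, the frame rate difference would reflect raw speed at identical image quality.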

Reply 73 of 96, by candle_86

Rank: l33t
swaaye wrote:

You could force the 6200 and 5900 to render the NV20 path in Doom 3 and compare directly. 😁 That would be interesting.

Yes, maybe, but why do that? The FX series never had any issues with OpenGL; not even with SM 1.x, just with SM2.

Reply 74 of 96, by candle_86

Rank: l33t

So, cards I'm going to try to find and add to my data; I did, after all, save the spreadsheet.

GeForce cards to add:
GeForce4 MX 440
GeForce4 Ti 4200
GeForce4 Ti 4400
GeForce4 Ti 4600/4800
GeForce FX 5200/5200 Ultra
GeForce FX 5600/5600 Ultra
GeForce FX 5700/5700 Ultra
GeForce FX 5800 Ultra (long shot, but I want one)
GeForce FX 5950 Ultra
GeForce 6200 128-bit
GeForce 6600
GeForce 6600 GT
GeForce 6800
GeForce 6800 GT/6800 Ultra

ATI cards I want to compare with:
Radeon 7500
Radeon 8500 LE
Radeon 8500
Radeon 9000 Pro/9200 Pro
Radeon 9500 Pro
Radeon 9600 Pro
Radeon 9700 Pro
Radeon 9800 Pro/XT
Radeon X700 Pro AGP
Radeon X800 GTO AGP
Radeon X800 XT AGP

Now to try to track them down at an affordable price, and pray for an FX 5800 Ultra to simply show up 🤣.

Reply 75 of 96, by swaaye

Rank: l33t++
candle_86 wrote:

Yes, maybe, but why do that? The FX series never had any issues with OpenGL; not even with SM 1.x, just with SM2.

Because then you can see how much faster a 5900 and 6200 are at doing exactly what the GeForce 3 is doing. You could even perhaps think of it as another D3D8-ish test then. You can also see how much the fancier effects are slowing the 5900 and 6200 down.

Think about it this way - should you even care about the frame rate when the game doesn't look the same on all of the cards? X2, Far Cry, and Doom3 do not look nearly as good on a GeForce 3 as they do on a FX 5900 or 6200. It's not unlike how NV cheated with the FX cards and made games look worse to get a better frame rate, except the GF3 is even uglier than that cheating was.

Reply 76 of 96, by PhilsComputerLab

Rank: l33t++

That is a good point, and I didn't think of that when I recommended Far Cry 😊

I remember having this discussion when my brother told me Far Cry runs fine on his GeForce4 Ti. But obviously it wasn't as nice looking as on a Radeon 9700.

Even late cards like the X800 series do not have SM3, though I don't know if this affects looks, performance or both.

YouTube, Facebook, Website

Reply 77 of 96, by agent_x007

Rank: Oldbie

Newish games (like Crysis and BioShock) can have problems with GPUs lacking SM3.0 (mainly graphical glitches). These can be "fixed", but the fixes may pose problems further down the line (performance).

But who wants to play Crysis on an X850 XT AGP 😉

The 5900 Ultra will be faster when fillrate is the main concern (DX6/DX7/DX8 games).
The main reason is that it has the memory bandwidth to back up the ROPs' pixel fillrate AND two TMUs per pixel pipe (this is what makes it a "4x2" [pixels x TMUs] design, compared to the 9700 Pro's "8x1").
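To put numbers on the 4x2 vs 8x1 comparison, here is a rough sketch using the commonly quoted core clocks (450 MHz for the 5900 Ultra, 325 MHz for the 9700 Pro; treat those figures as assumptions):

```python
def fillrates(core_mhz, pipes, tmus_per_pipe):
    """Theoretical (pixel, texel) fillrates in Mpixels/s and Mtexels/s."""
    pixel = core_mhz * pipes             # one pixel per pipe per clock
    texel = pixel * tmus_per_pipe        # each pipe samples tmus_per_pipe textures per clock
    return pixel, texel

# FX 5900 Ultra: 4x2 design, ~450 MHz core
print(fillrates(450, 4, 2))   # (1800, 3600)
# Radeon 9700 Pro: 8x1 design, ~325 MHz core
print(fillrates(325, 8, 1))   # (2600, 2600)
```

The 9700 Pro wins on raw pixel rate, but the 5900 Ultra's texel rate is higher, which is why it holds up so well in the multitextured DX7/DX8 workloads mentioned above.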


Reply 78 of 96, by candle_86

Rank: l33t
PhilsComputerLab wrote:

That is a good point, and I didn't think of that when I recommended Far Cry 😊

I remember having this discussion when my brother told me Far Cry runs fine on his GeForce4 Ti. But obviously it wasn't as nice looking as on a Radeon 9700.

Even late cards like the X800 series do not have SM3, though I don't know if this affects looks, performance or both.

Well, honestly, running Far Cry on my Ti 500 vs my FX 5900 Ultra: I verified the Ti 500 runs the SM 1.1 path while the FX 5900 runs the SM2 path on Very High, and they look almost indistinguishable. The water is a little clearer under SM2, but other than the water they look identical.

Reply 79 of 96, by Skalabala

Rank: Member
candle_86 wrote:

Well, honestly, running Far Cry on my Ti 500 vs my FX 5900 Ultra: I verified the Ti 500 runs the SM 1.1 path while the FX 5900 runs the SM2 path on Very High, and they look almost indistinguishable. The water is a little clearer under SM2, but other than the water they look identical.

You don't want to sell me the Ti 500? Please? Pretty please? 😵