VOGONS


First post, by rick6

Rank Member

Back in 2002 I had an Athlon XP 2000+ with a Chaintech GeForce 3 Ti 200 128MB overclocked above the clocks of a GeForce 3 Ti 500, and I always wondered if I ever lost anything by not acquiring the GeForce 4 Ti 4200.
I know that most will say GeForce 4 Ti 4200 all the way, but would you notice the difference on the given system with the games of the time (and maybe 2003/2004)?

Judging by this article from Tom's Hardware http://www.tomshardware.com/reviews/big-sister,448-4.html there could be a 5 to 10 fps difference from a Ti 500 to a Ti 4200, and that's not THAT much.

Would you say that a Ti 500 is about 90% of a Ti 4200?

My 2001 gaming beast in all its "Pentium 4 Willamette" glory!

Reply 1 of 6, by Skyscraper

Rank l33t

No.

A Geforce 4 ti 4200 totally destroys a Geforce 3 ti 500.
But you need a faster system than the typical Windows 98 box, and a high resolution, to really see the difference.

Perhaps the 128MB version of the Ti 4200 suffers from its slow default memory clock.
The 64MB version, though, is way faster than a GeForce 3 in just about everything I have benched.

Edit:

I checked some notes and the difference seems to be closer to 20% in newer DX8 games and 10-15% in older DX7 and OpenGL games.
This is with the GF4 Ti 4200 64MB vs a vanilla GF3 64MB @ Ti 500 clocks.

The thing is, the Ti 4200 can handle much, much higher clocks than stock, while the Ti 500 is close to its limits out of the box.

New PC: i9 12900K @5GHz all cores @1.2v. MSI PRO Z690-A. 32GB DDR4 3600 CL14. 3070Ti.
Old PC: Dual Xeon X5690@4.6GHz, EVGA SR-2, 48GB DDR3R@2000MHz, Intel X25-M. GTX 980ti.
Older PC: K6-3+ 400@600MHz, PC-Chips M577, 256MB SDRAM, AWE64, Voodoo Banshee.

Reply 2 of 6, by swaaye

Rank l33t++

Geforce 4 is simply an improved Geforce 3. Somewhat faster per clock tick and clocks higher too. It also has efficiency improvements over the NV2A in Xbox. But practically there really isn't a big difference. GF3 and 4 have essentially the same features and are really in the same class.

From a big picture standpoint, Geforce 256 - Geforce 4 are very similar chips. The same pipeline layout but with gradual enhancements to programmability and efficiency. Geforce FX is a new architecture and that architecture improves through Geforce 7. And then Geforce 8 is the next fresh architecture.

Reply 3 of 6, by Skyscraper

Rank l33t

I agree that the difference wasn't enough to warrant an upgrade back in 2002.
But if I were looking for a "new" card today to be used in a really fast DX7/DX8 build, I would rather go with a GeForce 4 Ti 4*00 than a GeForce 3.
The GeForce 4 Ti 4200 scales very well with overclocking and can (with some luck) reach GF4 Ti 4600 speeds, which is far beyond what you can get out of a GF3.


Reply 4 of 6, by swaaye

Rank l33t++

I prefer to use the 5900 Ultra because ideally I want to run 1600x1200 or 1920x1200, and sometimes adding 2x Quincunx is nice too.

Though a GeForce 4 Ti 4600 should be ok with many games at 1600x1200. GeForce 3 can manage 1600x1200 pretty well in Quake 2-ish games unless there is a lot of effects layering or transparent textures causing a jump in overdraw.

What's amazing is how much faster a 200 MHz GeForce 3 is than a 250 MHz GeForce 2 Ti at 1600x1200. I saw at least a doubled frame rate in Daikatana. The GeForce 3 is so much more efficient.

Reply 5 of 6, by Skyscraper

User metadata
Rank l33t
Rank
l33t

I also use the FX 5900 Ultra, in my Tualatin build.
I usually play at 1024x768 or 1280x1024/960, but I like headroom.


Reply 6 of 6, by rick6

Rank Member
Skyscraper wrote:

The thing is, the Ti 4200 can handle much, much higher clocks than stock, while the Ti 500 is close to its limits out of the box.

Yes, true, and I've thought of that. My burning question was actually whether a GeForce 3 Ti 500 is able to keep up, even if only barely, with a stock Ti 4200. Well, reading your answers, it really depends. The most obvious answer is no, at least not in raw power, but it doesn't fall all that far behind either.

I remember reading a while ago that the main difference between a GeForce 3 and a GeForce 4, apart from clocks, was that the GeForce 3 GPU supported Pixel Shader 1.1 while the GeForce 4 GPU supported Pixel Shader 1.3, which is almost the same.
So everything the GeForce 4 could run, the GeForce 3 could too, even if not as fast. I guess it's when the FX series came onto the scene that things changed a bit, with support for Pixel Shader 2.0. I believe that's why the GeForce 3 and 4 couldn't run most of the 3DMark 2003 tests at the time, even though the GeForce 4 Ti 4600 had the same or even more raw power than most of the FX series lineup, making those 3DMark 2003 scores quite unfair, to be honest.
