VOGONS


GeForce4 vs. GeForce FX?


Reply 60 of 70, by mockingbird

Rank: Oldbie
RandomStranger wrote on 2021-09-03, 05:54:

@o'Doyle
https://www.youtube.com/watch?v=XVO3NJCPIoY

The numbers speak for themselves.

System:
Core 2 E5800 @ 3.2 GHz
2 GB DDR @ 400 MHz, dual channel
Driver 45.23 for nVidia
Driver 10.2 for ATI

Benchmark:
-Quake 3 version 1.32c
-DEMO001
-1280x1024 resolution
-All settings maxed
-For 16-bit tests, color depth and texture quality were changed to 16-bit
-Test run 3 times
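For anyone wanting to reproduce a run like this, the console sequence below is a sketch using standard Quake 3 1.32 cvars; the exact demo filename depends on which demo file your install actually ships.

```
r_mode 8            // 1280x1024
r_colorbits 16      // 16 for the 16-bit runs, leave at 32 (or 0) otherwise
r_texturebits 16    // 16-bit textures for the 16-bit runs
r_picmip 0          // maximum texture detail
vid_restart         // apply the video settings
timedemo 1
demo demo001
```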

Radeon 9700 (Dell OEM)
32-bit
170.7
170.8
170.7

16-bit
173.1
173.3
173.3

GeForce FX 5800 (Quadro 1000 modified with the RivaTuner bootstrap method and clocked to 400/800 with Coolbits)
32-bit
245.4
245.4
245.4

16-bit
290.6
294.1
293.9

Scale these numbers down to the early 2000s, with sub-gigahertz machines: why the heck would someone pay the premium for the ATI product just for DX9 support? The whole selling point was HL2, whose source code was leaked and which shipped well past its announced date anyway. You could get a LOT more longevity out of your system with an FX card.

The Radeon 9700 has a weak VRM compared to the 5800. The 5800 has good-quality, expensive polymer caps that are still good. The 9700 had Nichicon HC-series caps (decent, but not adequate for a modern VGA VRM), which I had already replaced years ago before storing the card.

Even if the 9700 were on par performance-wise with the 5800, look at the massive performance gain achieved by going to 16-bit. And the 9700 had weird graphical glitches in Quake 3 at 16-bit, though the benchmark did complete without issue.

Now one of you mavens please explain to me again how the FX generation of cards was a disappointment, and how great ATI was with their "revolutionary" 9700 series.

7ivtic.png

Reply 61 of 70, by bloodem

Rank: Oldbie

TL;DR
FYI, I got 256 FPS with a Radeon 9700 PRO in Quake 3 @ 1280 x 1024 (all settings maxed out). Granted, the 9700 PRO is faster than the 9700, but definitely not 50% faster. 😀
I'm guessing it's because you are using the much too new Catalyst 10.2 driver (released in 2010), which was optimized for newer, more modern titles.
On the other hand, for the nVIDIA card you are using a more period-correct driver from 2003.

2 x PGA132 / 5 x Socket 3 / 9 x Socket 7 / 8 x SS7 / 12 x Socket 8 / 11 x Slot 1 / 3 x Slot A
5 x Socket 370 / 8 x Socket A / 2 x Socket 478 / 2 x Socket 754 / 3 x Socket 939 / 4 x LGA775 / 1 x LGA1155
Current rig: Ryzen 5 3600X
Backup rig: Core i7 7700k

Reply 62 of 70, by The Serpent Rider

Rank: l33t

Yeah, I still can't believe that nVidia was capable of going that far with cheating, especially in such a reputable benchmark.

Actually, they've both cheated here and there. In 3DMark2001, certain driver versions from Nvidia and ATi simplify the tree-leaf animation; if you look closely, the leaves move jerkily.

Funnily enough, the Nvidia driver that shipped with Windows XP SP3 is one of those "optimized" ones.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 63 of 70, by leileilol

Rank: l33t++

The "nvidia-favoring" Q3 behavior is only due to r_primitives 0's buggy extension detection (done that way because nvidia did CVAs badly in earlier Detonators 🤣). Set it to 1 on ATI (....and on later nvidia cards 🤣)
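For reference, the workaround is a one-line cvar change (in the console or q3config.cfg; the cvar is standard Quake 3):

```
seta r_primitives "1"   // force the rendering path manually instead of the buggy autodetect (0)
```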

apsosig.png
long live PCem

Reply 64 of 70, by mockingbird

Rank: Oldbie
bloodem wrote on 2021-09-03, 18:00:

TL;DR
FYI, I got 256 FPS with a Radeon 9700 PRO in Quake 3 @ 1280 x 1024 (all settings maxed out). Granted, the 9700 PRO is faster than the 9700, but definitely not 50% faster. 😀
I'm guessing it's because you are using the much too new Catalyst 10.2 driver (released in 2010), which was optimized for newer, more modern titles.
On the other hand, for the nVIDIA card you are using a more period-correct driver from 2003.

That sounds about right. Keep in mind that my "9700 TX", as Wikipedia refers to it, only has a 263/263 clock, whereas your fully fledged 9700 PRO is 325/310.

Having said that, I just re-ran my test with Catalyst 7.7 and I got the following results:

183.8
184
184.7

So that's only about a 15 fps increase with the period-correct driver.

Also, my card may be dying. I just noticed artifacting in the Windows XP loading screen, and when I ran the benchmark, Quake had major artifacting (though it did complete without issue). Either it's dying or it doesn't like this much newer system; I should put it in my old BX motherboard to verify.

7ivtic.png

Reply 65 of 70, by cyclone3d

Rank: l33t++
mockingbird wrote on 2021-09-03, 21:22:
That sounds about right. Keep in mind that my "9700 TX" as Wikipedia refers to it only has a 263/263 clock, whereas your fully- […]

It may behave perfectly well on an older system, since it will be CPU-limited there and thus won't be stressing the card as much.

Better cooling may help it on the newer system.

Yamaha YMF modified setupds and drivers
Yamaha XG resource repository - updated November 27, 2018
Yamaha YMF7x4 Guide
AW744L II - YMF744 - AOpen Cobra Sound Card - Install SB-Link Header

Reply 66 of 70, by kjliew

Rank: Oldbie
vorob wrote on 2021-09-02, 06:03:

Is it a problem with nGlide on the FX, or on modern video cards too? Because I thought Zeus fixed absolutely everything in Glide emulation 😀

I don't have that answer. That came from testing QEMU Glide pass-through with nGlide on modern video cards. Since I didn't have the issue with dgVoodoo2 Glide, I assumed the pass-through was OK. The same issue appears with OpenGlide and Zeck's GlideWrapper; otherwise OpenGlide would have been great for Undying. Perhaps the previous poster who played Undying with nGlide on a GeForce4 or FX can check it out for us. It should be quick & easy.

BitWrangler wrote on 2021-09-02, 02:13:

Are rearview mirrors working in driving games? NFS series etc

The rearview mirrors in the NFS series (2 SE, 3 Hot Pursuit, 4 High Stakes & 5 Porsche) don't use the same rendering technique, I believe. Even OpenGlide & Zeck's GlideWrapper work without any issue.

Reply 67 of 70, by swaaye

Rank: Moderator

I feel like chiming in!! 😁

I like the R300 cards for their better 16-bit dithering quality and their superior anti-aliasing and anisotropic filtering. They had NV beaten on AA & AF until G80 came out. Well, unless you like to mess with the hidden NV SSAA modes. And of course, if you want to play Shader Model 2 D3D games, the FX cards aren't very appealing.

For most OpenGL games, and for old-game compatibility in general, it's NVidia. Zeckensack's Glide wrapper (OpenGL) works best on NV. dgVoodoo 1 (D3D9) works rather nicely with ATI R300+. I don't worry much about where they all fit in from a performance standpoint, aside from the FX cards being useless for most PS 2.0 games. Between the GF4 and the FX, I gravitate to the FX cards because they tend to work just as well as a GF4 and also have the D3D9 hardware curiosities to play with.

Oh and the pre-R300 Radeons have even more interesting 16-bit dithering, and so do the R500 Radeons.

Reply 68 of 70, by leileilol

Rank: l33t++

Yeah, pre-R300 Radeons had the option of a Floyd-Steinberg-like dithering as an alternative to the usual vertical-kernel ordered dithering. It can add a film-grain-like look.

nVidia's 16x16 dithering matrix is, well... what the hell is this supposed to be? A Restore Down button? (snapped from my FX5200; it's been the same since the Riva 128)
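The contrast being described can be sketched in code. Below is an illustrative (not hardware-accurate) Python comparison of the two styles: an ordered dither with a fixed Bayer matrix, which tiles a static threshold pattern like the one in the screenshot, versus Floyd-Steinberg error diffusion, which scatters the quantization error and gives that film-grain look. A 4x4 matrix stands in for the real 16x16 kernel; both functions quantize 8-bit grayscale down to four levels.

```python
# Illustrative comparison of ordered (Bayer) dithering vs. Floyd-Steinberg
# error diffusion. Pure Python: 8-bit grayscale in, 4 output levels out.
# The 4x4 matrix below is a stand-in for a real 16x16 hardware kernel.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

LEVELS = 4
STEP = 255 / (LEVELS - 1)  # spacing of the 4 output values: 0, 85, 170, 255

def quantize(v):
    """Snap a clamped 0-255 value to the nearest output level."""
    v = min(255, max(0, v))
    return round(round(v / STEP) * STEP)

def ordered_dither(pixels):
    """Static pattern: add a position-dependent bias, then quantize.
    The bias repeats every 4x4 pixels, so flat areas show a fixed texture."""
    h, w = len(pixels), len(pixels[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            bias = (BAYER_4X4[y % 4][x % 4] / 16 - 0.5) * STEP
            out[y][x] = quantize(pixels[y][x] + bias)
    return out

def floyd_steinberg(pixels):
    """Error diffusion: push each pixel's quantization error onto its
    unvisited neighbors, producing an irregular, grain-like pattern."""
    h, w = len(pixels), len(pixels[0])
    buf = [[float(v) for v in row] for row in pixels]
    for y in range(h):
        for x in range(w):
            old = buf[y][x]
            new = quantize(old)
            buf[y][x] = new
            err = old - new
            if x + 1 < w:
                buf[y][x + 1] += err * 7 / 16
            if y + 1 < h:
                if x > 0:
                    buf[y + 1][x - 1] += err * 3 / 16
                buf[y + 1][x] += err * 5 / 16
                if x + 1 < w:
                    buf[y + 1][x + 1] += err * 1 / 16
    return [[int(v) for v in row] for row in buf]
```

On a flat mid-gray input the ordered version repeats the same 4x4 tile forever (the static matrix look), while the error-diffused version wanders irregularly (the film-grain look).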

Attachments

  • nvdither.png
    Filename
    nvdither.png
    File size
    442 Bytes
    File license
    Fair use/fair dealing exception

apsosig.png
long live PCem

Reply 69 of 70, by The Serpent Rider

Rank: l33t

and anisotropic filtering

Yes and no. While R300 takes more samples for anisotropic filtering, it's noticeably worse at certain angles. The FX series doesn't have that problem.

Get up, come on get down with the sickness
Open up your hate, and let it flow into me

Reply 70 of 70, by swaaye

Rank: Moderator
leileilol wrote on 2021-09-04, 05:27:

Yeah, pre-R300 Radeons had the option of a Floyd-Steinberg-like dithering as an alternative to the usual vertical-kernel ordered dithering. It can add a film-grain-like look.

nVidia's 16x16 dithering matrix is, well... what the hell is this supposed to be? A Restore Down button? (snapped from my FX5200; it's been the same since the Riva 128)

Yeah, NVidia's dithering seems to be the bare minimum to get the job done: a static dithering pattern that can sometimes leave a rather muddy image.

The Serpent Rider wrote on 2021-09-04, 05:49:

Yes and no. While R300 takes more samples for anisotropic filtering, it's noticeably worse at certain angles. The FX series doesn't have that problem.

Yeah, that's true. With the GF6 and GF7, I think NV's optimizations cause more problems; setting them to High Quality mode makes things look better, though. I remember Oblivion needing this to fix the appearance of snow.

With the FX cards it will also depend on the driver release, because NV kept chasing speed with filtering optimizations. From what I gather, the earliest drivers have the best quality.

The Radeon X1000 series has that awesome HQ AF mode, though. From what I've seen it's superior to many of ATI's Radeon HD cards (less aliasing) and similar to G80+.