VOGONS


GeForce 4 vs. GeForce FX?


Reply 120 of 217, by Joseph_Joestar

Rank l33t
mockingbird wrote on 2022-01-28, 21:10:

How did you accomplish that, with ATI Tool or registry hacks?

I used the Rage3D Tweak utility with the following settings: WFog = disabled, ZFog = enabled, TableFog = enabled.

That was the only combination of settings that produced table fog on my Radeon 9000 Pro under Win98.
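
For context, "table fog" is per-pixel fog that the hardware computes from a fog table, as opposed to per-vertex fog. Below is a minimal sketch of how a Direct3D game requests it (D3D9 names shown; DX6/7-era titles use the equivalent D3DRENDERSTATE_* constants). As I understand it, the WFog/ZFog switches in Rage3D Tweak control whether the card indexes that table by w or by z depth.

#include <d3d9.h>

// Enable linear pixel (table) fog; per-vertex fog stays off.
void EnableTableFog(IDirect3DDevice9* dev, float start, float end)
{
    dev->SetRenderState(D3DRS_FOGENABLE, TRUE);
    dev->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
    dev->SetRenderState(D3DRS_FOGVERTEXMODE, D3DFOG_NONE);
    dev->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(128, 128, 128));
    // The D3D API passes float render states through a DWORD reinterpret.
    dev->SetRenderState(D3DRS_FOGSTART, *reinterpret_cast<const DWORD*>(&start));
    dev->SetRenderState(D3DRS_FOGEND,   *reinterpret_cast<const DWORD*>(&end));
}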

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 121 of 217, by Joseph_Joestar

Rank l33t

I had been using a GeForce 4 Ti4200 for a couple of years, and recently switched to a GeForce FX 5900XT, so I thought I'd share some of my experiences.

The FX card provides slightly better image quality over VGA. This is most noticeable when playing older games which run at a fixed resolution of 640x480 on a 19" LCD monitor. In contrast, when using DVI and playing newer games which can run at the monitor's native resolution, both cards deliver the same image quality.

There are some issues with texel alignment on the FX card which don't occur on the GeForce 4. This results in corrupted text and other minor graphical glitches in certain games.

Anti-aliasing and anisotropic filtering have less of an impact on frame rate on the FX card. Obviously, it is much faster than the GeForce 4, but even taking that into account, it seems to handle AA and AF more efficiently.

Some games like Blood 2 and Shogo don't run well with 45.23 drivers on the FX card, and need a newer driver version to display the menu screens correctly. There are no such issues with the GeForce 4 card when using the same 45.23 driver version.

TL;DR
You get slightly lower compatibility in exchange for better image quality and higher performance when upgrading from a GeForce 4 to an FX card.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 122 of 217, by Sombrero

Rank Oldbie
Joseph_Joestar wrote on 2022-12-15, 13:19:

TL;DR
You get slightly lower compatibility in exchange for better image quality and higher performance when upgrading from a GeForce 4 to an FX card.

NVIDIA had some image quality issues with their earlier cards. They evidently improved things after the GeForce2, but could the image quality still vary between cards in the GeForce4 era, even if to a lesser degree?

The only FX card I've used is a 5200, and its VGA output isn't great, though I can't compare it to anything other than a Voodoo3. I'd hope GeForce4/FX cards in general have better IQ than the card I have.

Reply 123 of 217, by Joseph_Joestar

Rank l33t
Sombrero wrote on 2022-12-15, 13:42:

NVIDIA had some image quality issues with their earlier cards. They evidently improved things after the GeForce2, but could the image quality still vary between cards in the GeForce4 era, even if to a lesser degree?

The only FX card I've used is a 5200, and its VGA output isn't great, though I can't compare it to anything other than a Voodoo3. I'd hope GeForce4/FX cards in general have better IQ than the card I have.

It might be due to manufacturer differences. My GeForce 4 Ti4200 is made by Gainward while the GeForce FX 5900XT is from Leadtek.

However, I remember reading somewhere that the FX line of cards was when Nvidia started actually caring about their VGA image quality. Maybe someone else can clarify that further.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 124 of 217, by Gmlb256

Rank l33t
Sombrero wrote on 2022-12-15, 13:42:
Joseph_Joestar wrote on 2022-12-15, 13:19:

TL;DR
You get slightly lower compatibility in exchange for better image quality and higher performance when upgrading from a GeForce 4 to an FX card.

NVIDIA had some image quality issues with their earlier cards. They evidently improved things after the GeForce2, but could the image quality still vary between cards in the GeForce4 era, even if to a lesser degree?

The only FX card I've used is a 5200, and its VGA output isn't great, though I can't compare it to anything other than a Voodoo3. I'd hope GeForce4/FX cards in general have better IQ than the card I have.

The image quality issues weren't nVidia's fault though; the same thing happened with other brands. Most manufacturers made a mess of the filtering components around the RAMDAC output, and this problem didn't completely disappear until those components were fully integrated into the GPU chip.
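
To put a rough number on why those filter components matter: each output stage behaves like an RC low-pass with cutoff f_c = 1/(2*pi*R*C), and if that lands below the pixel clock, fine detail gets smeared. The component values below are invented purely for illustration:

#include <cstdio>

int main()
{
    // Hypothetical values, for the sake of the arithmetic; real boards vary.
    const double PI = 3.14159265358979;
    const double R  = 75.0;     // ohms: typical video line impedance
    const double C  = 22e-12;   // farads: an overly large filter capacitor
    const double fc = 1.0 / (2.0 * PI * R * C);
    printf("filter cutoff: %.0f MHz\n", fc / 1e6);   // ~96 MHz
    // 1600x1200@85Hz needs a pixel clock around 229 MHz, so a ~96 MHz
    // cutoff visibly smears fine detail at that resolution.
    return 0;
}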

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 125 of 217, by Kahenraz

Rank l33t

Output quality problems tend to be much more noticeable on modern LCDs, where they may have gone unnoticed or been invisible on CRT monitors. I don't think it's fair to compare the output quality of later generation cards when DVI models were also available.

For comparison, I have found VGA issues that appear when viewed on an LCD across Matrox, 3dfx, Nvidia, and ATI cards, without any consistency.

Anecdotally, however, I have generally found the VGA output on FX series cards to be quite good when viewed on an LCD. Come to think of it, I'm having a hard time recalling any problems with them.

Reply 126 of 217, by swaaye

Rank l33t++

That could also be the LCD monitor's fault though. That VGA input is just some more el cheapo circuitry and an ADC in front of the LCD's digital interface. VGA is a really unfortunate interface for an LCD.

Reply 127 of 217, by Joseph_Joestar

Rank l33t
swaaye wrote on 2022-12-15, 18:54:

That could also be the LCD monitor's fault though. That VGA input is just some more el cheapo circuitry and an ADC in front of the LCD's digital interface. VGA is a really unfortunate interface for an LCD.

Interesting. Does this mainly apply to (relatively) modern LCD monitors which still have VGA input, or is the same true for period-correct LCD monitors from the mid 2000s as well?

And while I agree that this does play a role, it doesn't fully explain how the same monitor using the same VGA cable can produce a cleaner image with an FX card compared to a GeForce 4.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 128 of 217, by swaaye

Rank l33t++
Joseph_Joestar wrote on 2022-12-15, 20:52:

Interesting. Does this mainly apply to (relatively) modern LCD monitors which still have VGA input, or is the same true for period-correct LCD monitors from the mid 2000s as well?

All LCDs are natively digital. They have/had VGA inputs because PCs always had VGA output, while DVI/HDMI was rare (on budget boxes) until fairly recently. LCDs with only VGA are just cost-cutting, because DVI was a luxury.

Notebooks use a digital interface for the LCD.

There are also CRTs with DVI. This moves DAC duties off the video card and into the monitor itself, which may or may not be better.
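
To make the analog detour concrete, here's a toy simulation (all numbers invented for illustration) of what the VGA path does to a single pixel: the card's DAC converts an 8-bit value into a voltage, the cable and filters add a little noise, and the monitor's ADC has to reconstruct the original value:

#include <cstdio>
#include <cstdlib>

int main()
{
    const double vmax = 0.7;   // VGA full-scale video level is 0.7 V
    srand(1);
    for (int pixel = 0; pixel <= 255; pixel += 51) {
        double analog = pixel / 255.0 * vmax;            // card's DAC
        double noise  = ((rand() % 11) - 5) / 1000.0;    // +/-5 mV of cable/filter noise (made up)
        int sampled   = (int)((analog + noise) / vmax * 255.0 + 0.5);  // monitor's ADC
        if (sampled < 0)   sampled = 0;
        if (sampled > 255) sampled = 255;
        printf("sent %3d  received %3d\n", pixel, sampled);
    }
    return 0;
}

A DVI link skips both conversions entirely, which is why it sidesteps the whole problem.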

Reply 129 of 217, by Kahenraz

Rank l33t
Joseph_Joestar wrote on 2022-12-15, 20:52:

Interesting. Does this mainly apply to (relatively) modern LCD monitors which still have VGA input, or is the same true for period-correct LCD monitors from the mid 2000s as well?

There are older MultiSync monitors that, I think, sync to a wider range of video modes than modern LCDs. This is typically only an issue with DOS games and some oddball digital outputs.

Joseph_Joestar wrote on 2022-12-15, 20:52:

And while I agree that this does play a role, it doesn't fully explain how the same monitor using the same VGA cable can produce a cleaner image with an FX card compared to a GeForce 4.

How large is your sample size for comparison? A single GeForce 4?

Reply 130 of 217, by Joseph_Joestar

Rank l33t
Kahenraz wrote on 2022-12-16, 01:33:

How large is your sample size for comparison? A single GeForce 4?

I've tried several graphics cards on that monitor over time, and the image quality results are as follows:

Matrox G400 > GeForce FX 5900XT = ATi Radeon 9550 > Voodoo 3 > Matrox Millennium II > GeForce 4 Ti4200 > GeForce 4 MX440 > GeForce 2 MX400 > TNT2 M64 > S3 Virge DX.

The Matrox G400 provides the best image quality over VGA, followed closely by the GeForce FX 5900XT and the Radeon 9550, which are roughly the same. The Voodoo 3 and the Millennium II are also very good. After that, image quality begins to decline progressively.

For reference, the monitor is a 19" LG Flatron L1953HR.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 131 of 217, by Warlord

Rank l33t

Cables do matter, and so do monitors. I'd give the Voodoo 3 a higher score; it's right up there with Matrox. One problem the FX 5900XT has, at least on my card, is that over DVI, opening a Command Prompt under 98SE with the 45.23 drivers blacks out the whole screen, to the point that the only fix is a hard reset. VGA works fine though. I think that problem is fixed in the 5x.xx drivers.

Reply 132 of 217, by Joseph_Joestar

Rank l33t
Warlord wrote on 2022-12-16, 06:31:

Cables do matter, and so do monitors. I'd give the Voodoo 3 a higher score; it's right up there with Matrox.

The image quality differences between the first five cards on my list are minuscule. Matrox has a slight edge at higher resolutions like 1280x1024, but at anything lower, the output of all five cards is very sharp and fairly similar. Minor image quality degradation starts with my GeForce 4 Ti4200 and gets worse from there.

One problem the FX 5900XT has, at least on my card, is that over DVI, opening a Command Prompt under 98SE with the 45.23 drivers blacks out the whole screen, to the point that the only fix is a hard reset. VGA works fine though. I think that problem is fixed in the 5x.xx drivers.

I've experienced this sometimes as well, but I can't reproduce it reliably. Meaning, it happens rarely and randomly for me.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 133 of 217, by dr.zeissler

Rank l33t

I recommend the Quadro FX 700: it's a passively cooled card with DX9 capabilities and 128MB on a 128-bit memory interface. An excellent card for a Shuttle XPC.

Retro-Gamer 😀 ...on different machines

Reply 135 of 217, by NostalgicAslinger

Rank Member

I prefer the GeForce 4 Ti4200 with TSOP DDR memory, specifically the 128MB version rather than the 64MB one. All 128MB Ti4200 versions have a lower memory clock of 444MHz, so you need to overclock to 513MHz to match the 64MB versions. That's not a problem, since 4ns memory is rated for 500MHz, and all GeForce4 Ti4200 128MB cards should have such 4ns DDR memory chips. I have also seen cards with 3.8ns chips, but you should assume 4ns.
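
As a sanity check on those figures: a DDR chip's rated clock follows directly from its access time, and the effective (DDR) rate is double that. A quick sketch:

#include <cstdio>

int main()
{
    const double ratings_ns[] = {4.0, 3.8, 2.8};  // chip ratings mentioned in this thread
    for (double t : ratings_ns) {
        double clock_mhz = 1000.0 / t;       // real clock in MHz
        double ddr_mhz   = 2.0 * clock_mhz;  // effective (marketing) DDR rate
        printf("%.1f ns -> %.0f MHz clock, %.0f MHz DDR effective\n",
               t, clock_mhz, ddr_mhz);
    }
    return 0;
}

So 4ns chips are rated for 500MHz effective, which makes the 444-to-513MHz bump only a mild push, and the 2.8ns BGA chips on the Ti4600 (see below) are comfortable at 700MHz.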

No One Lives Forever 2 with maximum quality settings, for example, has better frame times with the 128MB version. I use an Asus Ti4200 128MB on my Pentium III 1GHz@1.1GHz Coppermine system with driver version 30.82 on 98 SE, and I'm currently playing No One Lives Forever 2, a nice DX 8.1 game, at 1280x1024x32. Back in 2002, I also played this game on a 4200 128MB card, but on an Athlon XP 1800+ Socket A machine. A fast PIII Coppermine has no problem with this game at the highest detail settings; only the character shadows are set to medium.

I also have an Asus GF4 Ti4600, which I use only for benchmarking. I have read somewhere on the internet that early memory chips in BGA packages (especially Samsung ones) could die faster than the older TSOP packages. The Asus Ti4600 has 2.8ns Samsung BGA memory and runs fine with a 700MHz memory clock; I haven't tested it any faster. The GeForce4 Ti4400/Ti4600 series, as well as the MX460, were the first Nvidia cards with BGA memory.

Reply 137 of 217, by RandomStranger

Rank Oldbie
2mg wrote on 2023-05-28, 15:25:

Slightly offtopic, but why not skip the FX for the GF6 series?

They seem to work in W98, are still P4-era, and support palette/fog...

The GF6 dropped a lot of legacy features (I don't think it supports either of those) and requires even more modern drivers, which don't work with certain games. Even the second-generation FX cards, which don't work with the 40-series drivers, sometimes have issues.


Reply 138 of 217, by Gmlb256

Rank l33t

^ True, later nVidia drivers for Windows 9x had stability issues too.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce3 Ti 200 64 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 139 of 217, by Joseph_Joestar

Rank l33t
2mg wrote on 2023-05-28, 15:25:

Slightly offtopic, but why not skip the FX for the GF6 series?

They seem to work in W98, are still P4-era, and support palette/fog...

GeForce 6 series dropped support for paletted textures. It only supports table fog.
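
On the OpenGL side, paletted textures are exposed through the GL_EXT_paletted_texture extension, which stops appearing in the extension string once support is dropped; a minimal sketch of the runtime check a game would perform:

#include <GL/gl.h>
#include <cstring>

// Returns true if the driver advertises 8-bit paletted textures.
// Requires a current OpenGL context.
bool HasPalettedTextures()
{
    const char* ext = reinterpret_cast<const char*>(glGetString(GL_EXTENSIONS));
    return ext && std::strstr(ext, "GL_EXT_paletted_texture") != nullptr;
}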

Also, as others have mentioned, the newer drivers these cards require have compatibility issues with certain games. More info on that here: NVIDIA GeForce FX driver testing on an Intel 440EX summary and report

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi