GeForce 4 vs. GeForce FX?


Reply 40 of 217, by Kahenraz

User metadata
Rank l33t

Those are some very impressive lows for what should have been an average comparison between generations.

Reply 41 of 217, by kjliew

User metadata
Rank Oldbie
DrLucienSanchez wrote on 2021-09-01, 13:04:

Well, if you're sold on an FX for nGlide, then you appear to be as confused as I am.

Ti4200 installed, 45.23 drivers, 98SE, DirectX 9, nGlide 2.10, no other cards present, no PCI Voodoo etc., just the Ti4200. Launched Undying with nGlide, and it worked. No idea how, but it did; if anyone can clarify this or give an explanation, that would be good. I couldn't switch to 32-bit colours, and the lamps in the game are glowing, which I believe doesn't happen in D3D, so unless it's a convincing bug and I'm actually running in D3D mode while somehow getting some of Glide's benefits, I don't know what to say.

But regardless, the FX5900 is my overall recommendation.

*Edit* I thought I'd then test the MX440 128MB AGP: it launched, but was just a mess of graphical artifacts when the Glide API was selected; no discernible image, just a blue screen with blocks.

So it looks like nGlide is compatible with the GeForce 4 Ti as well as the FX (latest nGlide, under 98SE). If anyone else can test this too, that would be interesting.

(images below taken with the Ti4200)

There is a minor problem with nGlide in Undying -- the mirror has no reflection. On the first level, after meeting Jeremiah, you can go into a room and stare at the mirror, and the ghost shows up in it, sending a cold shiver down one's neck. 😉 So far, only dgVoodoo2 Glide renders the mirror reflection correctly; the rest don't. You can also check the Direct3D renderer, since WineD3D renders the mirror reflection correctly on QEMU VMs too. This is a fantastic early moment for a horror-themed game, and you would miss it if you played with nGlide.

The attachment Undying_mirror.png is no longer available

Reply 42 of 217, by BitWrangler

User metadata
Rank l33t++

Are rear-view mirrors working in driving games? The NFS series, etc.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 43 of 217, by Kahenraz

User metadata
Rank l33t
kjliew wrote on 2021-09-02, 02:04:

There is a minor problem with nGlide in Undying -- the mirror has no reflection.

I've never played this game; can you clarify? You said there was no reflection, then posted a screenshot with a reflection. Is this the way it's supposed to look, or is it wrong? Do you mean that there is no reflection of the GHOST you mention later?

Reply 44 of 217, by kjliew

User metadata
Rank Oldbie
Kahenraz wrote on 2021-09-02, 03:31:
kjliew wrote on 2021-09-02, 02:04:

There is a minor problem with nGlide in Undying -- the mirror has no reflection.

I've never played this game; can you clarify? You said there was no reflection, then posted a screenshot with a reflection. Is this the way it's supposed to look, or is it wrong? Do you mean that there is no reflection of the GHOST you mention later?

The screenshot shows the correct rendering, from dgVoodoo2 Glide or WineD3D on QEMU VMs. 😉 I could also post the screenshot with the GHOST if you would like to see that. 😁 The point is that one would miss some aspects of the horror-game experience if the game were played with nGlide, due to the missing mirror reflection.

Reply 45 of 217, by Kahenraz

User metadata
Rank l33t

I just mean that I have no frame of reference. Does it reflect the scene but not the player?

Can you provide a picture of it rendering incorrectly?

Reply 46 of 217, by kjliew

User metadata
Rank Oldbie

Looking into the mirror, you would just see black.

Reply 47 of 217, by vorob

User metadata
Rank Oldbie

Is this a problem with nGlide on the FX, or also on modern video cards? Because I thought Zeus had fixed absolutely everything in Glide emulation 😀

Reply 48 of 217, by GL1zdA

User metadata
Rank Oldbie
Desomondo wrote on 2021-09-01, 23:18:

Some quick and dirty benchmarks if you are still interested. Unfortunately the FX series card I had was only a 5600, not the Ultra as I had originally thought.

Thanks for these benchmarks. They made me look at the theoretical performance of these chips, and the numbers back the benchmarks up. The core clocks are from Wikipedia; the performance figures are based on what I believe the configurations to be, especially for the 5200/5600 chips, whose vertex performance doesn't match the core config Wikipedia shows (either they have 1 VS, or 2 VSs at half throughput; if anyone has a reliable source for this, please post it).

| Chip            | Code | Core | Config | MPix/s | MTex/s | MVert/s |
|-----------------|------|------|--------|--------|--------|---------|
| Ti4200 | NV25 | 250 | 4:2:8 | 1000 | 2000 | 125.00 |
| Ti4400 | NV25 | 275 | 4:2:8 | 1100 | 2200 | 137.50 |
| Ti4600 | NV25 | 300 | 4:2:8 | 1200 | 2400 | 150.00 |
| FX 5200 | NV34 | 250 | 4:1:4 | 1000 | 1000 | 62.50 |
| FX 5200 Ultra | NV34 | 325 | 4:1:4 | 1300 | 1300 | 81.25 |
| FX 5500 | NV34 | 270 | 4:1:4 | 1080 | 1080 | 67.50 |
| FX 5600 | NV31 | 325 | 4:1:4 | 1300 | 1300 | 81.25 |
| FX 5600 Ultra | NV31 | 350 | 4:1:4 | 1400 | 1400 | 87.50 |
| FX 5600 Ultra 2 | NV31 | 400 | 4:1:4 | 1600 | 1600 | 100.00 |
| FX 5700 | NV36 | 425 | 4:3:4 | 1700 | 1700 | 318.75 |
| FX 5700 Ultra | NV36 | 475 | 4:3:4 | 1900 | 1900 | 356.25 |
| FX 5800 | NV30 | 400 | 4:2:8 | 1600 | 3200 | 200.00 |
| FX 5800 Ultra | NV30 | 500 | 4:2:8 | 2000 | 4000 | 250.00 |
| FX 5900 | NV35 | 400 | 4:3:8 | 1600 | 3200 | 300.00 |
| FX 5900 Ultra | NV35 | 450 | 4:3:8 | 1800 | 3600 | 337.50 |
| FX 5950 Ultra | NV38 | 475 | 4:3:8 | 1900 | 3800 | 356.25 |

Config = PS : VS : TMU (pixel pipelines : vertex shaders : texture units)
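
If anyone wants to sanity-check the table, the arithmetic is just clock × config. Here's a minimal Python sketch: the pixel and texel rates are pipelines × clock and TMUs × clock, and the vertex-rate divisor of 4 (one vertex per VS unit every four clocks) is inferred from the numbers above rather than from any official spec.

```python
# Theoretical rates from core clock (MHz) and PS:VS:TMU config.
# Assumption: MVert/s = VS * clock / 4 -- the quarter-rate-per-unit
# figure implied by the table, not an official nVidia number.

def theoretical_rates(clock_mhz, ps, vs, tmu):
    """Return (MPix/s, MTex/s, MVert/s) for one chip."""
    mpix = ps * clock_mhz        # one pixel per pipeline per clock
    mtex = tmu * clock_mhz       # one texel per TMU per clock
    mvert = vs * clock_mhz / 4   # one vertex per VS unit per 4 clocks
    return mpix, mtex, mvert

chips = {
    "Ti4200":        (250, 4, 2, 8),
    "FX 5600 Ultra": (350, 4, 1, 4),
    "FX 5900":       (400, 4, 3, 8),
}

for name, (clock, ps, vs, tmu) in chips.items():
    mpix, mtex, mvert = theoretical_rates(clock, ps, vs, tmu)
    print(f"{name}: {mpix} MPix/s, {mtex} MTex/s, {mvert:.2f} MVert/s")
```

The printed figures match the corresponding rows in the table.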


Reply 49 of 217, by Kahenraz

User metadata
Rank l33t

Wow. The texture fill rate on anything less than the 5800 is terrible compared to the GeForce 4 Ti series. I'm also surprised to see how similar the 5200 and 5600 are. It basically comes down to clock speed.

Reply 50 of 217, by The Serpent Rider

User metadata
Rank l33t++

I'm also surprised to see how similar the 5200 and 5600 are. It basically comes down to clock speed.

The 5200 was stripped of the FX features that optimize memory bandwidth and fill rate.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 51 of 217, by Ydee

User metadata
Rank Oldbie
BitWrangler wrote on 2021-09-01, 17:20:

Have you got onboard graphics on those two boards? Just thinking that switching to an onboard GF2 or VIA UniChrome might give more range downwards, for '99 and prior.

Yes, it's actually a KM400A board, which is basically a KT400A with S3 UniChrome graphics. But it's too weak for gaming: one pipeline, no T&L, shared memory. It's destined to be an office and 2D machine. I have a dedicated S3 Savage4 for the few games that support the S3 MeTaL API, and a Voodoo3 for Glide games. Now I need to decide which card of the two generations (GF4 and GF FX) to choose for D3D and DirectX under W98SE.

pentiumspeed wrote on 2021-09-01, 17:38:

What is this game?
Cheers,

Mainly games up to DX8.1, say from 1997-2001. A random sample: MoH, CoD, NFS, NOLF, HL, RTCW, etc.

Reply 52 of 217, by vorob

User metadata
Rank Oldbie

I'm a notebook guy; I've got a Dell D800 with a GF Ti200 and a Toshiba 5205-705 with a GF 5600FXgo. For me they are pretty close, but I go for the FX one because of the DX9 features. Yes, they are slow, and yes, you can't get good fps in some games, but it's still better than not having those features at all. For example, I can play DOOM 3 on both of these laptops at, let's say, an acceptable framerate. But the GF4 doesn't render the shader effects on windows, while the GF FX has them, and they are nice.

And again, the FX gives nGlide support, and this enriches the game library with Glide titles 😀

Reply 53 of 217, by Kahenraz

User metadata
Rank l33t

I found a great article showing that the GeForce 4 Ti 4200 easily beat the FX 5600 at release.

https://techreport.com/review/5103/nvidias-ge … ce-fx-5600-gpu/

If I'm not mistaken, I believe the FX 5600 Ultra was positioned as the successor to the Ti 4200. But I'm not so sure a small boost in clock speed is enough to close the gap shown in this article.

Reply 54 of 217, by BitWrangler

User metadata
Rank l33t++

Yah, I skipped the whole FX nonsense because of how sad it looked at release back in the day, and then ATI acted like they were the only real DX9 game in town and kept their prices really jacked. Then nV were playing catch-up with the 6xxx and ATI didn't seem to try real hard but stayed up there. I guess by the 7xxx nV was finally making headway, but they weren't gonna sell them cheap by then anyway, and I guess I got a case of the friggitalls; with games being all online activation or subscription model by then, I lost interest.

Unicorn herding operations are proceeding, but all the totes of hens teeth and barrels of rocking horse poop give them plenty of hiding spots.

Reply 55 of 217, by leileilol

User metadata
Rank l33t++

Don't forget the 3dmark2003 driver controversy.

long live PCem

Reply 56 of 217, by Gmlb256

User metadata
Rank l33t
BitWrangler wrote on 2021-09-03, 04:01:

Yah, I skipped the whole FX nonsense because of how sad it looked at release back in the day, and then ATI acted like they were the only real DX9 game in town and kept their prices really jacked. […]

I wouldn't really bother with a GeForce FX card; nVidia lost the leadership there and didn't get it back until the GeForce 8800.

leileilol wrote on 2021-09-03, 04:04:

Don't forget the 3dmark2003 driver controversy.

Yeah, I still can't believe that nVidia was capable of going that far with cheating, especially with such a reputable benchmark.

VIA C3 Nehemiah 1.2A @ 1.46 GHz | ASUS P2-99 | 256 MB PC133 SDRAM | GeForce2 GTS 32 MB | Voodoo2 12 MB | SBLive! | AWE64 | SBPro2 | GUS

Reply 57 of 217, by mockingbird

User metadata
Rank Oldbie
BitWrangler wrote on 2021-09-03, 04:01:

Yah, I skipped the whole FX nonsense because of how sad it looked at release back in the day, and then ATI acted like they were the only real DX9 game in town and kept their prices really jacked. […]

A gimmick is what put ATI on par with nVidia when the Radeon came out.

The fact is that, because of the Radeon's design, the performance penalty of using 32-bit color is not ameliorated by dropping to 16-bit color.

An FX 5800 or better will trounce the 9700 Pro in 16-bit.

So couple that with "blowergate", and that's why the GeForce FX did not do well. And yet the FX's cooling requirements are lower than those of an equivalent Radeon (that is to say, they run cooler). Pushing the cards too hard, at the cost of an outrageous cooling solution, was a blunder on their part.

Looking back, however, I would gladly give up "real" DirectX 9 support for the raw performance advantage of the GeForce FX. I would also gladly drop to 16-bit color if I could play at 1600x1200, as opposed to being stuck at lower resolutions for the "privilege" of 32-bit color.
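
To put rough numbers on that trade-off, here's a back-of-the-envelope sketch: it counts only one color write plus one Z write per pixel, pairs the Z-buffer depth with the color depth, and ignores overdraw, blending read-backs, texture traffic, and the bandwidth-compression tricks both vendors used.

```python
# Raw framebuffer write traffic: pixels/s * bytes per pixel (color + Z).
# Ballpark only -- ignores overdraw, read-modify-write, and textures.

def fb_traffic_mb_s(width, height, fps, bytes_per_px):
    """Megabytes per second of color+depth writes."""
    return width * height * fps * bytes_per_px / 1e6

for (w, h) in [(1024, 768), (1600, 1200)]:
    for depth in (16, 32):
        color = depth // 8   # 2 bytes at 16-bit, 4 bytes at 32-bit
        z = color            # Z-buffer depth commonly paired with color depth
        mb = fb_traffic_mb_s(w, h, 60, color + z)
        print(f"{w}x{h} @ 60 fps, {depth}-bit: {mb:.0f} MB/s")
```

By this crude measure, 1600x1200 in 16-bit generates about half the raw framebuffer traffic of 1600x1200 in 32-bit, and not much more than 1024x768 in 32-bit.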

That marketing nonsense is also what cut into 3dfx's profit margins at the time. All the shills kept attacking 16-bit color... "Oh no, it's the end of the world, we are stuck with 16-bit color"... Yeah, 3dfx going bankrupt was far more preferable. The shills who ran the hardware sites back in the day know who they are, and they're still around.

Not to mention that Radeons of that era have a much higher failure rate.

I've purchased 5 Quadro 1000 cards (FX 5800). They all work perfectly.


Reply 58 of 217, by RandomStranger

User metadata
Rank Oldbie

@mockingbird
https://youtu.be/uq-v1TTUyhM


Reply 59 of 217, by Putas

User metadata
Rank Oldbie
Kahenraz wrote on 2021-09-03, 03:53:

I found a great article showing that the GeForce 4 Ti 4200 easily beat the FX 5600 at release.

https://techreport.com/review/5103/nvidias-ge … ce-fx-5600-gpu/

If I'm not mistaken, I believe the FX 5600 Ultra was positioned as the successor to the Ti 4200. But I'm not so sure a small boost in clock speed is enough to close the gap shown in this article.

Depends on the settings, as usual.
The 5600 Ultra's gain in memory clock is not small.