VOGONS


First post, by pentiumspeed

User metadata
Rank l33t
Kahenraz wrote on 2022-06-13, 18:49:

I think this is a mistake. The NV18 is the GeForce 4 MX 440 AGP x8. The NV18C is the MX 4000.

Also, the defining factor between the FX 5200 and the 5600 isn't the clock speed but rather the lack of color and Z compression. This was a terrible bastardization in the name of product segmentation. The 5200 should never have existed and the 5600 should have been the 5200. The 5200 Ultra basically cranks the clocks up to 11 to try and compensate for its own self crippling design.

https://techreport.com/review/5257/nvidias-ge … 5200-ultra-gpu/

No color or Z compression – Unlike the rest of the NV3X line, NV34 can’t do color or Z compression. The lack of color compression should hamper the chip’s performance primarily with antialiasing enabled, but the lack of Z compression will hurt across the board. Without advanced lossless compression schemes, NV34 doesn’t make as efficient use of the bandwidth it has available, which reduces the chip’s overall effective fill rate (or pixel-pushing power).

The 5600 was a good budget card and the 5600 Ultra was competitive with the GeForce 4 Ti 4600. The 5200 is fine. I like the 5200. It's similar to the GeForce 4 MX but has shader capability. This can be desirable. But it's such a bizarre product.

The absolute worst offense is the GeForce FX 5200 with a 64-bit bus. I don't say this often, but this card is truly garbage. I wonder if a GeForce 2 MX would be faster.

I re-read the wiki about the GeForce FX series; it glosses over things like a marketing blurb, with the technical details left out. What's quoted above is *exactly* the kind of detail I wanted to know.

By the way, what are the differences between the FX 5700 and the FX 5800 and 5900 series?

Did the FX 5700 get a number of hardware changes? Also, what makes the FX 5700 *require* a newer driver that breaks compatibility with older games?

I'm beginning to understand why people regard the FX 5xxx series as a set of bastardized chipset changes compared to what came before and after.

Thanks and cheers,

Great Northern aka Canada.

Reply 1 of 23, by Kahenraz

User metadata
Rank l33t

The only major difference between the 5700, 5800, and 5900 et al. is how many processing units they have. I've read about some so-called "improvements" made to the silicon between versions, but it's all pretty marginal. Once you get past the nonsense that is the 5200/Ultra/5500, performance scales pretty linearly.

The most important thing to look out for is any card with a 64-bit bus. Just avoid those, except out of curiosity; they can be found in the 5200, 5600, and 5700 series. All 5800s are 128-bit, and all 5900s are 256-bit. DDR, GDDR2, and GDDR3 were all used throughout, but for the FX series the differences amount to little outside of marketing and can be ignored. The only time memory had a profound effect was on the 5200 Ultra, because of all the bandwidth wasted by its lack of compression.
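To see why the bus width matters so much more than the memory type here, a rough back-of-envelope sketch: peak bandwidth is bus width (in bytes) times effective data rate. The clock figures below are illustrative round numbers I've chosen for the example, not measurements from any specific card.

```python
# Rough theoretical peak memory bandwidth: (bus width in bytes) x (effective data rate).
# Clock values below are illustrative assumptions, not exact card specs.

def bandwidth_gbs(bus_bits: int, mem_clock_mhz: float, pumps: int = 2) -> float:
    """Peak bandwidth in GB/s for a DDR-style bus (two transfers per clock)."""
    return (bus_bits / 8) * (mem_clock_mhz * pumps) * 1e6 / 1e9

# A 64-bit card at 200 MHz DDR vs. a 128-bit card at the same clock:
print(bandwidth_gbs(64, 200))    # 3.2 GB/s
print(bandwidth_gbs(128, 200))   # 6.4 GB/s
# A 256-bit card at 425 MHz DDR:
print(bandwidth_gbs(256, 425))   # 27.2 GB/s
```

Halving the bus halves the bandwidth outright, which no realistic memory-clock bump on these cards could recover.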

I've heard a lot of people describe driver version 45.23 as the "most compatible". I've had good results testing up through 56.64 and haven't seen any use case where 45.23 had an advantage, compatibility or otherwise; it's after that version where things start to break. If you want to use the earliest 45.23, then you have access only to the 5200, 5600, 5800, and 5900. Many variants of the 5900 series, including the 5950, will work with the 5900 drivers. I think some XT cards have problems with these older drivers, but I can't recall for certain. Ultra variants will probably work as well if forced to use their lower-specced counterpart, since clocks are part of their BIOS and not set by the drivers. The only cards that can't be used with the earliest drivers are the 5700 series.

Any 5900 Z/S/XT/Ultra is a good pick, and I would recommend picking one up if you can find one for under $100. The variations within the 5900 series are extremely minor, and there isn't much of a difference unless you compare the slowest 5900 against the fastest 5950 Ultra. Even that gap can easily be closed with an overclock.

The FX 5200 series is, in my opinion, an excellent budget card. You can also find them quite cheap, with a great board layout, as the Quadro FX 500. They are not fast, but when paired with a slower CPU you are often no longer GPU-bound and simply gain a nice set of extra features. The FX 5600 is a better 5200 and the only low-end card worth considering if your CPU is fast enough to be paired with it. The 5700 is a bit faster, but it's a luxury improvement; they can be expensive, and something like the Radeon 9600 would be both faster and cheaper. The 5800 is not that special, but it is iconic: a collector's item, and hard to get. The 5900 et al. are still very easy to find and are the crown of the FX series.

Again, I want to reiterate that, despite my bashing, the FX 5200 is not a bad card. It just has an extremely narrow use case where there are often better alternatives, or where its benefits are simply not very pronounced. I think they are a great addition to any collection, as cheap as they are. Just be sure to test one against a GeForce 4 MX 440 if you plan to build a low-spec system, as it may or may not be faster. For example, the FX 5200 is just as fast as the 5950 Ultra on an Intel 440EX with a Mendocino (pre-Coppermine) CPU, but the GeForce 4 MX 440 is much faster than either.

See here for some additional information on my past driver studies on the FX series:

NVIDIA GeForce FX driver testing on an Intel 440EX summary and report

Reply 2 of 23, by Repo Man11

User metadata
Rank Oldbie

Testing a 128-bit 256 MB FX 5200 and a 64-bit 128 MB FX 5500 on my TXP4 with a K6-3+ @ 500 MHz, I found that they are so CPU-bound on this system that there's only a slight difference between the two. The 64-bit FX 5500 actually had the edge in 3DMark2000, because it would work with the 44.03 driver where the FX 5200 wouldn't.

"A lot of times when you first start out on a project you think, This is never going to be finished. But then it is, and you think, Wow, it wasn't even worth it." - Jack Handey

Reply 3 of 23, by Kahenraz

User metadata
Rank l33t

The code path for the FX series when the CPU has MMX but not SSE is often crippled, as indicated by my tests with the Mendocino vs. the Coppermine. I don't know if there is a separate code path for 3DNow! instructions to compensate, but on these CPUs, previous generations are usually faster. The easiest way to test this is to run Unreal and see if it lags horribly in the flyby demo. This demo should be smooth as silk on the 5200; if it's not, then it's not a good pairing with your processor.
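For anyone who wants to check which side of that MMX/SSE divide a CPU falls on before building, a minimal sketch (Linux-only, reading `/proc/cpuinfo`; the path and flag names are standard Linux conventions, not anything from this thread):

```python
# Quick check for the CPU feature flags discussed above. A Mendocino Celeron
# reports "mmx" but not "sse", which is the case where the FX driver path
# reportedly falls back to a slower route.

def cpu_flags(path: str = "/proc/cpuinfo") -> set:
    """Return the CPU feature flag set from a Linux cpuinfo file (empty if unavailable)."""
    try:
        with open(path) as f:
            for line in f:
                if line.startswith("flags"):
                    return set(line.split(":", 1)[1].split())
    except OSError:
        pass
    return set()

flags = cpu_flags()
print("MMX:", "mmx" in flags, "SSE:", "sse" in flags)
```

On a period DOS/Windows box you'd check the same thing with a CPUID utility, but the principle is identical: MMX without SSE is the pairing to benchmark carefully.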

I found the opposite problem with all of the GeForce cards for PCI (GeForce 2 MX and 4 MX), even when paired with a very fast processor (Pentium D): Unreal would lag horribly but be buttery smooth with the FX 5200. I don't recall if you need older drivers as well; this may have been the problem. Actually, yes, I think it was: you just couldn't use a driver old enough to get past the point where they started lagging, the way you can with the FX series. I think it was version 23.11 or 28.32.

It's always a safe bet to test the GeForce 4 MX against an FX 5200, no matter the platform, if you plan to go low spec. I'm always surprised by this.

Reply 5 of 23, by The Serpent Rider

User metadata
Rank l33t

The change from the 5800 to the 5900 was from a 128-bit bus with DDR2 to a 256-bit bus with DDR.

The FX 5700 and FX 5900 share some architectural improvements not available to the vanilla FX.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 6 of 23, by AlexZ

User metadata
Rank Member
Kahenraz wrote on 2022-06-14, 02:40:

I've heard a lot of people talk about driver version 45.23 being the "most compatible". I've had good results testing up through 56.64, and haven't seen any use case where the 45.23 had any advantage, compatibility or otherwise.

There are definitely some issues with 5x.xx drivers. Re: HELP! - GeForce FX5600 in win98se

I got the error "direct 3d device does not accurately report texture memory usage" with the 53.04/56.64 drivers when starting the first mission in Thief 2; with the 45.23 driver it works fine. I would highly recommend that everyone use 45.23 and not waste time with experiments. 5x.xx is only a necessity for the FX 5700, which makes it less desirable than the rest of the FX line.

I used 53.04 for some time and it did seem to have good compatibility. But it isn't as good as 45.23.

Pentium III 900E, ECS P6BXT-A+, 384MB RAM, NVIDIA GeForce FX 5600 128MB, Voodoo 2 12MB, 80GB HDD, Yamaha SM718 ISA, 19" AOC 9GlrA
Athlon 64 3400+, MSI K8T Neo V, 1GB RAM, NVIDIA GeForce 7600GT 512MB, 250GB HDD, Sound Blaster Audigy 2 ZS

Reply 7 of 23, by agent_x007

User metadata
Rank Oldbie

Always check the core and memory frequency the card was tested at:

[Attachment: 3dma01se.png — 3DMark2001 SE results chart]

Driver/OS: 93.71 / WinXP x86 SP3
CPU: Phenom II 965 @ 3.6 GHz (2.4 GHz NB)
RAM: 2x2 GB 800 MHz CL4-4-4-12
MB: AM2NF3-VSTA (BIOS "P3.30")


Reply 8 of 23, by Kahenraz

User metadata
Rank l33t

That's a really good chart. It also shows that the 5200 is still very competitive compared to the 5600, except for those special situations where it is not.

It's difficult to get a complete picture of the FX family, because of how segmented it was, with all of the variations on bus width.

I never realized that this was a thing until very recently:

Bizarre GeForce 4 MX 420 PCI is faster than my GeForce 4 MX 440 AGP 8x

Reply 9 of 23, by agent_x007

User metadata
Rank Oldbie

Thank you, but about that 5200 vs. 5600 being close... here's a frequency-matched comparison:

[Attachment: Fillrate benchmark.png — fillrate benchmark results]

The 5200 drops the ball vs. the 5600 wherever pixel fillrate is important (the lack of color compression on the former is really painful).


Reply 10 of 23, by Kahenraz

User metadata
Rank l33t

Yes, it's true. You can compensate for it somewhat by lowering the resolution. 640x480 and 800x600 are still pretty good for a lot of older titles; 1024x768 can be much more challenging for later titles, though.
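The resolution trade-off is easy to quantify: pixels to shade per second scale with width x height x frame rate. A rough sketch below; the 60 fps target and the overdraw factor (each pixel touched ~2.5x per frame) are illustrative assumptions, not measurements.

```python
# Back-of-envelope pixel throughput needed at a target frame rate.
# The overdraw factor is an assumed average, not a measured value.

def megapixels_per_second(width: int, height: int, fps: int, overdraw: float = 2.5) -> float:
    """Approximate megapixels/s a card must fill for the given settings."""
    return width * height * fps * overdraw / 1e6

for res in [(640, 480), (800, 600), (1024, 768)]:
    print(res, round(megapixels_per_second(*res, fps=60), 1))
```

Whatever the exact overdraw, 1024x768 demands roughly 2.5x the fillrate of 640x480, which is why dropping resolution helps a fillrate-starved card like the 5200 so much.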

It's a good card if you use it for older titles. It was never of any value for DirectX 8 or 9, other than as a novelty.

Another use case is to use it for additional display outputs for multiple monitors. There's no shame in using it for that.

Reply 11 of 23, by agent_x007

User metadata
Rank Oldbie
AlexZ wrote on 2022-06-17, 19:47:

I used 53.04 for some time and it did seem to have good compatibility. But it isn't as good as 45.23.

Watch out for 45.23 on the FX series; those drivers were released in the middle of the whole "NV cheating" thing back in the day: LINK


Reply 12 of 23, by j^aws

User metadata
Rank Oldbie
pentiumspeed wrote on 2022-06-13, 21:30:
I re-read the wiki about Geforce FX series, they glossed over like a marketing burb with technical details left out. Just like […]

Just an FYI: I can't find a discussion I read on the web years ago about the technical differences between the NV35 (FX/PCX 5900) and NV38 (FX/PCX 5950) and their video playback image quality, where transistors related to video playback were said to have been removed from the NV35 die. This was done to squeeze higher clockspeeds out of the NV38 die.

Reply 14 of 23, by j^aws

User metadata
Rank Oldbie
The Serpent Rider wrote on 2022-06-18, 17:14:

NV38 can't do hardware video decoding?

No; the transistor count specifically related to video playback quality was reduced relative to the NV35 die, and that reduction was used to boost clockspeeds for the NV38 die.

Reply 15 of 23, by pentiumspeed

User metadata
Rank l33t

There are also the NV30, NV31, NV34, and NV36. What about these for video playback?

Was fully accelerated video decode playback introduced later, with the GeForce 6? I know they introduced partial MPEG2 acceleration ("VPE") starting with the GeForce 4 MX series.

Cheers,


Reply 16 of 23, by Kahenraz

User metadata
Rank l33t

I've never had a particular build or encountered a use case where this was an issue. Maybe it's more for DVD playback or something. I think that you would need an especially slow CPU for this to be an issue.

Reply 17 of 23, by swaaye

User metadata
Rank l33t++

The MPEG2 acceleration is mostly for notebooks anyway. Fixed-function decode uses far less power than having the CPU decode, so it increases battery life. Nvidia didn't even bother with full MPEG2 decode until the GF4MX, which was a very notebook-oriented chip.

That said I don't know offhand if any of this talk about differences among FX cards is accurate. I could see it not being a priority with a 59x0 though.

Reply 18 of 23, by Kahenraz

User metadata
Rank l33t

Video processing has always been a budget feature that was removed from higher-end products, presumably for higher clocks as speculated above. I suspect it still made it into the FX 5700 because that chip was also used in the Personal Cinema product line.

https://en.m.wikipedia.org/wiki/Video_Processing_Engine

Also search for "VPE" here for a nice little chart for the GeForce 4 series. I couldn't find a chart like this for the FX series, but the previous article mentions several models.

https://en.m.wikipedia.org/wiki/List_of_Nvidi … rocessing_units

Reply 19 of 23, by The Serpent Rider

User metadata
Rank l33t

I find it hard to believe that the NV38 was anything more than a rebadged NV35 chip. The FX 5950 was released quite haphazardly to counter the 9800 XT; there was simply no time to remove anything from the silicon. It's much easier to just bump the voltage on an existing chip and/or stockpile chips with better yields (7800 GTX 512, 8800 Ultra).
