VOGONS


Reply 20 of 97, by Baoran

User metadata
Rank l33t

I never understood how they managed to more than double the transistor count from the GeForce 4 series to the FX series and yet not really improve performance much.

Reply 21 of 97, by appiah4

User metadata
Rank l33t++
Baoran wrote:

I never understood how they managed to more than double the transistor count from the GeForce 4 series to the FX series and yet not really improve performance much.

Are you talking about Kepler? Or the RTX 2080? That's nVidia for you.

Retronautics: A digital gallery of my retro computers, hardware and projects.

Reply 22 of 97, by SmokyWhisky

User metadata
Rank Newbie

Kepler was a lot more power efficient than Fermi though, and the RTX Turing chips are sort of a sidegrade for new features instead of an upgrade. With that said, there's no denying the current prices are ridiculous (I still remember when you could get the absolute latest and best gaming GPU for 500-600€) and the RT and tensor cores aren't very useful right now for most end users. Hopefully one day they'll have some real competition in the high end again.

Reply 23 of 97, by raindog1975

User metadata
Rank Newbie

I still own an FX 5500 AGP and it was a decent card coupled with a P4 Northwood. It did have a craptastic cooler, which I replaced almost immediately, and the drivers were junk (some games would play just fine, others crashed constantly, and the frame rates varied a lot between games using the same graphics engine).

Pentium MMX 166 Acorp 5VIA3P 32 MB EDO-Ram Matrox Millennium 4 MB -Win 95 OSR 2
AMD K6-2 400 Matsonic MS6260S 128 MB SDRAM PC 100 TNT2 M64 32 MB - Win 98 SE
AMD Athlon 1GHz Soltek SL-75 KAV 256 MB SDRAM PC 133 Geforce 6800 GT / FX 5500 - work in progress

Reply 24 of 97, by doaks80

User metadata
Rank Member

What hate? Back in the day maybe, but they are prized gems today. Look at the prices FX59XX Ultras go for on eBay.

The reason they got hate back in the day is that they were nominally supposed to be DX9 GPUs... which they were technically, just not very good ones, compared to ATI at least. If nvidia released a GPU today that had crap DX11/DX12 support but great DX9/10 support, it would get a lot of understandable hate even though it might make a great WinXP GPU in 10 years' time.

No-one cares about its DX9 capabilities now... we value the FX for its DX8-and-lower capabilities, which are awesome. The very best, in fact, considering the fog table support.
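For anyone wondering what "fog table support" actually buys you, here's a minimal sketch of the per-pixel table fog setup that DX8-and-earlier titles lean on (illustrated with Direct3D 9 for brevity; the device, the caps struct, and the fog distances are assumptions for illustration, not code from any particular game):

```cpp
#include <d3d9.h>

// Minimal sketch: enable classic per-pixel (table) fog if the rasterizer
// advertises it. Assumes an already-created device and a caps struct
// filled in beforehand by IDirect3D9::GetDeviceCaps.
void EnableTableFog(IDirect3DDevice9* device, const D3DCAPS9& caps)
{
    if (caps.RasterCaps & D3DPRASTERCAPS_FOGTABLE)
    {
        float fogStart = 50.0f;   // illustrative distances
        float fogEnd   = 400.0f;

        device->SetRenderState(D3DRS_FOGENABLE, TRUE);
        device->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(128, 128, 160));
        device->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
        // Render states take DWORDs, so float parameters are passed bitwise.
        device->SetRenderState(D3DRS_FOGSTART, *reinterpret_cast<DWORD*>(&fogStart));
        device->SetRenderState(D3DRS_FOGEND,   *reinterpret_cast<DWORD*>(&fogEnd));
    }
}
```

Cards that only expose vertex fog fail the cap check and the game falls back to per-vertex fog, which is exactly the visual difference people notice in older titles.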

If you want a good DX9 GPU for WinXP get a Radeon 5850 or similar.

k6-3+ 400 / s3 virge DX+voodoo1 / awe32(32mb)
via c3 866 / s3 savage4+voodoo2 sli / audigy1+awe64(8mb)
athlon xp 3200+ / voodoo5 5500 / diamond mx300
pentium4 3400 / geforce fx5950U / audigy2 ZS
core2duo E8500 / radeon HD5850 / x-fi titanium

Reply 25 of 97, by candle_86

User metadata
Rank l33t

They were crap. I had an FX5200 128-bit card; sure, it played DX9 games, if you like your games in slideshow format. I remember playing Far Cry on it by forcing DX7 mode, same with Half-Life 2. I replaced my MX440 with an FX5200 and was not impressed; nothing really looked better thanks to the settings I was forced to run at. It got better when they launched the 6200 AGP, which I promptly bought, so much better.

Reply 26 of 97, by Duouk2000

User metadata
Rank Member

I remember the joys of "playing" Half Life 2 and Far Cry on my 5200FX as well haha. My 6600GT died though and it was all I could afford.

It was good enough for Baldur's Gate 2 and KOTOR though, I'm actually quite fond of it 😀

Reply 27 of 97, by bakemono

User metadata
Rank Oldbie

NV played a dirty trick with the naming. In the GF4 generation there were basically two GPUs, the MX and the Ti. The GeForce 4 Ti were almost all the same except for clock rates and RAM. For the FX lineup they did away with the "MX" label and launched the 5200 and 5600 which were considerably weaker than the 5800. Anyone who thought that an FX 5200 would be an upgrade to a Ti 4200 was sorely disappointed, as the 5200 was much closer to a GeForce 4MX.

I had a 5200 (and later a 5700) and though it wasn't the fastest, the multi-monitor and TV-out worked well, and of course it was still light-years beyond Intel graphics.

Reply 28 of 97, by candle_86

User metadata
Rank l33t
bakemono wrote:

NV played a dirty trick with the naming. In the GF4 generation there were basically two GPUs, the MX and the Ti. The GeForce 4 Ti were almost all the same except for clock rates and RAM. For the FX lineup they did away with the "MX" label and launched the 5200 and 5600 which were considerably weaker than the 5800. Anyone who thought that an FX 5200 would be an upgrade to a Ti 4200 was sorely disappointed, as the 5200 was much closer to a GeForce 4MX.

I had a 5200 (and later a 5700) and though it wasn't the fastest, the multi-monitor and TV-out worked well, and of course it was still light-years beyond Intel graphics.

But it wasn't priced as the Ti 4200 replacement.

MX420/MX4000 = FX5200 64-bit
MX440 = FX5200
MX460 = FX5200 Ultra / FX5600
Ti 4200 = FX5600 Ultra
Ti 4400 = FX5800
Ti 4600/4800 = FX5800 Ultra

That's how Nvidia priced them and what they were meant to replace.

But the FX line has a lot more models; I'll list them here and give a * to the decent ones that made sense for gaming in 2003. The rest consistently underperformed their competitors, i.e. the RV250/R300 series. All were terrible at DX9, but that wasn't actually relevant until 2004, and the GF6 was out by then anyway.

FX5200 64BIT
FX5200
FX5200 ULTRA
PCX5300
FX5500 64BIT
FX5500
FX5600SE/XT
FX5600
FX5600 ULTRA *
FX5700 LE/SE/XT
FX5700 *
PCX5700 *
FX5700 ULTRA *
FX5700 ULTRA GDDR3 *
FX5800
FX5800 ULTRA
FX 5900 SE/XT *
FX5900 *
PCX5900 *
FX5900 ULTRA *
FX5950 ULTRA *

Reply 29 of 97, by doaks80

User metadata
Rank Member
candle_86 wrote:

FX5900 *
PCX5900 *
FX5900 ULTRA *
FX5950 ULTRA *

It's a myth that there was an FX5900 and an FX5900 Ultra (same with the 5800). The FX5900 *was* the 5900 Ultra. Lower models were the XT, ZT, etc. The wiki page reflects this: https://en.wikipedia.org/wiki/GeForce_FX_series

k6-3+ 400 / s3 virge DX+voodoo1 / awe32(32mb)
via c3 866 / s3 savage4+voodoo2 sli / audigy1+awe64(8mb)
athlon xp 3200+ / voodoo5 5500 / diamond mx300
pentium4 3400 / geforce fx5950U / audigy2 ZS
core2duo E8500 / radeon HD5850 / x-fi titanium

Reply 30 of 97, by candle_86

User metadata
Rank l33t
doaks80 wrote:
candle_86 wrote:

FX5900 *
PCX5900 *
FX5900 ULTRA *
FX5950 ULTRA *

It's a myth that there was an FX5900 and an FX5900 Ultra (same with the 5800). The FX5900 *was* the 5900 Ultra. Lower models were the XT, ZT, etc. The wiki page reflects this: https://en.wikipedia.org/wiki/GeForce_FX_series

I saw them at the time on Newegg and TigerDirect as the FX5900. As for the FX5800 non-Ultra, it certainly does exist, and there are members here with the cards.

You can also see the Techpowerup GPU database here
https://www.techpowerup.com/gpu-specs/?mfgr=N … =9.0a&sort=name

Considering Wizzard works closely with Nvidia on detecting GPUs and model information, his database should be accurate.

e.g. the FX5900 non-Ultra ASUS V9950:
https://hexus.net/tech/reviews/graphics/585-a … geforce-fx5900/

Reply 31 of 97, by silikone

User metadata
Rank Member

Didn't the FX have a few things going for it that, while underwhelming in practice, could hypothetically deliver results that would confidently kick its competition off the throne? ATI going all in with SM 2.0 turned out to be the right choice, as developers understandably had an aversion to maintaining branching rendering paths.
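To make the "branching rendering paths" point concrete, here's a minimal sketch (Direct3D 9, with a hypothetical PickRenderPath helper) of the caps check engines of that era used to decide between an SM 2.0 path, an SM 1.x path, and fixed function:

```cpp
#include <d3d9.h>

enum RenderPath { PATH_FIXED_FUNCTION, PATH_SM1X, PATH_SM20 };

// Hypothetical helper: query the device caps and pick the best shader path
// the hardware claims to support. Assumes d3d came from Direct3DCreate9.
RenderPath PickRenderPath(IDirect3D9* d3d)
{
    D3DCAPS9 caps;
    if (FAILED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
        return PATH_FIXED_FUNCTION;

    if (caps.PixelShaderVersion >= D3DPS_VERSION(2, 0))
        return PATH_SM20;           // R300 / NV3x class hardware
    if (caps.PixelShaderVersion >= D3DPS_VERSION(1, 1))
        return PATH_SM1X;           // GeForce 3/4 Ti, Radeon 8500 class
    return PATH_FIXED_FUNCTION;     // DX7-class hardware
}
```

The catch with the FX was that it reported SM 2.0 but ran generic SM 2.0 code slowly, so getting decent speed out of it meant adding yet another variant (partial precision hints, NV-specific paths), and many developers simply didn't bother.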

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 32 of 97, by Putas

User metadata
Rank Oldbie
Baoran wrote:

I never understood how they managed to more than double the transistor count from the GeForce 4 series to the FX series and yet not really improve performance much.

The 5800 Ultra doubled the performance of the 4600.

Reply 33 of 97, by candle_86

User metadata
Rank l33t
silikone wrote:

Didn't the FX have a few things going for it that, while underwhelming in practice, could hypothetically deliver results that would confidently kick its competition off the throne? ATI going all in with SM 2.0 turned out to be the right choice, as developers understandably had an aversion to maintaining branching rendering paths.

The GeForce FX was a big guess. Microsoft was mad at Nvidia, so they got zero input on DX9 and only received the technical documentation after they had already taped out the first NV30 die. They immediately went to designing the NV35 to improve the NV3x design, while also having a team start work on the NV40.

Reply 34 of 97, by swaaye

User metadata
Rank l33t++

Yeah, DX9 Shader Model 2 is basically exactly ATI R300. It has lots of limitations, and so SM 3.0 came rather fast. GeForce FX is almost experimental, with more flexibility but lots of problems. It would be great to hear the inside story of those days.

Reply 35 of 97, by evoportals

User metadata
Rank Newbie

OK, so what's better than the FX5600 for Windows 98 builds??? Nothing. ATI is out of the question since it lacks features for retro gaming, and the GeForce 3 and older are all slower. The GeForce 4 Ti series is a close second, but it uses WAY more power and runs hot. Also, the FX5600 has superior AA and anisotropic filtering performance.

So please, someone explain to me what would be better for Windows 98 and WHY. Don't just say the FX 5600 sucks. Give me a reason plz. I am correct here.

Reply 36 of 97, by SPBHM

User metadata
Rank Oldbie
swaaye wrote:

Yeah, DX9 Shader Model 2 is basically exactly ATI R300. It has lots of limitations, and so SM 3.0 came rather fast. GeForce FX is almost experimental, with more flexibility but lots of problems. It would be great to hear the inside story of those days.

Also something to consider is that the Xbox at the time was really fresh and used an Nvidia GPU. I've read somewhere that it was partly the bad relationship with NV that made MS go the R300 route.

In any case, SM 2.0 had some relevance and a decent number of titles that used it.

Reply 37 of 97, by candle_86

User metadata
Rank l33t
evoportals wrote:

OK, so what's better than the FX5600 for Windows 98 builds??? Nothing. ATI is out of the question since it lacks features for retro gaming, and the GeForce 3 and older are all slower. The GeForce 4 Ti series is a close second, but it uses WAY more power and runs hot. Also, the FX5600 has superior AA and anisotropic filtering performance.

So please, someone explain to me what would be better for Windows 98 and WHY. Don't just say the FX 5600 sucks. Give me a reason plz. I am correct here.

The FX5700 Ultra; it has all the improvements that made the 5900 a good card, just in the mid-range. Unlike the FX5600, a 5700 actually competes in SM 1.4 with a 9600 Pro/XT.

Reply 38 of 97, by candle_86

User metadata
Rank l33t
SPBHM wrote:
swaaye wrote:

Yeah, DX9 Shader Model 2 is basically exactly ATI R300. It has lots of limitations, and so SM 3.0 came rather fast. GeForce FX is almost experimental, with more flexibility but lots of problems. It would be great to hear the inside story of those days.

Also something to consider is that the Xbox at the time was really fresh and used an Nvidia GPU. I've read somewhere that it was partly the bad relationship with NV that made MS go the R300 route.

In any case, SM 2.0 had some relevance and a decent number of titles that used it.

Basically, ATI, SiS, VIA, and Intel were invited to help design SM2, but Nvidia was excluded.

Reply 39 of 97, by The Serpent Rider

User metadata
Rank l33t++

I don't see a reason to use any FX 5600 with 5900XT cards readily available. The normal 5600 sucks and the 5600 Ultra is a rare beast.

candle_86 wrote:

…received the technical documentation after they had already taped out the first NV30 die. They immediately went to designing the NV35 to improve the NV3x design, while also having a team start work on the NV40.

Nvidia had real problems with the NV30. They wanted a true 8-pixel-pipe chip with a 256-bit bus from day one, but were falling behind schedule quite considerably. In the end they had to quickly roll out a half-baked solution with 4 pipes and a 128-bit bus and pretty much factory-overclock it, with no headroom left.
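For a rough sense of what that narrow bus cost them, here's a back-of-the-envelope comparison (the clocks and widths below are the commonly quoted FX 5800 Ultra and Radeon 9700 Pro memory specs, used here as assumptions for illustration):

```cpp
#include <cstdio>

int main()
{
    // bandwidth (GB/s) = (bus width in bits / 8) * effective memory clock (MHz) / 1000
    double fx5800u = (128.0 / 8.0) * 1000.0 / 1000.0; // 128-bit @ 500 MHz DDR2 (1000 MT/s) -> 16.0 GB/s
    double r9700p  = (256.0 / 8.0) *  620.0 / 1000.0; // 256-bit @ 310 MHz DDR   (620 MT/s)  -> ~19.8 GB/s

    std::printf("FX 5800 Ultra:   %.1f GB/s\n", fx5800u);
    std::printf("Radeon 9700 Pro: %.1f GB/s\n", r9700p);
    return 0;
}
```

So even with exotic, hot DDR2 clocked sky-high, the 128-bit NV30 still came up short of R300's plain DDR on a 256-bit bus, which is part of why the NV35 moved to 256-bit.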

I must be some kind of standard: the anonymous gangbanger of the 21st century.