

FX5200 vs 9800


Reply 20 of 39, by SPBHM

Nahkri wrote:
m1919 wrote:

The FX5200 is probably one of the worst Nvidia cards ever made.

Depends on what video card you had before it. I remember going from a GeForce 2 MX 400 to an FX 5200 and the difference was huge, both in performance and even more so in image quality; a lot of games looked better on the FX.

Yes, as a replacement for the GeForce MX series the FX 5200 was quite nice...
Back during the GeForce 3/4 days the low-cost alternatives from NVIDIA (GeForce 2 MX and GeForce 4 MX) had limited features (DX7) while the higher-end cards had DX8 support; with the FX 5200 it was all the same. Also, the FX 5200 initially shipped with fast enough 128-bit memory, while later it was almost exclusively sold as a 64-bit, low-clocked card. The FX 5200 Ultra was also discontinued pretty quickly. Maybe that's why some people had far worse experiences than others with the "FX 5200"; there is a lot of variation among low-end cards carrying the same name.

But as I said, compared to the 9800, no way... the FX 5200 was at best comparable to the Radeon 8500LE and 9000 PRO.
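
To put the 128-bit vs 64-bit memory point in rough numbers, here's a quick back-of-the-envelope bandwidth calculation (the clock figures below are assumptions based on typical retail boards, not official specs):

```python
def bandwidth_gb_s(bus_width_bits, effective_mem_clock_mhz):
    """Peak theoretical memory bandwidth in GB/s for a given bus width and effective DDR clock."""
    return bus_width_bits / 8 * effective_mem_clock_mhz * 1e6 / 1e9

# Assumed clocks: early 128-bit FX 5200 boards around 400 MHz effective DDR,
# later 64-bit budget boards often cut to roughly 333 MHz effective.
print(bandwidth_gb_s(128, 400))  # ~6.4 GB/s
print(bandwidth_gb_s(64, 333))   # ~2.7 GB/s
```

Less than half the bandwidth on the cut-down boards, which goes a long way toward explaining the wildly different experiences people report.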

Reply 21 of 39, by Mau1wurf1977

d1stortion wrote:

DVI usually only allows for 60 Hz, so if you're fine with every 7th frame being skipped in 320x200 games...

Once the video is on YouTube it's not an issue. It's not like people out there watch videos on 70Hz mode 😀
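
For anyone wondering where the "every 7th frame" figure comes from: classic 320x200 VGA runs at roughly 70 Hz, so a 60 Hz DVI output has to drop 10 of every 70 frames. A quick sanity check (assuming the standard ~70 Hz mode 13h refresh rate):

```python
# 320x200 (VGA mode 13h) refreshes at ~70 Hz; a 60 Hz DVI link shows only 60 of them.
source_hz = 70
output_hz = 60

dropped_per_second = source_hz - output_hz   # 10 frames are dropped every second
print(source_hz / dropped_per_second)        # 7.0 -> roughly every 7th frame gets skipped
```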


Reply 22 of 39, by cdoublejj


Not worse than the MX4000. I bought one once to upgrade my 64 MB ATI card back in the day; it would immediately BSOD if you tried to run ANY 3D application. I promptly took it outside and crushed it after Google brought up zilch on the topic.

Reply 23 of 39, by obobskivich


MX 4000 is decidedly worse than the FX 5200; it still makes a nice auxiliary card to add more monitors (at least, it did before Aero and everything being 3D/graphics-intense/etc; even back in ~2005 it wasn't bad though). The FX 5200 is competitive with the Radeon 9000/9100/9200 cards, but claims DX9 support (it will "run" some DX9 games like Halo, Doom 3, or UT2004 with varied results), while the 9800 is up there with the FX 5900/5950 series until SM2.0 comes into play (at which point the 9800s are considerably faster/better).

Give these a gander regarding GF FX vs R300 performance:
http://www.ign.com/articles/2003/10/27/nvidia … 50-ultra-review (5950 is the blue/purple color)
http://www.guru3d.com/articles_pages/geforce_ … _review,13.html (this one has better labeling)
http://www.guru3d.com/articles_pages/ati_rade … _review,11.html

In general it isn't fair to say the R300 series are "decidedly better" or "smash" the NV30 cards - when later DX9 support comes off the table they're pretty competitive, but when later DX9 is considered the R300 has an advantage. For reference here's a (perhaps incomplete) list of games and their PS requirements: http://wikibin.org/articles/list-of-computer- … el-shaders.html

The GF6 series are "better still" once you get into the 6600/6800 arena. But it really depends on what you need the card to actually do for you - if you're just going to be playing DX7/8 games the FX 5200 will be perfectly fine as long as you aren't hoping for everything maxed-out at 2048x1536 in Morrowind or something (it'll run games like UT03/04, WarCraft 3, Empire Earth, ORB, etc with no problems at reasonable resolutions - the 9800 is faster but that won't matter in such a comparison; if you're meaning to run FarCry, Half-Life 2, etc then the 9800 is a better choice). If you aren't gaming at all (or aren't going to be doing 3D gaming), neither is likely to matter worth a hill of beans - get whatever you can for the cheapest price that will support whatever OS you need to support.

If you're going out a-shopping, why not take a look for the AIW X800? Single-slot, high performance DX9, and full VIVO support (if it has the break-out cables).

I'd avoid any of the late-to-market "value" refreshes; nVidia was worse about this during the FX era, but both manufacturers are guilty of it:

GeForce FX 5500, 5700VE, 5700LE
Radeon 9250, 9550, 9600Pro EZ

Reply 25 of 39, by swaaye


Halo is D3D 7-9. It requires 9 to run all of the effects but has D3D 8.1, 8.0 and 7 fallback modes. The Xbox hardware is somewhat more flexible than D3D 8.1.

UT2003/2004 are mostly D3D 7. If D3D 8 hardware is present they use pixel shaders on the terrain rendering for a small efficiency gain, I believe.

Reply 26 of 39, by obobskivich

d1stortion wrote:

Think you got something mixed up here. Neither Halo nor UT04 are D3D9, they use D3D8, with UT04 having an experimental D3D9 renderer though.

Aye on UT04 (good save - I always think of it as a DX9 game, mostly due to its age, but yeah, "experimental" is the watch-word there), but Halo requires DX9 to be installed (9b is included on the disc) to enable/use some of the effects, which IME the FX cards can give you (albeit with a performance hit). It'll run on a GF4 with some settings locked out and such, as long as DX9 is installed.

Most of the games I mentioned (and have seen mentioned here) are probably 8 or 8.1 or lower (e.g. Morrowind, ORB, Postal 2), just based on the early-2000s era and what a GeForce FX can run well; sure, some later games like Half-Life 2 have a DX8 fallback, but even that runs pretty poorly on the FX if memory serves (the 5950 might get you out of 800x600, but IMO that game is really beyond that series of cards, and should only be attempted if you really have no alternatives).

Reply 27 of 39, by swaaye


I agree that the FX series is a great choice for D3D 7 and various OpenGL games of the time. Particularly OpenGL, since some games used NVIDIA proprietary extensions (see Bioware). You also get fog table and palettized texture support for old D3D5 games.

When games start pushing a lot of pixel shader effects, even D3D 8-class, the FX series doesn't have the arithmetic throughput to keep up. But lots of games back then with D3D 8/9 support were in essence D3D7 games with extras so it doesn't really start to hit hard until 2004-5. Far Cry and Half Life 2 are obvious turning points. Doom3 is an anomaly in that it is pretty much what the NV3x was designed to do.

Reply 28 of 39, by NamelessPlayer

swaaye wrote:

UT2003/2004 are mostly D3D 7. If D3D 8 hardware is present they use pixel shaders on the terrain rendering for a small efficiency gain, I believe.

I'm gonna be honest when I say I never would have noticed that.

To me, the use of pixel shaders really took off with Far Cry, Doom 3 and Half-Life 2, when everything started getting this distinctive plasticky sheen for a bump-mapping shader that completely went away when disabling pixel shaders (if permitted), alongside lens distortion effects, bloom, depth of field, motion blur that was actually blurred and not simply a model leaving translucent trails of itself in its wake, etc. Those were obvious pixel shader effects.

Then there was how Deus Ex: Invisible War wouldn't even run without PS 1.1 support at minimum, which is especially funny considering that it's also UE2 (probably more like UE2.5).

Reply 29 of 39, by obobskivich

NamelessPlayer wrote:
swaaye wrote:

UT2003/2004 are mostly D3D 7. If D3D 8 hardware is present they use pixel shaders on the terrain rendering for a small efficiency gain, I believe.

I'm gonna be honest when I say I never would have noticed that.

To me, the use of pixel shaders really took off with Far Cry, Doom 3 and Half-Life 2, when everything started getting this distinctive plasticky sheen for a bump-mapping shader that completely went away when disabling pixel shaders (if permitted), alongside lens distortion effects, bloom, depth of field, motion blur that was actually blurred and not simply a model leaving translucent trails of itself in its wake, etc. Those were obvious pixel shader effects.

Then there was how Deus Ex: Invisible War wouldn't even run without PS 1.1 support at minimum, which is especially funny considering that it's also UE2 (probably more like UE2.5).

Morrowind uses PS for water (supposedly it's the first), Halo uses them for various effects, and I would guess that GTA Vice City uses them as well (the latter two games actually require DX9 to be installed, but both list DX8.1 hardware in their system requirements).

Not sure if this has been posted here before, but it's interesting: http://wikibin.org/articles/list-of-computer- … el-shaders.html (I'm not sure the "without problems" qualifier is exactly fair)

Reply 30 of 39, by leileilol


Vice City used no pixel shaders at all. It's purely DX7 features at best, though it technically looks more like a DX5 game (its data is mostly 8-bit paletted textures... that get converted to 24/32-bit DXTs on the first run).

Also, the fanciest things UT2004 could do are specific multitexture combine effects, like, say, a stage with a scrolling depth-writing alpha texture. That and cubemap reflections for water.

Sometimes DirectX requirements are just there to make sure you're up to snuff (to be eligible for customer support). You can run Doom III on a fresh Windows 2000 installation with DirectX 7.0 on a GeForce 256, despite its demands for DirectX 9.0b and a GeForce 3.


Reply 31 of 39, by obobskivich

leileilol wrote:

Vice City used no pixel shaders at all. It's purely DX7 features at best, though it technically looks more like a DX5 game (its data is mostly 8-bit paletted textures......that get converted to 24/32-bit DXTs on the first run).

How is it doing bloom and other effects then? (Or do those not exist on the PC version? I have never seen that game on PC.)

EDIT

Did some looking - the effect is indeed not available on PC, and it seems to work similarly to T-Buffer effects (in that it pulls data from multiple frames in a sequence, either in hardware or software, not sure which) rather than using pixel shaders. Learn something new every day. 😊

Reply 32 of 39, by d1stortion


I'd say shader usage became "a fad" in visual terms when most 7th-gen console games stuck to the same techniques: overused low-quality DoF blur, bloom, and color desaturation when taking cover (thanks to the stupid regenerating-health system), or just throughout the whole game...

Reply 33 of 39, by leileilol

obobskivich wrote:

How is it doing bloom and other effects then? (Or do those not exist on the PC version? I have never seen that game on PC.)

A lot of the 'bloom' is just nicely placed flare sprites with some depth reading. Need For Speed: Porsche Unleashed did similar things.


Reply 34 of 39, by mr_bigmouth_502

m1919 wrote:
cdoublejj wrote:

FX5200 vs 9800? also what about the 6800 LE or is that light years better than both?

The FX5200 is probably one of the worst Nvidia cards ever made.

This. They're so bloody common around here, and I've had almost nothing but bad experiences with them. The problem is that it's hard to build an AGP-based system without resorting to one, as other AGP cards from the same era are rare here. 😜

Reply 35 of 39, by obobskivich

leileilol wrote:
obobskivich wrote:

How is it doing bloom and other effects then? (Or do those not exist on the PC version? I have never seen that game on PC.)

A lot of the 'bloom' are just nicely placed flare sprites with some depth reading. Need For Speed Porsche Unleashed did similar things.

Yeah - what I was specifically referencing was what the game calls "Trails" in its settings (and apparently this doesn't exist on the PC version, according to the GTA Wiki), which IME produces an effect where all the in-game light sources end up over-saturated and "shimmery" (if you're familiar with the game, going into the "Club Malibu" or "Club Pole Position" interiors will show this very dramatically; the water surfaces tend to exaggerate it as well) - a lot like the Bloom effect in Oblivion or other early PS2.0 games. After some reading, "Trails" isn't actually even meant to be doing that - it's meant to be some sort of "poor man's motion blur" effect. Either way I tend to keep it disabled because I don't like the end result; I was just curious exactly how/what was going on. 😊
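
As a rough illustration of that kind of frame-feedback "trails" effect, here's a minimal sketch of the idea (a generic feedback blend, not anything from the game's actual renderer):

```python
import numpy as np

def trails_blend(prev_output, new_frame, persistence=0.4):
    """Blend the freshly rendered frame with the previously displayed one.

    Higher 'persistence' makes bright light sources smear and build up halos
    over several frames - the over-saturated, shimmery look described above.
    Generic frame-feedback sketch; not the game's actual implementation.
    """
    blended = persistence * prev_output + (1.0 - persistence) * new_frame
    return np.clip(blended, 0.0, 1.0)
```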

d1stortion: your post made me chuckle. I also think you've probably categorized the vast majority of "shooters" made in the last few years (and the really sad part: shooters that DON'T conform to that mold tend to get slammed by reviewers for being "outdated"). 😵

mr_bigmouth: IMO the FX 5200 was fine when it was new, back when you could get them from quality OEMs (like PNY, XFX, or Chaintech) and they conformed to the nVidia specifications (e.g. had a 128-bit memory bus, appropriate cooling, both video outputs, etc). But since they've become one of the few "entry level favorites" for big-box retailers (along with the Radeon 7000 and GeForce 8400), I'd agree with you on "bad experiences" when talking about modern examples. (It's also probably worth noting that ten-plus years ago, when they were new, they were already on the below-average, borderline-mediocre side of things, and they haven't improved with age.)

Reply 36 of 39, by swaaye


The 5200 became the ubiquitous cheapo video card for motherboards without onboard video. Those 64-bit 5200s that are useless for 3D still make a perfectly adequate GUI and DVD accelerator for XP.

I think the DVI is slightly lower spec though as it has problems at around 1680x1050 IIRC. That's not uncommon for DVI on video cards prior to 2004 or so.

NamelessPlayer wrote:
swaaye wrote:

UT2003/2004 are mostly D3D 7. If D3D 8 hardware is present they use pixel shaders on the terrain rendering for a small efficiency gain, I believe.

I'm gonna be honest when I say I never would have noticed that.

It wasn't a visual improvement. They just managed to make some aspect of the terrain rendering more efficient by using a pixel shader.

We didn't find any use for them in UT2003. Basically all you realistically need vertex shaders for on DX8 cards is to set up pixel shaders. As we're happy with the DX7 blending approach for UT2003 there was no need for neither pixel nor vertex shaders. We do use pixel shaders for terrain rendering but that's just a minor optimization and the DX7 blending fallback is almost as fast and looks 100% identical.

-- Daniel Vogel, Epic Games Inc.

Reply 37 of 39, by leileilol

obobskivich wrote:

Yeah - what I was specifically referencing was what the game calls "Trails" in its settings

Trails did exist on the PC GTAIII, however. I'd imagine DMA would take it out for GTAVC just because of how crappy it looked. Rather noticeably crappy if your game prefers a locked FPS... Also, GTAVC did some cheap HDR effect by shifting the vertex colors of the entire scene when you're looking at the sun or traversing different areas, and sometimes THAT is also mistaken for "bloom".

For shaders, I did want to do something similar to trails, in that it would take the higher-framerate frames and combine them all into a buffer-swapped 60 or 30 fps for a convincing motion blur, but that would require an insane amount of memory...
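
To give a sense of why the memory cost blows up, here's a rough calculation (the resolution, color depth, and frame rates are just example numbers, not anything a particular card or game actually uses):

```python
def accumulation_memory_mb(width, height, bytes_per_pixel, source_fps, output_fps):
    """Memory needed to hold every source frame that gets merged into one output frame."""
    frames_per_output = source_fps // output_fps
    return width * height * bytes_per_pixel * frames_per_output / (1024 ** 2)

# Example: render internally at 240 fps, output 30 fps, 1600x1200 at 32-bit color.
print(accumulation_memory_mb(1600, 1200, 4, 240, 30))   # ~58.6 MB of accumulation buffers
# Example: a gentler 120 fps -> 60 fps blur at 1024x768 is far cheaper.
print(accumulation_memory_mb(1024, 768, 4, 120, 60))    # ~6 MB
```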


Reply 38 of 39, by obobskivich

swaaye wrote:

I think the DVI is slightly lower spec though as it has problems at around 1680x1050 IIRC. That's not uncommon for DVI on video cards prior to 2004 or so.

FWIR the original production batches of GeForce FX cards had sub-standard TMDS transmitters (something that wasn't discovered for some time), with the 5200 and 5600 being the worst affected. AFAIK the 5800/5900 generally use external chips and have no problems, same for the Quadro FX cards. Usually the problems arise when trying to go above 1280x1024 on a long/cheap cable.

Here's an article about it, with measurements:

http://www.extremetech.com/electronics/55254- … ance-shootout/4 (linked to the first page with measurements shown)

If you skim through it, it becomes apparent that it's hard to flatly categorize a single generation - e.g. the Radeon 9500/9700 barely ekes by, but the Ti 4600 with the nice Sil164 chip has zero problems (afaik any card with a 164 (or better) on-board can be considered free of problems, and the refreshed Radeon 9 series (9600/9800) as well as the refreshed nVidia cards (5700/5900) should be considered good to go).

OFC it's worth remembering that just because it "fails" on the bench doesn't mean it is 100% guaranteed to give you issues in the real-world.
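
For reference, here's a rough look at the pixel clocks those modes need versus the single-link DVI limit (assuming roughly 35% blanking overhead for standard timings; exact GTF/CVT figures differ a bit):

```python
# Single-link DVI tops out at a 165 MHz TMDS pixel clock per the spec.
DVI_SINGLE_LINK_MAX_MHZ = 165

def approx_pixel_clock_mhz(width, height, refresh_hz=60, blanking_overhead=1.35):
    """Very rough required pixel clock; real GTF/CVT timings vary by a few percent."""
    return width * height * refresh_hz * blanking_overhead / 1e6

for width, height in [(1280, 1024), (1680, 1050), (1600, 1200)]:
    mhz = approx_pixel_clock_mhz(width, height)
    status = "within spec" if mhz <= DVI_SINGLE_LINK_MAX_MHZ else "over spec"
    print(f"{width}x{height}@60: ~{mhz:.0f} MHz ({status})")
```

All of those sit under the 165 MHz limit, so trouble at 1680x1050 points to a transmitter that can't actually sustain its rated clock rather than a mode that's genuinely out of spec.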

leileilol wrote:

Trails did exist on the PC GTAIII, however. I'd imagine DMA would take it out for GTAVC just because of how crappy it looked. Rather noticeably crappy if your game prefers a locked FPS... Also, GTAVC did some cheap HDR effect by shifting the vertex colors of the entire scene when you're looking at the sun or traversing different areas, and sometimes THAT is also mistaken for "bloom".

For shaders, I did want to do something similar to trails, in that it would take the higher-framerate frames and combine them all into a buffer-swapped 60 or 30 fps for a convincing motion blur, but that would require an insane amount of memory...

Yeah the "HDR" thing when you stare up at the sun or similar could also be done away with, but that's at least livable imho. Personally I've never really liked motion blur outside of tech demos - every game I've seen it in it tends to just muck things up.

Reply 39 of 39, by cdoublejj

NamelessPlayer wrote:
swaaye wrote:

UT2003/2004 are mostly D3D 7. If D3D 8 hardware is present they use pixel shaders on the terrain rendering for a small efficiency gain, I believe.

I'm gonna be honest when I say I never would have noticed that.

To me, the use of pixel shaders really took off with Far Cry, Doom 3 and Half-Life 2, when everything started getting this distinctive plasticky sheen for a bump-mapping shader that completely went away when disabling pixel shaders (if permitted), alongside lens distortion effects, bloom, depth of field, motion blur that was actually blurred and not simply a model leaving translucent trails of itself in its wake, etc. Those were obvious pixel shader effects.

Then there was how Deus Ex: Invisible War wouldn't even run without PS 1.1 support at minimum, which is especially funny considering that it's also UE2 (probably more like UE2.5).

If you have played BioShock, it's a great example of the "wet" look bump maps have these days; no, it's not because you're underwater, other games have it too.