VOGONS


First post, by Wester547

User metadata
Rank Newbie

Yeah I know… of all the cards to use for an old build.

I recently acquired a used, stock-clocked Asus V9520 (passively cooled, with Hynix RAM and SMD capacitors) which doesn’t seem to perform as fast as the reference card (according to the benchmarks available at HW-Museum.cz). I’ve used various drivers on Windows XP SP3 with little difference. Its results are somewhat lower in 3DMark2001SE’s feature tests (notably 20% slower in the DOT3 Bump Mapping test and 90% slower in the Point Sprites test), and the deficit is less dramatic in games, but I noticed a couple of other issues. First, a small, strange blinking bar appears at the top right of the screen in a number of OpenGL games with no frame rate limit and at very high frame rates; using V-Sync or manually limiting the frame rate does away with it. Second, Quake III engine based games have a problem where I lose control of the mouse and the camera during gameplay if the frame rate is high enough and the frame rate limit is set above 1000, but that may be unrelated (the Microsoft Optical Mouse might have something to do with it). Direct3D games seem to be okay, just slower than expected.

The most astounding problem (which is probably not unique to my card) is the massive slowdown (at 800x600, no less) in Quake III Arena’s introductory level, no doubt because of the portal and mirror. But as I approach the portal, it drops all the way down to 20 FPS! A GeForce 4 MX 440 I have in another old machine is at least three times as fast in that area, at the same settings. What gives? Is it the lack of Z-Compression? I don’t recall that scene ever being so slow whilst playing it on a GeForce 2 GTS or even a GeForce 2 MX/MX 400. The video RAM passed every test I could put it through and there doesn’t seem to be any true instability or rendering artifacts (besides the blinking bar, which only shows up, and is barely perceptible, in OpenGL games at high frame rates). The heatsink on the card becomes notably hot when stressed (as it’s passively cooled). The system RAM passed stress tests as well, and the hard drive passed S.M.A.R.T. with no bad sectors.

The rest of the specifications are: a Pentium 4 2.66 GHz Northwood, 1GB of PC2700 DDR SDRAM (two 512MB DIMMs, Kingston and Transcend), a 100GB PATA Seagate 7200.7 HDD, Pioneer DVR-115D and DVR-104 optical drives, a NEC floppy drive, a D845PEBT2 motherboard, a Sound Blaster Audigy Gamer, a VIA USB 2.0 card for extra USB ports, a Dell P792 CRT, an NPS-250KB power supply with Japanese capacitors (the motherboard also has Japanese capacitors), etc… I’m currently using the ForceWare 93.71 drivers, two 92mm case fans, and the 70mm CPU fan, which runs at full speed. The previous video cards in this computer were: a GeForce 4 MX 420, a Radeon 9500 Pro (dead), a GeForce 2 GTS, and a GeForce 2 MX. I haven’t tried the FX 5200 in another system yet. Updating the chipset drivers didn’t seem to make a difference.

Is there any chance the video card is not functioning correctly? Could it be unoptimized VRAM timings? Are Quake III’s portals supposed to be that much of a frame rate killer (on some cards but not others)? Thanks and sorry for all the questions.

Reply 1 of 11, by Joseph_Joestar

User metadata
Rank l33t++
Wester547 wrote on 2024-12-25, 20:12:

Are Quake III’s portals supposed to be that much of a frame rate killer (on some cards but not others)?

I've seen those Quake 3 portals and mirrors tank the frame rate of a GeForce FX 5900XT as well, especially when pushing the resolution to something like 1600x1200 while having all in-game settings at maximum.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 2 of 11, by leileilol

User metadata
Rank l33t++

Q3 portals causing dips is normal; it's more of a CPU issue. There's no instancing of geometry in Quake 3, so it's rendering another scene with a different PVS and cutting around it. Not even tile rendering will save you here. If your texture cache is small (e.g. a Voodoo2, especially two of them), then it'll thrash really hard from that.
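
Roughly, the cost looks like this (a compilable sketch of the idea, not id Tech 3's actual code; the type and function names here are made up):

/* Sketch: why a portal/mirror surface is expensive in a Quake 3-style
 * renderer. The view through the portal is a complete second scene
 * traversal with its own PVS, so every surface visible from the portal
 * camera is culled and submitted all over again. */
#include <stdio.h>

#define MAX_PORTAL_DEPTH 1

typedef struct {
    int hasPortal;        /* does this view see a portal/mirror surface? */
    int visibleSurfaces;  /* surfaces left after PVS/frustum culling     */
} view_t;

/* Hypothetical stand-in for the per-view render: returns how many
 * surfaces get submitted to the driver for this view and its portals. */
static int R_RenderView(const view_t *view, int depth)
{
    int drawn = view->visibleSurfaces;  /* no instancing: every surface is a fresh draw */

    if (view->hasPortal && depth < MAX_PORTAL_DEPTH) {
        /* The portal camera gets its own PVS lookup and its own surface list. */
        view_t portalView = { 0, view->visibleSurfaces };  /* assume a similarly busy area */
        drawn += R_RenderView(&portalView, depth + 1);
    }
    return drawn;
}

int main(void)
{
    view_t plainView  = { 0, 800 };
    view_t portalView = { 1, 800 };

    printf("surfaces submitted, no portal:   %d\n", R_RenderView(&plainView, 0));
    printf("surfaces submitted, with portal: %d\n", R_RenderView(&portalView, 0));
    return 0;
}

On slow hardware the per-surface work (plus the extra fill from drawing the mirrored scene behind the portal) roughly doubles, which is why the intro level tanks right in front of the mirror.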

long live PCem

Reply 3 of 11, by swaaye

User metadata
Rank l33t++

A drop to 20 fps sounds like vsync being annoying. With driver 93.76 you can turn on OpenGL triple buffering.
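
For reference (assuming a 60 Hz refresh, which may not be what the Dell P792 is actually set to): with double buffering, vsync quantizes any missed refresh to an integer divisor of the refresh rate, so landing on exactly 20 fps is one of the expected steps rather than an arbitrary number:

60 Hz / 1 = 60 fps, 60 / 2 = 30 fps, 60 / 3 = 20 fps, 60 / 4 = 15 fps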

It might also be a good idea to try everyone's favorite driver, 45.23, and see what happens.

FX 5200 is a pretty weak chip though. I think the MX 440 might be faster in some cases.

Reply 4 of 11, by Wester547

User metadata
Rank Newbie

V-Sync doesn’t seem to work in Quake III for me; whether it’s enabled or disabled in the control panel and in the game, the results are the same. I tried triple buffering and it made no difference, although I haven’t used the 93.76 drivers. It also turns out the MX440 in the other machine is actually over twice as fast in that scene, not three times faster. I did try the 45.23 drivers, and I was wrong: those drivers actually do result in a significant speed increase in the portal sequence. But if I recall correctly, NVIDIA was known to cheat with those drivers, particularly in benchmarks such as 3DMark03 (or even 2001SE). I was also able to reproduce the FPS cap issue on another computer, so that seems to be a problem with the Quake III engine.

Could lower results in 3DMark be a CPU issue? The feature tests (except vertex shader and high polygon count) should be GPU bound for the most part. I think there may be a problem with the card (the core or the VRAM?) because of the rapidly flashing lines in OpenGL game menus with no frame rate limit. And I hate to say it but I did notice some lightly scratched traces on the back of the card, although the copper itself hasn’t been exposed…

Reply 5 of 11, by swaaye

User metadata
Rank l33t++

Sorry, I meant 93.71. 45.23 is liked around here because drivers 5x.xx and later break some old games. The cheating was real, but I think it was mostly 3DMark03 and some Shader Model 2 games, where they replaced shader code with hand-tuned, lower-precision code that looked worse. You don't want to run any Shader Model 2 anything on an FX series card anyway. There are texture filtering "optimizations" too, but those are hard to identify, and they plagued all of their cards until they released the 8800 GTX. With some drivers you can set texture filtering to "High Quality" for the best quality at a speed hit.

All of the 3DMarks are influenced by CPU, especially with a relatively fast GPU.

Usually a defective card will show more pronounced corruption of the rendered game image, or it will freeze, crash the game or driver, or BSOD the system. Damaged memory often shows up as funky corrupted patterns in everything, including the boot screen and GUI.

To enable vsync with Quake 3 based games you need to edit the config file and set r_swapinterval to 1.
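
For example, something like this in baseq3/q3config.cfg (or typed at the in-game console), assuming the stock cvar names; com_maxfps is just the engine's own frame-rate cap if you'd rather limit the rate that way:

seta r_swapInterval "1"   // 1 = wait for the vertical retrace each frame, 0 = off
seta com_maxfps "125"     // optional: cap the frame rate in the engine instead of (or as well as) vsync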

Reply 6 of 11, by Wester547

User metadata
Rank Newbie

The r_swapinterval CVAR worked, but with V-Sync enabled the frame rate predictably only drops lower. I’ve tried the high performance texture filtering settings in the control panel, and only noticed moderate performance gains. I’m wondering if the flashing lines are glitches caused by a lack of V-sync (as they disappear once V-sync is enabled or the frame rate is otherwise limited), but I’ve never seen them anywhere else.

I’ll give an example of what I mean (it probably seems like I’m splitting hairs at this point). The 128-bit GeForce FX 5200 is supposed to score 11.9 Msprites/s (or thereabouts) in the Point Sprites test in 3DMark2001SE, going by the online scores. Mine only scores 6.3 Msprites/s (or close to that, with the earliest drivers or the latest drivers). The MX440 is coupled with a slower CPU (a 1.7 GHz Pentium 4) and it gets 9.9 Msprites/s. A GeForce 2 MX gets 5.6 Msprites/s.

A Radeon 9500 Pro can get 29.1 Msprites/s with the same CPU. So I can’t imagine what the bottleneck would be with a 2.66 GHz Pentium 4 and PC2700 DDR SDRAM (the Pentium 4 has dual channel PC800 RDRAM). Or why other tests are mysteriously slower. There also seems to be a “stuttering” issue where it randomly speeds up for a second in the 3DMark tests. I know 3DMark can’t be used as a true indicator of performance as it’s ultimately CPU limited and probably sensitive to other factors.

Still, maybe the card is just a dud, as far as performance goes, even for an FX 5200?

Reply 7 of 11, by pentiumspeed

User metadata
Rank l33t

Keep in mind:

The FX5200 does not have vertex pipelines. It also has very cut-down features, and clocking is all over the map compared to the reference clocks. Often I see low memory clocks due to slower (higher-nanosecond) VRAM.

The GeForce4 Ti 4200 does, and so does the GeForce3, and they perform better. The GeForce2 does not have vertex pipelines either, but it performs better. The MX 400 and MX 440 are also better, despite having no vertex pipeline.

Cheers,

Great Northern aka Canada.

Reply 8 of 11, by Wester547

User metadata
Rank Newbie

No vertex pipelines? I thought the FX 5200 had a single hardware vertex shader which ran as an “array” and could effectively perform two vertex operations at a time, and that hardware T&L was emulated with the vertex shader plus another T&L pipeline, since one vertex shader alone was not enough. Or did I miss something? I also thought unoptimized VRAM timings could possibly be responsible for the lesser performance.

Reply 9 of 11, by The Serpent Rider

User metadata
Rank l33t++

A regular FX5200 performs about as fast as a GeForce 4 MX 440, so it's not exactly fast for heavy Quake 3 scenes (portals, fillrate-heavy rocket trails and gibs).

I must be some kind of standard: the anonymous gangbanger of the 21st century.