VOGONS


Video card for a 98/99 era system…


Reply 20 of 36, by TrashPanda


ATI has had pretty terrible DOS compatibility from the Radeon onwards. The pre-Radeon stuff does have excellent DOS compatibility for the most part, and anything from nVidia before the GeForce 6 also has excellent DOS compatibility, as do the Voodoo Banshee and Voodoo 3.

Reply 21 of 36, by darry

TrashPanda wrote on 2022-02-26, 23:57:

ATI has had pretty terrible DOS compatibility from the Radeon onwards. The pre-Radeon stuff does have excellent DOS compatibility for the most part, and anything from nVidia before the GeForce 6 also has excellent DOS compatibility, as do the Voodoo Banshee and Voodoo 3.

I concur. I'm not sure whether all early Radeons have similar issues with VGA modes under DOS, but I don't recall ever getting one to work in Future Crew's Second Reality (on my 9700 and others that I don't remember all that clearly). See also: Re: 4:3 and refresh rate on a modern 24" LCD

Reply 22 of 36, by Meatball

TrashPanda wrote on 2022-02-26, 23:57:

ATI has had pretty terrible DOS compatibility from the Radeon onwards. The pre-Radeon stuff does have excellent DOS compatibility for the most part, and anything from nVidia before the GeForce 6 also has excellent DOS compatibility, as do the Voodoo Banshee and Voodoo 3.

My experience is that ATI has lousy DOS compatibility from the Rage Pro 3D onward (I don't know about earlier chips). Terminator: Future Shock won't activate 640x480 mode (with the SkyNET upgrade), and many DOS games (like Commander Keen) exhibit obnoxious tearing. Gona's list is dead on: stay away from ATI. I don't bother with anything nVidia for DOS, but the TNT and TNT Ultra I used also had trouble with Terminator. The Matrox G400 is also among the cards with DOS tearing and problems with Terminator: Future Shock. Now, if anyone doesn't play these games, or perhaps I just have a string of BIOS revisions that don't work properly, or wonky hardware, or I plain don't know what I'm doing (which I think I do, but I've been wrong before), I will accept that reasoning. Everyone has an experience where one brand is great and the other is terrible.

For a usually flawless DOS experience while keeping the ability to run the 3D games of the time at a high level, post-Voodoo2 3dfx cards are nearly bulletproof (and I don't have to know what I'm doing... that's the beauty of it all; they just work).

Reply 23 of 36, by TrashPanda

Meatball wrote on 2022-02-27, 01:34:
TrashPanda wrote on 2022-02-26, 23:57:

ATI has had pretty terrible DOS compatibility from the Radeon onwards. The pre-Radeon stuff does have excellent DOS compatibility for the most part, and anything from nVidia before the GeForce 6 also has excellent DOS compatibility, as do the Voodoo Banshee and Voodoo 3.

My experience is that ATI has lousy DOS compatibility from the Rage Pro 3D onward (I don't know about earlier chips). Terminator: Future Shock won't activate 640x480 mode (with the SkyNET upgrade), and many DOS games (like Commander Keen) exhibit obnoxious tearing. Gona's list is dead on: stay away from ATI. I don't bother with anything nVidia for DOS, but the TNT and TNT Ultra I used also had trouble with Terminator. The Matrox G400 is also among the cards with DOS tearing and problems with Terminator: Future Shock. Now, if anyone doesn't play these games, or perhaps I just have a string of BIOS revisions that don't work properly, I will accept that reasoning. Everyone has an experience where one brand is great and the other is terrible.

For a usually flawless DOS experience while keeping the ability to run the 3D games of the time at a high level, 3dfx cards are nearly bulletproof.

Well... the Terminator games... they are the Bethesda ones, right? If so, then there is your problem: they were using a proprietary in-house 3D engine (XnGine) from Battlespire, which itself is an extension of Daggerfall's engine.

The XnGine 3D engine was garbage. I'm surprised they used it for so long; its only saving grace was that it could be adapted to pretty much any game, but it was unstable hot trash.

So they are not a great example here, but the Keen games should be bulletproof. The fact that they bug out on ATI cards is troubling; perhaps it's the VESA version ATI was using.
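If you want to check which VESA VBE version a card's BIOS actually reports, the VbeInfoBlock returned by INT 10h, AX=4F00h starts with a "VESA" signature followed by a BCD-style version word (per the VBE spec). Here's a minimal sketch that decodes those two fields from a saved dump of that block; the dump bytes below are hypothetical, not taken from any particular ATI BIOS:

```python
import struct

def parse_vbe_info(block: bytes):
    """Decode signature and version from a VbeInfoBlock dump.

    Layout (VBE spec): bytes 0-3 = 'VESA' signature,
    bytes 4-5 = version word, major in the high byte
    (e.g. 0x0200 -> VBE 2.0).
    """
    sig = block[0:4].decode("ascii")
    (version,) = struct.unpack_from("<H", block, 4)
    major, minor = version >> 8, version & 0xFF
    return sig, f"{major}.{minor}"

# Hypothetical dump from a card reporting VBE 2.0
dump = b"VESA" + struct.pack("<H", 0x0200) + bytes(506)
print(parse_vbe_info(dump))  # ('VESA', '2.0')
```

A card stuck on VBE 1.2 versus 2.0+ would explain differences in which high-resolution modes games can set.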

Reply 24 of 36, by Meatball

TrashPanda wrote on 2022-02-27, 01:41:
Meatball wrote on 2022-02-27, 01:34:
TrashPanda wrote on 2022-02-26, 23:57:

ATI has had pretty terrible DOS compatibility from the Radeon onwards. The pre-Radeon stuff does have excellent DOS compatibility for the most part, and anything from nVidia before the GeForce 6 also has excellent DOS compatibility, as do the Voodoo Banshee and Voodoo 3.

My experience is that ATI has lousy DOS compatibility from the Rage Pro 3D onward (I don't know about earlier chips). Terminator: Future Shock won't activate 640x480 mode (with the SkyNET upgrade), and many DOS games (like Commander Keen) exhibit obnoxious tearing. Gona's list is dead on: stay away from ATI. I don't bother with anything nVidia for DOS, but the TNT and TNT Ultra I used also had trouble with Terminator. The Matrox G400 is also among the cards with DOS tearing and problems with Terminator: Future Shock. Now, if anyone doesn't play these games, or perhaps I just have a string of BIOS revisions that don't work properly, I will accept that reasoning. Everyone has an experience where one brand is great and the other is terrible.

For a usually flawless DOS experience while keeping the ability to run the 3D games of the time at a high level, 3dfx cards are nearly bulletproof.

Well... the Terminator games... they are the Bethesda ones, right? If so, then there is your problem: they were using a proprietary in-house 3D engine (XnGine) from Battlespire, which itself is an extension of Daggerfall's engine.

The XnGine 3D engine was garbage. I'm surprised they used it for so long; its only saving grace was that it could be adapted to pretty much any game, but it was unstable hot trash.

Yes, Bethesda. I didn't know their engine was junk, but I do know it runs fine on my Voodoo 3500, my 5500 (the first one I owned), and in DOSBox.

Reply 25 of 36, by Sunflux


So, as I continue to stray further and further from my original goal (LOL), what's to hate or love about the GeForce FX series? They seem to be available in all price ranges, have 3.3V AGP support, reportedly still have good DOS support, etc.

So… why are they cheap?

Reply 26 of 36, by Cuttoon

Sunflux wrote on 2022-02-27, 16:57:

So… why are they cheap?

Well, generally speaking, they're cheap because they don't fit in PCIe slots, don't support SLI, and a contemporary RTX 3090 Ti has better overall 3D performance 😜

The FX 5xxx series generally has a bad rep. Dunno why, but apparently it wasn't Nvidia's finest hour.

Phil has a bit of an obsession with them:
https://www.youtube.com/watch?v=rc4vivgEriU
Including the FX5200, which was available in plain old PCI, as one of the last cards ever offered that way.

There's also the GeForce 6200 as a PCI card, and it's not even that rare. Zotac was a common brand, available for maybe 20 bucks.

As I understand it, those cards in old systems only make some academic sense if you want to exclude the graphics card as a bottleneck or try certain features or benchmarks.

I'd rather go for period-correct stuff, but that surely is a matter of taste.

Attachment: zotac.JPG


Reply 28 of 36, by leileilol

Cuttoon wrote on 2022-02-27, 17:36:

The FX 5xxx series generally has a bad rep. Dunno why, but apparently it wasn't Nvidia's finest hour.

Pros:
- commonly sharper picture than prior GeForces
- decent TV-out that works from boot
- still supports paletted textures
- highest grade of obtainium (even for its day)
- the "FX" reminds you of some company... what did TDFX do again?
- if you're upgrading your dude-driven Dell Dimension, it's an acceptable leap up from Intel Extreme Graphics, I guess
Cons:
- LOUD OG high-end models
- slow Pixel Shader 2.0 (for the latest games in '03+ you'd want a Radeon 9500+)
- debuted with a horrible authoritative slogan
- nVidia cheated in 3DMark03 benchmarks
- 64-bit variants aren't always clearly labeled
- later Win9x drivers seem sabotaged (6x.xx)
- the extremely big GL extension list crashes some older OpenGL games and apps, despite the card allegedly being the API's darling
- still has the same ugly nVidia dithering table
- still has the same blocky texture filtering
- still has paletted DirectDraw issues (slowdowns and some Win9x fatals)
- still absolutely hates the presence of a PowerVR PCX2

As GeForce suggestions get thrown about in the face of era-appropriateness, the RAMDAC requirement won't be satisfied either, as those earlier GeForces were quite blurry!

Matrox cards in DOS would stutter in Keen4 (though not as much as some other cards *cough*PowerVR*cough*) and would have palette-corruption issues in some SVGA games.


Reply 29 of 36, by Sunflux


Thanks. I remember that "back in the day" I always felt I got a much better-quality picture from ATI or Matrox cards than from pretty much anything else, especially since I typically ran Windows at high refresh rates (90-120 Hz, depending on resolution and card capabilities).

I think I'm just going to grab an FX with DVI. I don't think the 1-2 year difference between that and the other acceptable options is anything to worry about. And if I hate not having a GPU bottleneck, I'll look for something more 1998-1999-ish. 😀

Reply 30 of 36, by Dolenc


I have an FX 5900XT that was kind of meant as a backup plan in case the Voodoo 5 sucked.

I've been using the V5 for a couple of months, so I've gotten quite used to it.

The time came to try out the FX. Disclaimer: it was a short test, and I used "newer" Win98 drivers, definitely not 5x.xx; I forget which version.
Plugged in via DVI.

First I started Quake 3 and... ugh. The image was just crap: blurry texture filtering, and the reds don't pop like they did on the Voodoo. Yes, you can use 32-bit color, AA, and a higher resolution, and that will fix the image (but still no reds like on the V5 with increased LOD).

I changed the resolution to 1920x1080 and the screen went blank, WTF... OK, I tried a couple more; those work, but 1280x720 is blank again.

I may have been a bit too harsh on it and would need to try again with older drivers, but the GPU made such a bad first impression that it now sits in the box.

Reply 31 of 36, by shamino


FX cards will require later drivers, but I don't know if there's a specific reason that would matter. Sometimes people want the option of older drivers because of rendering bugs that get introduced into old games after some version.
I know that with K6 machines the older drivers are a lot faster, but for a Katmai maybe there isn't the same performance regression, since nVidia would have still considered those CPUs relevant for a lot longer and so probably retained code optimizations for them.

People are afraid of FX5200/FX5500 cards because so many of them are crippled with a 64-bit memory bus. If you find one that you're confident is 128-bit, then you might get a good price on it.
Beyond that, the whole FX generation got a bad reputation for being terrible at DirectX 9, which you probably don't care about. The older cards don't support it at all, so it's no loss; if you look at the FX series as DirectX 8 cards, they can be appealing.
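The 64-bit worry is easy to quantify: peak memory bandwidth is just bus width (in bytes) times effective memory clock, so halving the bus halves the bandwidth. A quick back-of-the-envelope sketch (the 200 MHz DDR figure is illustrative, roughly FX5200-class, not a spec quote):

```python
def peak_bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float, ddr: bool = True) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes * effective clock."""
    effective_mhz = mem_clock_mhz * (2 if ddr else 1)  # DDR transfers twice per clock
    return (bus_bits / 8) * effective_mhz * 1e6 / 1e9

# Illustrative 200 MHz DDR memory on the same chip:
print(peak_bandwidth_gb_s(128, 200))  # -> 6.4 GB/s
print(peak_bandwidth_gb_s(64, 200))   # -> 3.2 GB/s, half the bandwidth
```

That halving is why a 64-bit FX5200 performs so much worse than a 128-bit one despite carrying the same name on the box.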

GeForce2 MX cards are common and can use very old drivers, or there are the more powerful (maybe expensive) GeForce2 or GeForce3 options, which can also use old drivers. But, as above, I don't know if that really matters.
A Quadro FX 500 is essentially a 128-bit FX5500, in case you want to go that route. I assume it probably has good image quality; I have one, but it's been years since I used it, so I don't really remember.

Reply 32 of 36, by TrashPanda

shamino wrote on 2022-03-03, 02:42:

FX cards will require later drivers, but I don't know if there's a specific reason that would matter. Sometimes people want the option of older drivers because of rendering bugs that get introduced into old games after some version.
I know that with K6 machines the older drivers are a lot faster, but for a Katmai maybe there isn't the same performance regression, since nVidia would have still considered those CPUs relevant for a lot longer and so probably retained code optimizations for them.

People are afraid of FX5200/FX5500 cards because so many of them are crippled with a 64-bit memory bus. If you find one that you're confident is 128-bit, then you might get a good price on it.
Beyond that, the whole FX generation got a bad reputation for being terrible at DirectX 9, which you probably don't care about. The older cards don't support it at all, so it's no loss; if you look at the FX series as DirectX 8 cards, they can be appealing.

GeForce2 MX cards are common and can use very old drivers, or there are the more powerful (maybe expensive) GeForce2 or GeForce3 options, which can also use old drivers. But, as above, I don't know if that really matters.
A Quadro FX 500 is essentially a 128-bit FX5500, in case you want to go that route. I assume it probably has good image quality; I have one, but it's been years since I used it, so I don't really remember.

I actually like the FX5600 Ultra/5700 Ultra cards. They usually have a 128-bit bus and don't generally cost a fortune, and, being middle of the road, they're neither overpowered for retro rigs nor lacking in power, which removes the possibility of being GPU-limited and just lets the CPU flex itself with some GPU breathing room.

I've even used the FX5200 Ultra, which is also a great little card.

Reply 33 of 36, by Sunflux


Unfortunately, I'm finding that adding the word "Ultra" to a GeForce card, regardless of its true performance, generally doubles or triples the price automatically, just as adding "3dfx" to anything seems to do…

Reply 34 of 36, by TrashPanda

Sunflux wrote on 2022-03-03, 16:50:

Unfortunately I’m finding that adding the word “Ultra” to a Geforce card, regardless of its true performance, generally automatically doubles or triples the price. Just like adding “3DFX” to anything also seems to do the same…

I picked the Ultra versions of those two in particular because the normal versions are rather... underwhelming, and you're better off with a 5800 or 5900 if you can't find an Ultra 5600/5700.

Reply 35 of 36, by The Serpent Rider


Normal 5600/5700 cards are affordable and have decent performance with some overclocking; the Ultra variants are not. But you can also buy a Quadro FX 1100, which sits somewhere between a normal FX 5700 and a 5700 Ultra.


Reply 36 of 36, by TrashPanda

The Serpent Rider wrote on 2022-03-04, 12:27:

Normal 5600/5700 cards are affordable and have decent performance with some overclocking; the Ultra variants are not. But you can also buy a Quadro FX 1100, which sits somewhere between a normal FX 5700 and a 5700 Ultra.

Just bought a 5700 Ultra for cheap. The Quadro FX 1100 goes for around 400 USD though, so going the Quadro route first is perhaps not advisable (the average price was pretty high for the Quadros; there were a couple of untested ones for 250-300 USD).

By cheap I mean cheap for an Aussie importing from the US: it was ~90 USD for the card, and then you add import duties, conversion rates, and GST... and that 90-dollar card quickly becomes more expensive, but it's still cheaper than I've paid for some other GPUs recently.
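For anyone else importing, the markup above is simple compounding. A rough landed-cost sketch; the duty and GST rates here are illustrative assumptions (Australian GST is 10%, but the duty rate and fee structure vary by shipment), not a customs quote:

```python
def landed_cost_usd(price_usd: float, shipping_usd: float = 0.0,
                    duty_rate: float = 0.05, gst_rate: float = 0.10) -> float:
    """Rough landed cost: (price + shipping), plus duty, then GST on that total."""
    base = price_usd + shipping_usd
    dutied = base * (1 + duty_rate)       # import duty applied first
    return round(dutied * (1 + gst_rate), 2)  # GST charged on the dutied amount

print(landed_cost_usd(90))  # -> 103.95, before currency-conversion fees
```

Add card-scheme conversion fees and shipping on top and a "90 dollar" card can easily land well past 120 USD equivalent.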