VOGONS


First post, by Kreshna Aryaguna Nurzaman

Rank: l33t

When building a Win9x legacy system, I think the most important consideration is video card compatibility. 3dfx aside, I think compatibility issues can be broken down into three major things:
(a) Windows 98 compatibility
(b) FSAA compatibility with old games (namely alpha textures)
(c) 8-bit paletted texture support

As far as O/S compatibility goes, I believe the best card would be a GeForce 6 series card, because it is the latest GeForce that supports Windows 98.

When it comes to FSAA, I think ATI is not a compelling choice, because Radeons do not support SSAA, only MSAA. Thus, enabling anti-aliasing has no effect in older games that use alpha textures. An example I'm aware of is European Air War, where enabling FSAA has no effect at all. This should be considered by those who love to enable FSAA on older Direct3D games like Incoming or Hellbender. I'm not sure whether those two games use alpha textures, but the point is, many old games do, so a Radeon won't do any good if one wants to enable FSAA on older games.
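
To illustrate what I mean by alpha textures, here is a rough sketch (OpenGL-style, not taken from any particular game; the function and texture names are made up, and the old Direct3D titles do the equivalent thing). A fence or a tree is typically a single textured quad, and the visible/invisible edge comes from an alpha test inside the texture rather than from polygon edges, which is why MSAA (which only smooths polygon edges) leaves it jagged while SSAA smooths it:

    /* Rough sketch of how an old game draws a "cut-out" object such as a
     * fence: one textured quad, with the texture's alpha channel deciding
     * which texels are visible.  The cut-out edge is created by the alpha
     * test *inside* the polygon, so MSAA (which only resamples polygon
     * edges) cannot smooth it; SSAA can, because it shades every sub-sample. */
    #include <GL/gl.h>

    void draw_fence(GLuint fence_texture)    /* hypothetical texture handle */
    {
        glEnable(GL_TEXTURE_2D);
        glBindTexture(GL_TEXTURE_2D, fence_texture);

        /* Discard texels whose alpha is below 0.5: this is the cut-out
         * that plain multisampling leaves jagged. */
        glEnable(GL_ALPHA_TEST);
        glAlphaFunc(GL_GREATER, 0.5f);

        glBegin(GL_QUADS);                   /* a single quad for the whole fence */
        glTexCoord2f(0.f, 0.f); glVertex3f(-1.f, 0.f, 0.f);
        glTexCoord2f(1.f, 0.f); glVertex3f( 1.f, 0.f, 0.f);
        glTexCoord2f(1.f, 1.f); glVertex3f( 1.f, 1.f, 0.f);
        glTexCoord2f(0.f, 1.f); glVertex3f(-1.f, 1.f, 0.f);
        glEnd();

        glDisable(GL_ALPHA_TEST);
    }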

8-bit paletted textures add one more thing to the equation. Newer video cards no longer support 8-bit paletted textures, so certain old games won't run. If we take 8-bit paletted textures into account, then the GeForce 6 is out, because the latest GeForce that supports them is the GeForce 5 (FX) series.
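
For reference, here is roughly what "8-bit paletted texture support" means at the API level; this is just a hedged OpenGL sketch using the GL_EXT_paletted_texture extension (Direct3D games do the equivalent with the P8 surface format), with made-up function and array names. The game uploads one byte per texel plus a 256-entry colour table, and the card looks the colours up on the fly; a card without the feature leaves the game or driver to expand everything to 16/32-bit itself:

    /* Sketch of uploading an 8-bit paletted texture with the
     * GL_EXT_paletted_texture extension (exposed by GeForce FX and earlier,
     * dropped on GeForce 6).  In real code the entry point comes from
     * wglGetProcAddress after checking the extension string. */
    #include <GL/gl.h>
    #include <GL/glext.h>

    void upload_paletted_texture(const unsigned char indices[256 * 256],
                                 const unsigned char palette[256 * 4])
    {
        /* 256-entry RGBA colour table shared by every texel of the texture. */
        glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256, GL_RGBA,
                        GL_UNSIGNED_BYTE, palette);

        /* One byte per texel; each byte is an index into the table above. */
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT, 256, 256, 0,
                     GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
    }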

I have several questions though:

(1) Am I correct about the things I mentioned above? For instance, is there actually a good Radeon card that supports SSAA? If the Radeon X800 actually supports Win98 and SSAA, then it could be a good choice for enabling FSAA on older Direct3D games. Imagine enabling 8xAA on old Direct3D games like European Air War.

(2) Are there other factors I have overlooked besides alpha textures and 8-bit paletted textures?

(3) How many games actually have alpha textures? Is there a list of such games?

(4) Likewise, how many games have 8-bit paletted textures?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 1 of 20, by leileilol

Rank: l33t++
Kreshna Aryaguna Nurzaman wrote:

(3) How many games actually have alpha textures? Is there a list of such games?

Almost every game.

Personally if you're after Legacy support and FSAA i'd go with the Geforce2 GTS 64mb.

long live PCem

Reply 2 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
leileilol wrote:
Kreshna Aryaguna Nurzaman wrote:

(3) How many games actually have alpha textures? Is there a list of such games?

Almost every game.

Personally if you're after Legacy support and FSAA i'd go with the Geforce2 GTS 64mb.

Er, but why not GeForce 5 series? If its legacy support is as good as GeForce 2, I think it is a good choice because it is faster.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 3 of 20, by swaaye

Rank: l33t++

Yeah, I've had good luck with GeForce FX and old games. It has paletted texture support, good multisampled anti-aliasing and texture filtering. They do have supersampling support, but it will really drag down performance. GF FX is much better than GeForce2 in every way, really. Even the rather crappy FX 5200 would rival a GF2 Ultra, I believe. The OpenGL driver is quite good for these cards, as is usual for NV. I have an FX 5600 and an FX 5950 Ultra; the latter is the fastest card that will work in an overclocked AGP 2x-only motherboard. ATI's 9800 XT will fit the slot but won't handle high AGP clocks well at all.

The FX series is awful if you need to run games that use DX9 shader model 2, however. Very slow and usually ugly due to driver hacks to try to make them competitive.

BTW, for alpha textures, the Radeon (post-9500) and GeForce cards (post GF6) support transparent texture AA. It only affects alpha textures and is used in concert with regular full-scene multisample AA. Transparency AA actually has two modes on both cards. The multisample mode somehow uses MSAA for these alpha textures; it smooths them better than nothing and has very little performance hit (if any). Supersample transparency AA, on the other hand, offers better quality at the expense of a good bit of speed. However, transparency AA is definitely faster and gives a crisper overall image than regular full-screen supersampling.
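
If you're curious how the cheap multisample mode pulls that off, my understanding (not vendor documentation) is that it is essentially alpha-to-coverage: the texel's alpha is turned into a sub-sample coverage mask, so the cut-out edge gets blended across the existing MSAA samples almost for free. A game can ask for the same effect itself on a multisampled framebuffer; a minimal OpenGL sketch, with the drawing left out and the function name just illustrative:

    /* Minimal sketch of alpha-to-coverage, the idea behind "multisample"
     * transparency AA (the driver normally does this behind the game's back). */
    #include <GL/gl.h>
    #include <GL/glext.h>   /* GL_MULTISAMPLE_ARB, GL_SAMPLE_ALPHA_TO_COVERAGE_ARB */

    void draw_foliage_with_alpha_to_coverage(void)
    {
        glEnable(GL_MULTISAMPLE_ARB);                 /* normal full-scene MSAA */
        glEnable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);    /* alpha -> coverage mask */

        /* ... draw the alpha-textured geometry here ... */

        glDisable(GL_SAMPLE_ALPHA_TO_COVERAGE_ARB);
    }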

A look at the results of TAA on both Radeon and GeForce.
http://www.bit-tech.net/hardware/2006/04/07/t … _aa_explained/3

Alpha textures are used for objects that would be prohibitively expensive to model in 3D, for whatever reason. Probably every 3D game made uses them.

As for paletted texture support, that's a bit more murky. It apparently is a plus for Voodoo emulation, according to what I've read at zeckensack's page. But I've never noticed a difference with or without it for this, honestly. System Shock 2 apparently uses 8-bit palettized textures, but the game runs really great and looks awesome on a Radeon 9700, which has no support for those. ATI does force 16-bit to actually render as 32-bit, I believe, as I think it is necessary for their AA to function. (GeForce 8, btw, looks horrible in games with only 16-bit support; it seems to lack any dithering whatsoever.) Final Fantasy 7 PC uses palettized textures, I believe, but I've never played that.

Check out what 8800GTX does to Quake3 and System Shock 2 at 16-bit color depth. Shock 2 only supports 16-bit color. Quake 3 supports 32-bit everything but looked so terrible at 16-bit color that I had to capture a shot. 😀 I think this is worse than even the legendary Rage 128's 16-bit color dithering. I've never seen anything this bad and GeForce7 does not do it. Apparently 16-bit depth got little thought from the driver/hardware engineers at NV. Can't blame them I guess heh.
[Screenshots: Quake 3 and System Shock 2 running in 16-bit color on the 8800 GTX]

Reply 4 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
swaaye wrote:

Yeah, I've had good luck with GeForce FX and old games. It has paletted texture support, good multisampled anti-aliasing and texture filtering. They do have supersampling support, but it will really drag down performance.

Great! As for the SSAA performance penalty, I don't really mind so much, because my goal is actually enabling AA on early Direct3D games like Incoming or Helicops; I wonder if there is a card that does it better than my current Voodoo5 5500.

swaaye wrote:

The FX series is awful if you need to run games that use DX9 shader model 2, however. Very slow and usually ugly due to driver hacks to try to make them competitive.

No prob, since this is for old, Win9x games. 😀

swaaye wrote:

As for paletted texture support, that's a bit more murky. It apparently is a plus for Voodoo emulation, according to what I've read at zeckensack's page.

Yup, I read zeckensack's page too, although I don't quite understand why it is a plus for Voodoo emulation. IIRC the lack of 8-bit paletted textures results in some strange artifacts around the object.

swaaye wrote:

But I've never noticed a difference with or without it for this, honestly. System Shock 2 apparently uses 8-bit palettized textures, but the game runs really great and looks awesome on a Radeon 9700, which has no support for those.

I see. For Win9x games, I think it is a toss-up between the GeForce FX and the GeForce 6. Both support Win98, but the GeForce FX supports 8-bit paletted textures while the GeForce 6 does not (of course, the GeForce 6 is faster).

But I'm actually surprised that SS2 runs on a Radeon 9700, because apparently games with 8-bit paletted textures cannot run (or do not run well) on cards that don't support them.

How many games actually have 8-bit paletted textures? Is there a list somewhere? And why do games like SS2 run while some others don't?

Last edited by Kreshna Aryaguna Nurzaman on 2007-10-03, 02:02. Edited 1 time in total.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 5 of 20, by leileilol

Rank: l33t++

i've always had bad luck with the GeforceFX (especially 5200. Very slow card, runs Q3A half the speed of the Geforce2. The "shader games" are not the only thing affected by this card's abysmal performance. It's overall bad)

in particular, the fx drivers don't have the same compatibility with legacy video modes as older Geforce2 drivers

long live PCem

Reply 6 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
leileilol wrote:

i've always had bad luck with the GeforceFX (especially 5200. Very slow card, runs Q3A half the speed of the Geforce2. The "shader games" are not the only thing affected by this card's abysmal performance. It's overall bad)

Well, it is not very surprising, since the GeForce FX 5200's fill rate is only half that of the GeForce2 Ti (1000 MT/s compared to 2000 MT/s). A GeForce FX 5900 or 5950 should run Q3A faster than a GeForce2, though.
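
(If I have the usual quoted clocks and pipe configurations right, that works out roughly as:

    FX 5200:      250 MHz x 4 pipelines x 1 TMU  = 1000 MT/s
    GeForce2 Ti:  250 MHz x 4 pipelines x 2 TMUs = 2000 MT/s

so at equal clocks the FX 5200 simply lays down half as many texels per clock.)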

leileilol wrote:

in particular, the fx drivers don't have the same compatibility with legacy video modes as older Geforce2 drivers

You mean DOS video modes?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 7 of 20, by leileilol

Rank: l33t++
Kreshna Aryaguna Nurzaman wrote:

You mean DOS video modes?

Sure, if you want to call them that. It's not very fun to have a fatal black screen freeze after a text mode Blue Screen of Death(R), or attempting to fullscreen a command prompt. Later detonators/forceware did that.

long live PCem

Reply 8 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
leileilol wrote:
Kreshna Aryaguna Nurzaman wrote:

You mean DOS video modes?

Sure, if you want to call them that. It's not very fun to have a fatal black screen freeze after a text mode Blue Screen of Death(R), or attempting to fullscreen a command prompt. Later detonators/forceware did that.

Ah, but such a thing shouldn't matter when building a Win9x legacy system, right? I mean, if you're only concerned with early Direct3D games like Hellbender, legacy video modes shouldn't pose any problem.

How about 8-bit paletted textures, by the way? How do you tell which games have 8-bit paletted textures? If such games are rare, then one can just go with the GeForce 6 for a Win9x system.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 9 of 20, by leileilol

Rank: l33t++
Kreshna Aryaguna Nurzaman wrote:

Ah, but such a thing shouldn't matter when building a Win9x legacy system, right?

It damn well matters. With a newer driver, a BSOD will strike you at least five times, with a hard reset needed every time. Even worse is launching a full-screen DOS app and having the screen go permanently black.

Kreshna Aryaguna Nurzaman wrote:

How about 8-bit paletted textures, by the way? How do you tell which games have 8-bit paletted textures?

Ridiculous question. Common sense plays a factor in this, i.e. anything made before 2000, during 3dfx's reign of power. You should not have an issue with 8-bit texture support if you stick to older drivers, as i've constantly suggested (along with the GeForce2, to be able to use even older, more compatible and stable 9x drivers like Detonator 12.41).

long live PCem

Reply 10 of 20, by swaaye

Rank: l33t++

An FX 5200 is definitely faster than a GeForce2. And I bet it wouldn't be as blurry as some of those GeForce 256 and GeForce2 cards tended to be. Yuck. Do any of those cards still have working fans? (lol, I really do wonder!)

It beats a Radeon 9000 here. Until shaders are needed, but we don't care about that hehe. I believe I've read that 5200 is a 4x1 pipe card until you ask it to use pixel shaders, and then it becomes a 2x1 basically. Those FX chips were strange beasts.
http://techreport.com/articles.x/5065/8

I can't attest to their DOS mode support though. But I wouldn't say they are inherently unstable. They do work fine for lots of Windows games, even older stuff. I use both an FX 5600 and an FX 5950 for some of that. They may be trouble on a Super 7 board. Who knows. Most AGP cards had problems on Super 7 boards.

If you want to run a Super 7 board, I highly suggest you just get a Voodoo5 5500 AGP. Those have been very successful for me on those boards. Other cards will do strange things, running unstable and having random issues. And you then get native Glide support, which we all know is really great for games from 1998-2001 or so.

I would stay away from both Radeon and GeForce cards with Super 7 boards.

Reply 11 of 20, by dh4rm4

Rank: Oldbie

I ran an FX 5900 XT for ages across three different motherboards and many older and 2004/2005 titles (including PS2.0-laden Source-based games like HL2 and Counter-Strike: Source) quite happily until I went to a 6600GT. For the most part the 5900 ran like a dream, and I still feel its video playback is equal to the 7800GT (which I purchased after the 6600GT), even for HD stuff (WMV HD and QuickTime 7).

Some FX 5200s, particularly those of the crap variety, could be slower than a GF4 Ti for certain OGL and D3D titles due to a pathetic RAMDAC and slow-as-hell cheapo RAM. This would invariably make them slower than some of the higher-end GF2s.

I agree with the Super 7 comments - AGP is a hack on those boards, and utilising a card that uses no DIME or sideband addressing really is a must.

Reply 12 of 20, by swaaye

Rank: l33t++

Source runs in DX8 (PS1.4) mode on FX series cards. 😀 You can force it to do otherwise, but it will run a lot slower.

Yeah, stay far away from FX 5200 cards with slow RAM. Kinda tough to figure that out without a part number, though. If it has 256MB of RAM (the Ultras are always 128MB), that's usually a bad sign. In those days the manufacturers would dump cheap, slow, high-density RAM chips on boards because it was cheaper than lower-density but much faster RAM. And we all know a lot of people buy based on RAM size.

Well, they still do that nowadays too.

Reply 13 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
dh4rm4 wrote:

I ran an FX 5900 XT for ages across three different motherboards and many older and 2004/2005 titles (including PS2.0-laden Source-based games like HL2 and Counter-Strike: Source) quite happily until I went to a 6600GT.

What happened after you switched to GeForce 6?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 14 of 20, by dh4rm4

Rank: Oldbie

Some things (well, most) were faster. Some video playback improved, especially HD, but SD was sometimes worse.

Re: Source and PS 1.4 via -shadermode (or whatever the switch was) - I know... it's funny, though, that in some ways the FX 5900's PS 2.0 seemed to be of higher precision (though slower in many cases) than the 6600GT's.

Reply 15 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t

Duh! There's something I overlooked while reading this thread.

swaaye wrote:

System Shock 2 apparently uses 8-bit palettized textures, but the game runs really great and looks awesome on a Radeon 9700, which has no support for those.

So am I correct to say that only certain games with 8-bit paletted textures fail to run on non-supporting video cards? And why? And is there any known list of such games?

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 16 of 20, by swaaye

Rank: l33t++

Apparently Matrox Mystique and S3 Savage cards also support palettized textures. I did some Googling. Final Fantasy VII and VIII use palettized textures.

The only way to really know whether there are going to be any issues is to Google specific games. Look on Google Groups. I think palettized texturing support will be a rare problem, though. I've never run into it personally. Just have a card that supports them and then there's no problem at all.

Reply 17 of 20, by leileilol

Rank: l33t++

System Shock 2 internally loads them as 16-bit textures.

long live PCem

Reply 18 of 20, by Kreshna Aryaguna Nurzaman

Rank: l33t
leileilol wrote:

System Shock 2 internally loads them as 16-bit textures.

Then System Shock 2 is an exception, while generally all games with 8-bit paletted textures just don't run at all on non-supporting cards?

swaaye wrote:

Just have a card that supports them and then there's no problem at all.

Then I guess the safest card for a Win9x legacy system (old Direct3D games) is either the GeForce FX or GeForce 4 series, while the GeForce 6 is out of the question due to its lack of 8-bit paletted texture support.

Never thought this thread would be that long, but now, for something different.....
Kreshna Aryaguna Nurzaman.

Reply 19 of 20, by Davros

Rank: l33t

While later GF cards lack hardware support for paletted textures, they do have software support for them, same as ATI cards.
I do believe they will run.
If someone can post a list of a few games to test, I will test them.
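
(Roughly what I understand that software path to boil down to, just as a sketch and not actual driver code: the driver or wrapper expands each 8-bit index through the palette into a plain 32-bit texture on the CPU and uploads that instead, so the game still runs; it just costs a little upload time and memory.)

    /* Sketch of the software fallback: expand each 8-bit palette index into
     * a 32-bit RGBA texel on the CPU, then upload an ordinary RGBA texture
     * that any card supports.  Array names are illustrative. */
    #include <stdlib.h>
    #include <GL/gl.h>

    void upload_expanded(const unsigned char *indices,   /* w*h index bytes  */
                         const unsigned char *palette,   /* 256 RGBA entries */
                         int w, int h)
    {
        unsigned char *rgba = malloc((size_t)w * h * 4);
        if (!rgba)
            return;
        for (int i = 0; i < w * h; i++) {
            const unsigned char *entry = &palette[indices[i] * 4];
            rgba[i * 4 + 0] = entry[0];
            rgba[i * 4 + 1] = entry[1];
            rgba[i * 4 + 2] = entry[2];
            rgba[i * 4 + 3] = entry[3];
        }
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, w, h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, rgba);
        free(rgba);
    }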

ps: you know the prob with AvP on GF cards? it's been fixed; it was down to a Z-buffer problem, not paletted textures

pps: just tried Q3 with 16-bit textures, and while the sky has banding, all the other textures are fine

guess what frame rate I'm getting?
1000 fps - woot