VOGONS


Table Fog & 8-bit Paletted Textures


First post, by rob8086

Rank: Newbie

What are they and why do they matter?

I gather that these technologies are relied upon by older titles (specifically DirectX v3-5, pre-1998) and I understand Radeons and GeForce 6+ do not support them. Is that to say Radeons do not support DirectX/D3D v3-5? Can anyone tell me what examples of these technologies might be, what games and effects might rely on them, etc? I've got a Radeon machine and am building a GeForce machine, and I'd like to compare performance between the two using any relevant titles. The only one I know to test right now is Shadows of the Empire.

Sorry if this is a dumb question. If it's already been asked (I searched first, I promise!), kindly point me in the right direction. Thanks, all!

Edit: So if my understanding is correct, table fog is the method used to render fog in a plethora of older titles, up to at least Thief II (2000).

/EDIT

List of games with table fog support
List of games with 8-bit paletted texture support

Last edited by DosFreak on 2022-12-03, 13:43. Edited 5 times in total.

Corruptor : ASUS TUV4X - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - ATI Radeon 8500 - Creative Sound Blaster Audigy2 ZS SB0350
Aggressor : ASUS TUSL2 - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - NVIDIA GeForce 4 Ti 4600 - Creative Sound Blaster Audigy2 ZS SB0350

Reply 1 of 553, by firage

Rank: Oldbie

They were features supported natively by 3dfx hardware. Even the contemporary NVIDIA gear predating the GeForce 256 lacked paletted textures and only supported table fog through emulation.

Table fog makes a visual difference in stuff as late as Thief II (2000).

Final Fantasy VII (1998) refused to enable D3D acceleration if paletted textures weren't supported, but the requirement was soon patched out. I believe it mostly amounts to a performance feature.

My big-red-switch 486

Reply 2 of 553, by rob8086

Rank: Newbie

Thank you for that response. It led me to search a little differently, and I stumbled upon a partial list of titles where table fog (which I now understand to be the mechanism by which fog is rendered - duh) is actually a really big deal. The list I found (so far):

* Centipede
* Rainbow Six: Rogue Spear
* Test Drive 5 (patch 1.1)
* Thief II: The Metal Age
* Shadows of the Empire

I'm still not 100% clear on what paletted or palettized textures actually are... though it is definitely helpful to know of at least one game that requires them for hardware acceleration.

Corruptor : ASUS TUV4X - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - ATI Radeon 8500 - Creative Sound Blaster Audigy2 ZS SB0350
Aggressor : ASUS TUSL2 - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - NVIDIA GeForce 4 Ti 4600 - Creative Sound Blaster Audigy2 ZS SB0350

Reply 3 of 553, by The Serpent Rider

Rank: l33t++

Paletted textures were a common way to save memory with little loss in color quality, mostly used on early 3D game consoles up through the PS2. And 3dfx has arcade hardware roots.

Last edited by The Serpent Rider on 2018-03-08, 02:21. Edited 1 time in total.

I must be some kind of standard: the anonymous gangbanger of the 21st century.

Reply 4 of 553, by shiva2004

Rank: Member
rob8086 wrote:

Thank you for that response. It led me to search a little differently, and I stumbled upon a partial list of titles where table fog (which I now understand to be the mechanism by which fog is rendered - duh) is actually a really big deal. The list I found (so far):

* Centipede
* Rainbow Six: Rogue Spear
* Test Drive 5 (patch 1.1)
* Thief II: The Metal Age
* Shadows of the Empire

I'm still not 100% clear on what paletted or palettized textures actually are... though it is definitely helpful to know of at least one game that requires them for hardware acceleration.

A paletted texture is a texture that uses at most 256 colours (or even fewer; the famous Model 2 arcade boards could use 2-colour textures), selected from a larger colour space, such as 64K (16-bit) or 16.7M (24/32-bit) colours.
They're used to save space, and in their day they were also faster to render; even the PS2 uses a lot of paletted textures, and sometimes the PC ports of those games use them too, causing graphical glitches on modern hardware.
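
To make that concrete, here's a minimal sketch (illustrative C++, not from any actual driver; the function name is made up) of the fallback a driver has to perform when the GPU can't sample paletted textures natively: every texel index gets expanded through the palette before upload.

#include <cstdint>
#include <vector>

// Expand an 8-bit paletted texture into 32-bit RGBA.
// 'indices' holds one palette index per texel; 'palette' holds 256 RGBA entries.
std::vector<uint32_t> expandPaletted(const std::vector<uint8_t>& indices,
                                     const uint32_t palette[256])
{
    std::vector<uint32_t> rgba(indices.size());
    for (size_t i = 0; i < indices.size(); ++i)
        rgba[i] = palette[indices[i]]; // one table lookup per texel
    return rgba;
}

// Memory cost: a 256x256 texture is 64 KB of indices plus a 1 KB palette when
// paletted, but 256 KB once expanded to 32-bit RGBA.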

Reply 5 of 553, by rob8086

Rank: Newbie

Awesome! Thank you, appreciate the mini-lesson.

So it sounds like it's mostly ports that would require paletted texture support. Considering how many titles that could cover (Mortal Kombat, GTA, Tony Hawk, FF off the top of my head), that seems pretty important!

Corruptor : ASUS TUV4X - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - ATI Radeon 8500 - Creative Sound Blaster Audigy2 ZS SB0350
Aggressor : ASUS TUSL2 - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - NVIDIA GeForce 4 Ti 4600 - Creative Sound Blaster Audigy2 ZS SB0350

Reply 6 of 553, by F2bnp

Rank: l33t

I think there was a similar thread somewhat recently in which people tried to find games that required paletted texture support. Aside from FF VII and VIII, it was really only a couple of other games that had issues, so you're probably not missing much. Table fog support, however, was more widespread, and a lot of games can look very bland without it. Shadows of the Empire is the classic example that comes to mind.

Reply 7 of 553, by silikone

Rank: Member
firage wrote:

They were features supported natively by 3dfx hardware. Even the contemporary NVIDIA gear predating the GeForce 256 lacked paletted textures and only supported table fog through emulation.

Table fog makes a visual difference in stuff as late as Thief II (2000).

Final Fantasy VII (1998) refused to enable D3D acceleration if paletted textures weren't supported, but the requirement was soon patched out. I believe it mostly amounts to a performance feature.

Is fog emulation anything more than an additional pass over the framebuffer?

Do not refrain from refusing to stop hindering yourself from the opposite of watching nothing other than that which is by no means porn.

Reply 8 of 553, by DosDaddy

Rank: Newbie

You can think of an 8-bit paletted texture as a simple, non-animated GIF image, and a video card as a bitmap editor: if the bitmap editor does not natively support these lightweight images, which come bundled with their own color palette, they are automatically converted to 24-bit before they can be touched any further.

Problem is, going from 8-bit to 24-bit adds 2 bytes to each pixel, tripling the image size (a 256x256 texture grows from 64 KB plus a 768-byte palette to 192 KB), so more memory is required to process it.

Put another way, a video card that supports paletted textures doesn't need to convert the smaller, 8-bit textures to 24 bits, so less video memory is required.

But that's not all there is to it. If the paletted texture has transparency, it can't simply be converted to 24-bit, because there is nowhere to put the transparency information back into the picture. That could be solved by going 32-bit instead (the extra byte being an alpha channel that replaces the transparent palette entry of the 8-bit texture), or by coming up with a custom texture format, or God knows what else. But as it turns out, no workaround, elegant or not, has been available since the GeForce FX: anything newer will not handle paletted textures correctly, leaving you with nasty black outlines that destroy the edges of sprites (which, at the lowest level, are generally handled like any other texture).

Reply 9 of 553, by rob8086

Rank: Newbie

Oh, wow. Thank you for that explanation. After reading this, I have like... zero interest in another ATI build. Haha

Corruptor : ASUS TUV4X - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - ATI Radeon 8500 - Creative Sound Blaster Audigy2 ZS SB0350
Aggressor : ASUS TUSL2 - Intel SL6BY @ 1.4GHz - 2x 256MB PC133 - NVIDIA GeForce 4 Ti 4600 - Creative Sound Blaster Audigy2 ZS SB0350

Reply 10 of 553, by leileilol

Rank: l33t++

Well, going 24-bit/32-bit wasn't really an option for most paletted textures in their heyday, given the APIs and the common 3D cards of the time. Most games took the RGBA4444/RGB565 approach, with significant color precision loss (though some argued back then that 16-bit textures meant higher quality).
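
For illustration, that 16-bit packing amounts to throwing away the low bits of each 8-bit channel (a sketch, not any particular game's code; the bit orderings shown follow the common D3D-style layouts, which is an assumption since they vary by API):

#include <cstdint>

// RGB565: 5 bits red, 6 green, 5 blue, no alpha.
uint16_t packRGB565(uint8_t r, uint8_t g, uint8_t b)
{
    return static_cast<uint16_t>(((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3));
}

// ARGB4444: 4 bits per channel, keeping a coarse alpha.
uint16_t packARGB4444(uint8_t a, uint8_t r, uint8_t g, uint8_t b)
{
    return static_cast<uint16_t>(((a >> 4) << 12) | ((r >> 4) << 8) |
                                 ((g >> 4) << 4) | (b >> 4));
}

The discarded bits are exactly the color precision loss mentioned above.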

DosDaddy wrote:

But that's not all there is to it. If the paletted texture has transparency, it can't simply be converted to 24-bit, because there is nowhere to put the transparency information back into the picture.

It's trivial to convert to 32-bit RGBA while reading a chroma key index.... it's exactly what GLQuake does for the menus and sprites.
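
Something like this sketch (GLQuake's actual code differs in detail; treating index 255 as the transparent key is an assumption borrowed from Quake's palette):

#include <cstdint>
#include <vector>

// Expand 8-bit indexed texels to 32-bit RGBA, zeroing alpha wherever the
// chroma-key index appears. Palette entries are assumed to carry alpha in
// the top byte.
std::vector<uint32_t> expandWithChromaKey(const std::vector<uint8_t>& indices,
                                          const uint32_t palette[256],
                                          uint8_t keyIndex = 255)
{
    std::vector<uint32_t> rgba(indices.size());
    for (size_t i = 0; i < indices.size(); ++i) {
        uint32_t c = palette[indices[i]] | 0xFF000000u; // opaque by default
        if (indices[i] == keyIndex)
            c &= 0x00FFFFFFu;                           // key texel: alpha = 0
        rgba[i] = c;
    }
    return rgba;
}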

long live PCem

Reply 11 of 553, by KCompRoom2000

Rank: Oldbie

So now that we have an idea of how widespread 8-bit paletted textures are, I'm guessing that either the method of handling them changed somewhere between DirectX 5 and 6, or the cards that explicitly lack support for them (e.g. Radeons and GeForce 6xxx+) render them as 24/32-bit textures (or both)?

I'm not too worried about table fog. I did the registry patch to re-activate it on my Radeon 7200, and I can see fog in the weather effects in Monster Truck Madness 2 (assuming that game uses table fog). If by some off chance I see a need to use an nVidia card instead, I'll be sure to secure a GeForce3 or 4 Ti for my parts collection, just in case (old parts won't stay on the market forever, you know).

Reply 12 of 553, by dr.zeissler

Rank: l33t

It seems that a Voodoo1 in a faster system (>400 MHz) has problems with 8-bit paletted textures in Direct3D; in Glide they are still fine.
This can be checked with a Voodoo1 in Forsaken by changing from RGBA to "8-bit paletted". Mageslayer and Moto Racer have the same issue in D3D.

Retro-Gamer 😀 ...on different machines

Reply 13 of 553, by Scali

Rank: l33t

Here is an explanation of table fog, also known as pixel fog: https://docs.microsoft.com/en-us/windows/desk … p/direct3d9/fog
In short, it was a simple trick, relatively cheap to implement in fixed-function hardware: convert the depth of a pixel to a 'fog factor' and use a lookup table to blend the pixel colour toward the fog colour, creating an illusion of fog.
Since the fog implementation is driver-dependent, the results may vary from one video card to the next.

The feature has been superseded by pixel shaders, with which a detailed fog effect can be coded directly, in a driver-independent way.
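
For reference, enabling fixed-function table fog per those docs is just a handful of render states. A minimal sketch, assuming an already-created D3D9 device (the fog colour and range here are arbitrary):

#include <d3d9.h>

// Request per-pixel (table) fog from the fixed-function pipeline. On hardware
// without table fog, drivers fell back to vertex fog (D3DRS_FOGVERTEXMODE)
// or rendered no fog at all.
void EnableLinearTableFog(IDirect3DDevice9* device, float start, float end)
{
    device->SetRenderState(D3DRS_FOGENABLE, TRUE);
    device->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(200, 200, 200));
    device->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
    // Linear fog factor: f = (end - d) / (end - start), clamped to [0,1],
    // where d is the pixel depth; output = f*pixel_colour + (1-f)*fog_colour.
    device->SetRenderState(D3DRS_FOGSTART, *reinterpret_cast<DWORD*>(&start));
    device->SetRenderState(D3DRS_FOGEND,   *reinterpret_cast<DWORD*>(&end));
}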

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 14 of 553, by W Gruffydd

Rank: Newbie
firage wrote:

Even the contemporary NVIDIA gear predating the GeForce 256 ... only supported table fog through emulation.

Correct. The RIVA 128, RIVA 128ZX, and RIVA TNT used vertex fog to emulate table fog until the GeForce 256, which supported table fog in hardware. This Nvidia white paper will tell you more.

It would be hard to overstate the utility for retro gamers of a visual feature list for games, accompanied by a hardware list of visual capabilities for the applicable APIs.

My list of wanted hardware

Reply 15 of 553, by tpowell.ca

Rank: Member
W Gruffydd wrote:

It would be hard to overstate the utility for retro gamers of a visual feature list for games, accompanied by a hardware list of visual capabilities for the applicable APIs.

I agree.

So, if I understand correctly, Radeon cards are a no-go, as none in any generation supported these features?
As for nVidia, the best cards for legacy compatibility sound like the GeForce 256 through the GeForce 4 range.

Is this right?

Does the entire 3dfx lineup support these features natively?

  • Merlin: MS-4144, AMD5x86-160 32MB, 16GB CF, ZIP100, Orpheus, GUS, S3 VirgeGX 2MB
    Tesla: GA-6BXC, VIA C3 Ezra-T, 256MB, 120GB SATA, YMF744, GUSpnp, Quadro2
    Newton: K6XV3+/66, AMD K6-III+500, 256MB, 32GB SSD, AWE32, Voodoo3

Reply 16 of 553, by swaaye

Rank: l33t++

Early Radeons can do table fog with a registry tweak, though they didn't officially support it. Radeon R100-RV280 can definitely do it. No palettized texturing, though.

GeForce cards prior to the GF4 have ugly S3TC/DXT1 because they don't dither it. The GF4 Ti is great for compatibility with a lot of D3D and OpenGL games. The GeForce FX cards are nice too, except the 5700, which is too new to run the more compatible 45.23 driver.

Reply 17 of 553, by Ozzuneoj

Rank: l33t

So, if the FX series is the latest series to fully support these features, would the NV38GL-based Quadro FX 1300 be useful, since it is both very fast (for an FX card) and PCI-E based? It would make for an interesting card to drop into a PCI-E system for full compatibility with nearly everything made before 2004 or so. I know it uses a bridge chip, though, so that could cause issues with later boards.

How would driver support be for a card like that? How about with a modded .INF?

Now for some blitting from the back buffer.

Reply 18 of 553, by leileilol

Rank: l33t++
W Gruffydd wrote:

It would be hard to overstate the utility for retro gamers of a visual feature list for games, accompanied by a hardware list of visual capabilities for the applicable APIs.

Tests like:

- a single 16-gray-texel ramp scaled up and modulated, to expose filter precision (see the sketch after this list)
- a basic room with colored mip levels to show the standard LOD calculation and bias
- some heavily additive-blended overdraw of a dark texel to bring out the 16-bit color dithering matrix
- some 1x1, 2x2, 4x4, 8x8 and 16x16 textures for testing the minimum texture sizes allowed
- some rectangle texture tests to see if those are allowed
- texture clamping modes
- a reminder that no 3D card is perfect in any of these results and that they are purely subjective tastes
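
As a sketch of how the first test might be set up (illustrative OpenGL 1.x code, assuming a context already exists; the helper name is made up): upload a tiny grey ramp and draw it magnified with bilinear filtering, so any banding in the gradient exposes the filter's interpolation precision.

#include <GL/gl.h>

// Upload a 4x4 texture holding a 16-step grey ramp (one texel per grey level).
void uploadGreyRamp()
{
    unsigned char ramp[16];
    for (int i = 0; i < 16; ++i)
        ramp[i] = static_cast<unsigned char>(i * 17); // 0..255 in 16 steps

    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_LUMINANCE, 4, 4, 0,
                 GL_LUMINANCE, GL_UNSIGNED_BYTE, ramp);
}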

long live PCem

Reply 19 of 553, by vlask

Rank: Member

I remember X-Wing Alliance working fine up to my FX 5900 XT. Once I replaced it with a 6600 GT, I got scrambled, mostly unreadable text in missions. Later Nvidia cards had the same issue. Not sure if the source of the problem is the same...

Not only my graphics card collection, at http://www.vgamuseum.info