VOGONS


Table Fog & 8-bit Paletted Textures


Reply 300 of 553, by Joseph_Joestar

Rank: l33t
BEEN_Nath_58 wrote on 2022-09-05, 18:38:

Can you try this game with a Table Fog compatible chip and see how the game reacts to distance (and add relevant screenshots of proper visible fog, which I didn't get with vertex fog)?

Not sure if this was addressed at me or others in general, but I don't own that game and I'm not familiar with it. That said, if you (or anyone else) wants to test table fog and paletted texture support in certain games, all that's needed are two cheap and readily available graphics cards.

A GeForce 2 MX400 fully supports both table fog and paletted textures, and can use early Nvidia drivers like 7.76, which makes it very compatible with older games. The other card would be an ATi Radeon 9250, which supports neither paletted textures nor table fog. Simply run the game that you want to test on both cards, compare the results, and document your findings with screenshots.

I can add any newly tested games to the Vogons wiki, provided that the tests are conducted on real hardware (no emulation or wrappers) and that proper comparison screenshots are presented.
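
If anyone wants to sanity-check a card programmatically before firing up the games, both capabilities can be probed through the Direct3D caps. A minimal sketch against the Direct3D 9 API (note that what a modern driver reports here won't always match what the DirectX 6/7 runtimes exposed on period hardware, so treat it as a rough check):

// Rough sketch: probing table fog and paletted texture support via D3D9.
// Build with the DirectX SDK and link against d3d9.lib.
#include <d3d9.h>
#include <stdio.h>

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // Table (per-pixel) fog is advertised through the raster caps.
    D3DCAPS9 caps;
    d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps);
    printf("Table fog:         %s\n",
           (caps.RasterCaps & D3DPRASTERCAPS_FOGTABLE) ? "yes" : "no");

    // Paletted textures show up as support for the 8-bit P8 texture format.
    HRESULT hr = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                        D3DFMT_X8R8G8B8, 0,
                                        D3DRTYPE_TEXTURE, D3DFMT_P8);
    printf("Paletted textures: %s\n", SUCCEEDED(hr) ? "yes" : "no");

    d3d->Release();
    return 0;
}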

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 301 of 553, by BEEN_Nath_58

Rank: l33t
Joseph_Joestar wrote on 2022-09-06, 03:38:
BEEN_Nath_58 wrote on 2022-09-05, 18:38:

Can you try this game with a Table Fog compatible chip and see how the game reacts to distance (and add relevant screenshots of proper visible fog, which I didn't get with vertex fog)?

Not sure if this was addressed at me or others in general, but I don't own that game and I'm not familiar with it. That said, if you (or anyone else) wants to test table fog and paletted texture support in certain games, all that's needed are two cheap and readily available graphics cards.

A GeForce 2 MX400 fully supports both table fog and paletted textures, and can use early Nvidia drivers like 7.76, which makes it very compatible with older games. The other card would be an ATi Radeon 9250, which supports neither paletted textures nor table fog. Simply run the game that you want to test on both cards, compare the results, and document your findings with screenshots.

I can add any newly tested games to the Vogons wiki, provided that the tests are conducted on real hardware (no emulation or wrappers) and that proper comparison screenshots are presented.

I would like to, but it's not easy to get retro hardware in India. Back then, products had to be imported rather than being released widely in the country, and the VAT on imports is huge. Not to mention I would need to build an entire system, so just to check game visuals, it doesn't seem economical.

The game is very easy to play as well, and you can see the fog effects in the first level itself.

With that said, I sent you a PM.

previously known as Discrete_BOB_058

Reply 303 of 553, by Kahenraz

Rank: l33t

Although it can't use older drivers the way the GeForce 2 can, the GeForce 4 MX series is based on the same architecture and is an excellent choice for DirectX 7 games. These cards are also very cheap and ubiquitous on the used market.

Reply 304 of 553, by BEEN_Nath_58

Rank: l33t

I had a go with my Intel GPU and Thief 2 in Mission 1, to check if the stars render (8-bit paletted textures). The results were surprising.

By default, every graphics setting is set to the highest possible and the game looks like this:
[attached screenshot]

After setting sky detail to Low, it looks like this:
[attached screenshot]

After setting sky detail back to High, it looks like this:
[attached screenshot]

Is this some sort of game bug or driver bug, or is my Intel driver doing some magic? Another game, Forsaken, also uses 8-bit paletted textures, and it displays incorrectly.


previously known as Discrete_BOB_058

Reply 305 of 553, by Joseph_Joestar

Rank: l33t
BEEN_Nath_58 wrote on 2022-09-09, 07:22:

I had a go with my Intel GPU and Thief 2 in Mission 1, to check if the stars render (8-bit paletted textures).

The Thief 2 stars are not paletted textures. That was our initial assumption, but it was proven wrong through further testing and by cross-referencing the findings with old posts on the TTLG forums. The missing stars are a separate, unique issue which seems to occur primarily on Nvidia cards from the GeForce onward. The stars render correctly on most other cards (ATi, Matrox, 3DFX, S3 Savage) and even on Nvidia's own TNT2.

Currently, the best test for paletted texture support is Final Fantasy 8.
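
On the OpenGL side, the equivalent capability is the GL_EXT_paletted_texture extension, so a quick check of the extension string tells you whether a driver exposes it at all. A minimal sketch, assuming a GL context is already created and current:

// Sketch: check for paletted texture support under OpenGL.
#include <GL/gl.h>
#include <string.h>

int has_paletted_textures(void)
{
    const char* ext = (const char*)glGetString(GL_EXTENSIONS);
    return ext != NULL && strstr(ext, "GL_EXT_paletted_texture") != NULL;
}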

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 306 of 553, by Am386DX-40

Rank: Member

What an interesting thread. Lots of information and an awesome number of cards/chips tested! Can anyone test an i740? It's supposed to be super compatible with everything, even if the performance is not quite up to par.

Reply 307 of 553, by Kahenraz

Rank: l33t

The issue with the stars on NVIDIA cards may be mipmap related. See here for how I tweaked a few settings to fix the HUD in Incoming, a DirectX 5 game.

Mipmap settings that fix Incoming (DirectX 5) on the GeForce FX

Do the stars fail to render properly on a TNT2 or only GeForce and later? It's worth taking a look at.
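
For anyone wanting to experiment along those lines, the knobs in question correspond to the texture filtering states. A hypothetical illustration in Direct3D 9 terms (Incoming itself is DirectX 5, where the same controls live in the texture stage states, and the linked fix was done through driver settings rather than code):

// Sketch: forcing mipmapping and filtering off for texture stage 0,
// the API-level equivalent of the driver mipmap tweaks discussed above.
#include <d3d9.h>

void disable_mipmaps(IDirect3DDevice9* device)
{
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_NONE);  // no mipmaps
    device->SetSamplerState(0, D3DSAMP_MINFILTER, D3DTEXF_POINT); // no filtering
    device->SetSamplerState(0, D3DSAMP_MAGFILTER, D3DTEXF_POINT);
}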

Reply 308 of 553, by Joseph_Joestar

Rank: l33t
Kahenraz wrote on 2022-10-14, 23:56:

The issue with the stars on NVIDIA cards may be mipmap related. See here for how I tweaked a few settings to fix the HUD in Incoming, a DirectX 5 game.

Mipmap settings that fix Incoming (DirectX 5) on the GeForce FX

Do the stars fail to render properly on a TNT2 or only GeForce and later? It's worth taking a look at.

The stars render correctly on the TNT2. They are missing on all GeForce cards, from the original one onward.

Interesting find about the mipmaps. Might be worth checking if this fixes the Thief 2 issue as well.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 309 of 553, by Kahenraz

Rank: l33t

That's the same generation in which all of the textures broke in Incoming. If we're lucky, this may actually fix it.

I haven't ever tried Thief. What do I need to do to get to this magical "star" scene to try it out?

Reply 310 of 553, by Joseph_Joestar

Rank: l33t
Kahenraz wrote on 2022-10-15, 02:28:

I haven't ever tried Thief. What do I need to do to get to this magical "star" scene to try it out?

You need the retail version of Thief 2 + patch 1.18. The GOG release is unsuitable since it comes with a fan-made modification pre-applied.

Additional instructions and save games are available here.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 312 of 553, by Joseph_Joestar

Rank: l33t
Kahenraz wrote on 2022-10-15, 18:54:

Spotted this in the Permedia 2 driver panel.

Already documented a few months ago.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 313 of 553, by Kahenraz

Rank: l33t
Joseph_Joestar wrote on 2022-10-15, 02:12:

The stars render correctly on the TNT2. They are missing on all GeForce cards, from the original one onward.

Interesting find about the mipmaps. Might be worth checking if this fixes the Thief 2 issue as well.

I tried this with various mipmap options, as well as moving the texel position around, but wasn't able to make it work.

It's possible that modifying the original texture could provide another means for a workaround. I noticed that an editor was included on the install CD.

I think the reason this is so variable across hardware is that it's an alpha-blended mipmap that is also 16-bit dithered. Each of those things varies between manufacturers, and this combination of several very complicated functions appeared only briefly before the focus shifted to 32-bit. Any 16-bit validation was probably an afterthought by that point.
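
Spelled out in Direct3D 9 terms, the combination being described looks something like this (a sketch for illustration; Thief 2 itself predates D3D9, but the equivalent states exist in the older runtimes too). Each of these stages is implemented differently by each vendor, which is why the results diverge:

// Sketch: the render state combination under discussion - an alpha-blended,
// mipmapped texture drawn into a dithered 16-bit framebuffer.
#include <d3d9.h>

void set_problem_states(IDirect3DDevice9* device)
{
    device->SetRenderState(D3DRS_ALPHABLENDENABLE, TRUE);           // alpha blending
    device->SetRenderState(D3DRS_SRCBLEND, D3DBLEND_SRCALPHA);
    device->SetRenderState(D3DRS_DESTBLEND, D3DBLEND_INVSRCALPHA);
    device->SetRenderState(D3DRS_DITHERENABLE, TRUE);               // 16-bit dithering
    device->SetSamplerState(0, D3DSAMP_MIPFILTER, D3DTEXF_LINEAR);  // mipmapping
}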

Reply 314 of 553, by Joseph_Joestar

Rank: l33t
Kahenraz wrote on 2022-10-20, 10:49:

I think the reason this is so variable across hardware is that it's an alpha-blended mipmap that is also 16-bit dithered. Each of those things varies between manufacturers, and this combination of several very complicated functions appeared only briefly before the focus shifted to 32-bit. Any 16-bit validation was probably an afterthought by that point.

Interesting findings.

BTW, I haven't noticed this particular issue on non-Nvidia cards, and it seems to mostly affect their GeForce line. Cards from ATi, 3DFX, Matrox and S3 Savage all render the stars correctly. For more modern hardware, there is already a fan-made patch which fixes this, along with some other bugs.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Athlon64 3400+ / Asus K8V-MX / 5900XT / Audigy2
PC#4: i5-3570K / MSI Z77A-G43 / GTX 970 / X-Fi

Reply 315 of 553, by Kahenraz

Rank: l33t

Everyone had their own rules for optimization and their own methods of dithering and filtering. 3DFX was excellent at anything 16-bit, as that was a major point of their hardware and marketing, but lacked 32-bit until the Voodoo 4. ATi is missing fog in many games and has great filtering, but *terrible* dithering. Matrox is fine for 3D, as long as it's the G400 or G450. The Savage is also fine. But I don't see a good reason to choose either of these over a TNT2.

There are good reasons to choose any of these, depending on the game. Unfortunately, none of them cover all of the bases. Other than a Voodoo, I still think that the GeForce FX is the best choice for most games of this era on Windows 98.

Reply 316 of 553, by dr.zeissler

Rank: l33t

Intel's i740 successors, like the i815e and i845, are excellent for retro gaming. Quality-wise they are on par with Matrox, but they are BETTER!

i815e = OS/2 3/4, Win3x, Win9x (D3D/OGL), 2K/XP (D3D/OGL), Linux (OGL) (table fog and 8-bit palettes should be supported on i815e)
i845 = OS/2 4, Win9x (D3D/OGL) incl. S3TC, 2K/XP (D3D/OGL) incl. S3TC, Linux (OGL), Amithlon
i865 = Win9x (D3D/OGL) incl. S3TC, 2K/XP (D3D/OGL) incl. S3TC, Linux (OGL), Amithlon

Retro-Gamer 😀 ...on different machines

Reply 317 of 553, by Kahenraz

Rank: l33t

I was curious enough to test this out in Thief 2 with an Intel Extreme 2 (Intel 845G) on Windows 98. I see stars, but there is no fog. Mipmap filtering also makes the interface of Incoming (DirectX 5) unreadable unless all filtering and mipmapping are disabled.

There aren't any configurable options for DirectX in the Intel control panel, and I couldn't find any tools to modify any of the settings.

Last edited by Kahenraz on 2022-10-21, 06:16. Edited 1 time in total.

Reply 318 of 553, by leileilol

Rank: l33t++

Look at my Kyro2 Thief shot earlier in this thread if you want a good 'reference', as that's a 32-bit image getting 4x4 ordered dithered down to 16-bit for the buffer, with no compromises at the texture units, blending or anywhere else. The filtering can get a little blocky though, and GeForce suffers a similar anomaly.

(Kyro2, of course, has a lot of its own other weaknesses regarding palettes, and depth buffers. ESPECIALLY depth buffers)
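
For anyone unfamiliar with the term, 4x4 ordered dithering quantizes each pixel against a fixed Bayer threshold matrix. A minimal sketch of one 8-bit channel being dithered down to 5 bits, which is the level reduction involved when a 32-bit image lands in a 16-bit (555/565) buffer:

// Sketch: 4x4 ordered (Bayer) dithering of an 8-bit channel down to 5 bits.
#include <stdint.h>

static const int bayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

uint8_t dither_to_5bit(uint8_t value, int x, int y)
{
    // Spread the 16 threshold levels across one quantization step (8).
    int threshold = bayer4[y & 3][x & 3] / 2;
    int q = (value + threshold) >> 3;   // 256 levels -> 32 levels
    return (uint8_t)(q > 31 ? 31 : q);  // clamp the top step
}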

long live PCem

Reply 319 of 553, by Kahenraz

Rank: l33t

I haven't tried the Kyro in Thief, but I was not impressed with how my 4500 rendered the interface in Incoming. This game always seems to be a very odd corner case for a lot of graphics cards, probably because it's an older DirectX 5 title that didn't get any regression testing once DirectX 6 came out, and was long forgotten by version 7.

For Intel, it seems that the early integrated graphics hardware was capable of table fog, but only with later driver versions. For my i845G, the latest driver version for Windows 9x is 4.14, and Intel says that this feature was not available until 6.4.

Pixel fog (fog table) support requires version 6.4 or later of the graphics drivers for the Intel® 810 and 815 Chipset families. Linear fog is supported, but not pixel fog (fog table).

https://www.intel.com/content/www/us/en/suppo … 8/graphics.html
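
The distinction Intel is drawing maps onto two separate Direct3D render states, so a driver really can support one without the other. A sketch of both paths in Direct3D 9 terms (the fog distances used here are arbitrary example values):

// Sketch: vertex (linear) fog vs. per-pixel table fog in Direct3D 9.
// A driver can expose vertex fog while leaving table fog unsupported.
#include <d3d9.h>

void enable_fog(IDirect3DDevice9* device, bool use_table_fog)
{
    float start = 50.0f, end = 200.0f;
    device->SetRenderState(D3DRS_FOGENABLE, TRUE);
    device->SetRenderState(D3DRS_FOGCOLOR, D3DCOLOR_XRGB(128, 128, 128));
    device->SetRenderState(D3DRS_FOGSTART, *(DWORD*)&start);  // floats are
    device->SetRenderState(D3DRS_FOGEND,   *(DWORD*)&end);    // passed as DWORDs

    if (use_table_fog)  // per-pixel fog, computed by the rasterizer
        device->SetRenderState(D3DRS_FOGTABLEMODE, D3DFOG_LINEAR);
    else                // per-vertex fog, interpolated across the triangle
        device->SetRenderState(D3DRS_FOGVERTEXMODE, D3DFOG_LINEAR);
}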
