VOGONS


First post, by BEEN_Nath_58

User metadata
Rank l33t

With modern GPU drivers, most (if not all) games that rendered in 16-bit color now usually render in 32-bit color mode. My Nvidia driver went back to rendering 16-bit modes as actual 16-bit only in 2022, and has now reverted to 32-bit rendering again. This is not a problem; DxWnd directly provides the option, so I can switch between the two on the fly.

My question is whether this 32-bit color support was embedded in the applications and only used when the driver requested it, or whether modern drivers have a smart way of rendering such colors in 32-bit on their own. Here's a comparison in Half-Life:

previously known as Discrete_BOB_058

Reply 1 of 10, by Scali

User metadata
Rank l33t

I'm not entirely sure what you're asking.
But I can tell you this: it's not the driver that asks the application, it's the other way around.
The application can enumerate all video modes and formats that are supported by the driver, and then pick the best one.
Many games could run in either 16-bit or 32-bit mode.
The 16-bit support in drivers these days is only for software that will only pick 16-bit modes, and ignores 32-bit or other modes.
In the early days you'd want 16-bit textures and framebuffers because it required less memory and bandwidth. But on modern hardware, there's no reason not to run in 32-bit mode if the game supports it.
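
To illustrate the enumeration side (just a minimal sketch against the DirectDraw 7 API, not code from any particular game; setup of the IDirectDraw7 object and error handling are left out):

#include <windows.h>
#include <stdio.h>
#include <ddraw.h>

/* Called once per display mode the driver reports. */
static HRESULT WINAPI ModeCallback(LPDDSURFACEDESC2 desc, LPVOID ctx)
{
    DWORD bpp = desc->ddpfPixelFormat.dwRGBBitCount;
    if (bpp == 16 || bpp == 32)
        printf("%lux%lu @ %lu bpp\n", desc->dwWidth, desc->dwHeight, bpp);
    return DDENUMRET_OK;  /* keep enumerating */
}

void ListModes(IDirectDraw7 *dd)
{
    /* The driver reports everything it supports; the game then picks
       the best mode it knows how to render to. */
    IDirectDraw7_EnumDisplayModes(dd, 0, NULL, NULL, ModeCallback);
}

Direct3D 9 works the same way in spirit: the game walks the adapter's mode list and back buffer formats and chooses one; the driver never decides for it.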

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 2 of 10, by Joseph_Joestar

User metadata
Rank l33t++

There were some Win9x games like Thief 1 & 2 which only supported 16-bit color rendering. To clarify, I'm talking about the retail CD versions of those games with only the official patches applied. Some of the fan-made fixes created over the years (e.g. T2Fix) now offer 32-bit color rendering as well, but the stock games didn't have that option.

System Shock 2 and Kiss: Psycho Circus also only used 16-bit colors. There are other games with similar behavior, mostly early 3D accelerated titles that were made before 32-bit color rendering became the norm. Also, Glide-only games were always rendered in 16-bit color because 3DFX cards didn't support 32-bit color rendering until the Voodoo 4.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 3 of 10, by wierd_w

User metadata
Rank Oldbie

The NewDark engine does more than 32-bit color for those games.

However, you might be able to force it with dgVoodoo2. It does DirectX wrapping also.

Reply 4 of 10, by BEEN_Nath_58

User metadata
Rank l33t
Scali wrote on 2023-08-08, 17:39:

But I can tell you this: it's not the driver that asks the application, it's the other way around.
The application can enumerate all video modes and formats that are supported by the driver, and then pick the best one.
Many games could run in either 16-bit or 32-bit mode.

Many wrappers, viz. DDrawCompat and dgVoodoo2, apparently force 32-bit rendering on applications. That's what is causing my doubt: how is that implemented?

Joseph_Joestar wrote on 2023-08-08, 17:48:

There were some Win9x games like Thief 1 & 2 which only supported 16-bit color rendering.

I checked Thief Gold on Win11 and I can verify it only does 16-bit color rendering.

Do you have any idea when games started using 32-bit rendering by default (Half-Life, for example)? And was there a way back then to choose between 16-bit and 32-bit modes myself (driver related?)?

wierd_w wrote on 2023-08-09, 04:30:

The NewDark engine does more than 32-bit color for those games.

However, you might be able to force it with dgVoodoo2. It does DirectX wrapping also.

What do you mean by more than 32-bit color...?

previously known as Discrete_BOB_058

Reply 5 of 10, by wierd_w

User metadata
Rank Oldbie

I mean that the old engine uses DX7 (maybe even DX5? The -lgntforce switch on the installer will force an install on NT4, and that tops out at DX5) API interfaces, while NewDark uses DX9 interfaces.

It does 32-bit color modes, yes, but the 3D performance is way better too.

Reply 6 of 10, by Joseph_Joestar

User metadata
Rank l33t++
BEEN_Nath_58 wrote on 2023-08-09, 06:06:

I checked Thief Gold on Win11 and I can verify it only does 16-bit color rendering.

Do you have any idea when games started using 32-bit rendering by default (Half-Life, for example)?

It wasn't exactly clear cut. Some games from 1997 like Tomb Raider 2 offered 32-bit color rendering, while other games released in 2000 like Thief 2 only supported 16-bit color rendering.

Historically, most graphics cards made before 1999 didn't have the power to deliver 32-bit color rendering in 3D accelerated games at an acceptable frame rate, so I'd say that year can serve as a rough guideline. But again, it varies on a per-game basis.

And was there a way back then to choose between 16-bit and 32-bit modes myself (driver related?)?

Normally, you would choose the resolution and color depth from within the game, or sometimes from its external configuration program. However, in some rare cases, a game would render at whatever color depth your desktop was set to. This sometimes applied to games which used the Quake engine. Lastly, older Nvidia drivers had an option to always use 32-bit color depth for OpenGL rendering, but I don't think such a setting ever existed for Direct3D games.
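
For the "whatever your desktop is set to" case, the check is trivial; something along these lines (a generic Win32 sketch, not taken from any specific engine):

#include <windows.h>

int DesktopBitsPerPixel(void)
{
    HDC screen = GetDC(NULL);                    /* DC for the whole screen */
    int bpp = GetDeviceCaps(screen, BITSPIXEL);  /* 8, 16, 24 or 32 */
    ReleaseDC(NULL, screen);
    return bpp;
}

/* The game then creates its back buffer in the matching format,
   e.g. a 16-bit format when the desktop is 16-bit. */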

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 7 of 10, by Scali

User metadata
Rank l33t
BEEN_Nath_58 wrote on 2023-08-09, 06:06:

Many wrappers, viz. DDrawCompat and dgVoodoo2, apparently force 32-bit rendering on applications. That's what is causing my doubt: how is that implemented?

I think it's a simple translation layer: the application requests 16-bit buffers, but the translation layer converts that to 32-bit before passing on to the driver.
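
Conceptually the wrapper keeps a 32-bit surface behind the game's back and expands the 16-bit pixels the game wrote before handing them to the driver. A rough sketch of just that pixel expansion (generic C, not actual DDrawCompat or dgVoodoo2 code):

#include <stdint.h>
#include <stddef.h>

/* Expand one row of R5G6B5 pixels into X8R8G8B8. */
void ExpandRow565To8888(const uint16_t *src, uint32_t *dst, size_t count)
{
    for (size_t i = 0; i < count; i++) {
        uint16_t p = src[i];
        uint32_t r = (p >> 11) & 0x1F;   /* 5 bits of red   */
        uint32_t g = (p >> 5)  & 0x3F;   /* 6 bits of green */
        uint32_t b =  p        & 0x1F;   /* 5 bits of blue  */
        /* Replicate the top bits into the bottom bits so full
           intensity maps to 255, not 248. */
        r = (r << 3) | (r >> 2);
        g = (g << 2) | (g >> 4);
        b = (b << 3) | (b >> 2);
        dst[i] = (r << 16) | (g << 8) | b;
    }
}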

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 8 of 10, by BEEN_Nath_58

User metadata
Rank l33t
Joseph_Joestar wrote on 2023-08-09, 06:31:

It wasn't exactly clear cut. Some games from 1997 like Tomb Raider 2 offered 32-bit color rendering, while other games released in 2000 like Thief 2 only supported 16-bit color rendering.

Historically, most graphics cards made before 1999 didn't have the power to deliver 32-bit color rendering in 3D accelerated games at an acceptable frame rate, so I'd say that year can serve as a rough guideline. But again, it varies on a per-game basis.

But they put the option inside the game, so who was it for, future generations (TR2)? 💁

Joseph_Joestar wrote on 2023-08-09, 06:31:

Normally, you would choose the resolution and color depth from within the game, or sometimes from its external configuration program. However, in some rare cases, a game would render at whatever color depth your desktop was set to. This sometimes applied to games which used the Quake engine. Lastly, older Nvidia drivers had an option to always use 32-bit color depth for OpenGL rendering, but I don't think such a setting ever existed for Direct3D games.

Hmm, my main concern was Direct3D only, since I never saw how that was determined. These days OpenGL doesn't even ask, it just does 32-bit rendering directly, but Direct3D is still inconsistent to this day.

Scali wrote on 2023-08-09, 08:25:

I think it's a simple translation layer: the application requests 16-bit buffers, but the translation layer converts that to 32-bit before passing on to the driver.

Is it always colour conversion? Games like Premiere Manager 97 render with wrong colors, in the same way Aero affects DDraw games. It was fixed by feeding the game a NUMCOLORS=20 value, so the driver was attempting something incorrect in the translation there...
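
From what I understand, feeding NUMCOLORS=20 just means the wrapper intercepts GetDeviceCaps() and lies to the game: on a true-colour desktop the real call returns -1 for NUMCOLORS, which palette-era code never expected, and answering 20 (the GDI static colours of an 8-bit desktop) keeps its colour setup on the old path. A rough sketch of the idea (the hooking itself, e.g. IAT patching, is left out, and DxWnd's real implementation will differ):

#include <windows.h>

static int (WINAPI *Real_GetDeviceCaps)(HDC, int) = GetDeviceCaps;

int WINAPI Hook_GetDeviceCaps(HDC hdc, int index)
{
    if (index == NUMCOLORS)
        return 20;                        /* pretend to be a palettized desktop */
    return Real_GetDeviceCaps(hdc, index);
}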

previously known as Discrete_BOB_058

Reply 9 of 10, by Joseph_Joestar

User metadata
Rank l33t++
BEEN_Nath_58 wrote on 2023-08-09, 15:08:

But they put the option inside the game, so who was it for, future generations (TR2)? 💁

The ATi Rage Pro card, which came out in late '97, could render in 24-bit color (maybe also 32-bit, not sure) and the developers of Tomb Raider 2 may have been aware of that. The original Nvidia TNT was also just around the corner (mid 1998) and that card could definitely render in 32-bit color depth, so it may have been a target too.

But yeah, the developers were likely just future-proofing the game. And it was great that they went the extra mile, because that allowed people to play the retail version of Tomb Raider 2 at 1600x1200 with 32-bit colors using hardware that was released only a couple of years later.

PC#1: Pentium MMX 166 / Soyo SY-5BT / S3 Trio64V+ / Voodoo1 / YMF719 / AWE64 Gold / SC-155
PC#2: AthlonXP 2100+ / ECS K7VTA3 / Voodoo3 / Audigy2 / Vortex2
PC#3: Core 2 Duo E8600 / Foxconn P35AX-S / X800 / Audigy2 ZS
PC#4: i5-3570K / MSI Z77A-G43 / GTX 980Ti / X-Fi Titanium

Reply 10 of 10, by BEEN_Nath_58

User metadata
Rank l33t
Joseph_Joestar wrote on 2023-08-09, 15:47:
BEEN_Nath_58 wrote on 2023-08-09, 15:08:

But they put the option inside the game, so who was it for, future generations (TR2)? 💁

The ATi Rage Pro card, which came out in late '97, could render in 24-bit color (maybe also 32-bit, not sure) and the developers of Tomb Raider 2 may have been aware of that. The original Nvidia TNT was also just around the corner (mid 1998) and that card could definitely render in 32-bit color depth, so it may have been a target too.

But yeah, the developers were likely just future-proofing the game. And it was great that they went the extra mile, because that allowed people to play the retail version of Tomb Raider 2 at 1600x1200 with 32-bit colors using hardware that was released only a couple of years later.

Okay, so I figured out that some of the color upgrading in newer Windows is done by the DWM8AND16BITMITIGATION shim (verified by Narzoul), which explains my strange discovery of a game rendering in 32-bit color while never requesting it. And then there's the other set of games which had a hidden higher color mode available, such as HL2 or TR2.
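
If anyone wants to try that shim on a single game by hand, the usual route is the per-user AppCompatFlags\Layers registry value; whether the DWM8AND16BITMITIGATION name is actually honoured from there (rather than only through an SDB installed with the Compatibility Administrator) is an assumption on my part, so verify it before relying on it:

#include <windows.h>
#include <string.h>

/* Hypothetical helper: tag one exe with the mitigation. The layer name
   and whether the Layers value accepts it are assumptions to check. */
void TagExeWithMitigation(const char *exePath)
{
    HKEY key;
    const char *layers = "~ DWM8AND16BITMITIGATION";
    if (RegCreateKeyExA(HKEY_CURRENT_USER,
            "Software\\Microsoft\\Windows NT\\CurrentVersion\\AppCompatFlags\\Layers",
            0, NULL, 0, KEY_SET_VALUE, NULL, &key, NULL) == ERROR_SUCCESS) {
        RegSetValueExA(key, exePath, 0, REG_SZ,
                       (const BYTE *)layers, (DWORD)strlen(layers) + 1);
        RegCloseKey(key);
    }
}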

previously known as Discrete_BOB_058