VOGONS


First post, by BEEN_Nath_58

Rank: l33t

So I was running a screen mode enumerator program and it wasn't listing any R5G6B5 formats. I figured I needed to enable the 16-bit color compatibility mode, but it did nothing. Reading the source, I saw that the program used Direct3D9Ex. The same program's D3D8 variant would enumerate R5G6B5 modes with the Windows compatibility mode enabled.

Is the color compatibility mode limited to plain D3D9, or just to D3D8? And does it work with things like old OpenGL (and is there anything else left, like GDI+)?
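For reference, the D3D8-style check boils down to something like this (a minimal sketch, not the actual vmodes8 code; it just counts the R5G6B5 modes that get enumerated):

// Minimal sketch of D3D8 display-mode enumeration (not the actual vmodes8 code).
// Build as a 32-bit EXE and link against d3d8.lib.
#include <d3d8.h>
#include <cstdio>

int main()
{
    IDirect3D8* d3d = Direct3DCreate8(D3D_SDK_VERSION);
    if (!d3d) return 1;

    // D3D8 enumerates all modes of the adapter; you filter on the Format field.
    UINT count = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT);
    for (UINT i = 0; i < count; i++)
    {
        D3DDISPLAYMODE mode;
        if (SUCCEEDED(d3d->EnumAdapterModes(D3DADAPTER_DEFAULT, i, &mode)) &&
            mode.Format == D3DFMT_R5G6B5)
        {
            printf("R5G6B5 %ux%u @ %u Hz\n", mode.Width, mode.Height, mode.RefreshRate);
        }
    }

    d3d->Release();
    return 0;
}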

previously known as Discrete_BOB_058

Reply 1 of 2, by Scali

Rank: l33t

Interesting question... I have found nothing online that documents what this setting is actually supposed to do, and what it is supposed to affect and how.
From what I understand, since Windows 8, display drivers no longer *have* 16-bit desktop modes. As such, you cannot enumerate them.
This would imply that you also cannot select a 16-bit videomode.
So what I suspect this setting does is report 16-bit modes to certain APIs; since those modes do not actually exist, it probably inserts a conversion step to 32-bit (or whatever your desktop actually uses) at some point during the rendering process.
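You can check the desktop-level part with a plain GDI enumeration; a minimal sketch, assuming the primary display device:

// Minimal sketch: list the distinct bit depths that the desktop-level (GDI)
// mode enumeration reports for the primary display.
#include <windows.h>
#include <cstdio>
#include <set>

int main()
{
    std::set<DWORD> depths;
    DEVMODE dm = {};
    dm.dmSize = sizeof(dm);
    for (DWORD i = 0; EnumDisplaySettings(NULL, i, &dm); i++)
        depths.insert(dm.dmBitsPerPel);

    for (DWORD d : depths)
        printf("%lu bpp\n", d);   // on Windows 8+ this typically only shows 32
    return 0;
}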

Which leaves the question: which API is enumerating what modes?

I did a quick test with Direct3D9 under Windows 11, and I found that I can only set the compatibility mode for reduced color on a 32-bit EXE, not on a 64-bit EXE.
But when I built my code for 32-bit, I could enumerate 16-bit modes even without specifically setting the option.
The same modes are not enumerated in the 64-bit build.

Repeating the same test under Direct3D9Ex was interesting: this time it does not enumerate any 16-bit modes, even though I used the exact same code, aside from replacing the Direct3DCreate9() call with a Direct3DCreate9Ex() call.
So apparently it depends on the API you use: D3D9 works, D3D9Ex does not (I suppose D3D9Ex has never supported 16-bit modes in the first place, and neither have DX10 and higher).
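
The test boils down to something like this (a minimal sketch, not the exact code I used):

// Minimal sketch of the D3D9 vs. D3D9Ex comparison (not the exact test code).
// Build as a 32-bit EXE and link against d3d9.lib.
#include <d3d9.h>
#include <cstdio>

static void CountR5G6B5Modes(IDirect3D9* d3d, const char* label)
{
    // D3D9 (unlike D3D8) lets you enumerate modes per format.
    UINT count = d3d->GetAdapterModeCount(D3DADAPTER_DEFAULT, D3DFMT_R5G6B5);
    printf("%s: %u R5G6B5 modes\n", label, count);
}

int main()
{
    // Plain D3D9: on a 32-bit build this reported 16-bit modes for me.
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (d3d)
    {
        CountR5G6B5Modes(d3d, "D3D9");
        d3d->Release();
    }

    // D3D9Ex: the same enumeration call, but on an interface created
    // with Direct3DCreate9Ex(). Here it reported no 16-bit modes at all.
    IDirect3D9Ex* d3dEx = NULL;
    if (SUCCEEDED(Direct3DCreate9Ex(D3D_SDK_VERSION, &d3dEx)))
    {
        CountR5G6B5Modes(d3dEx, "D3D9Ex");
        d3dEx->Release();
    }
    return 0;
}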

http://scalibq.wordpress.com/just-keeping-it- … ro-programming/

Reply 2 of 2, by BEEN_Nath_58

Rank: l33t
Scali wrote on 2023-06-27, 11:31:

Interesting question... I have found nothing online that documents what this setting is actually supposed to do, and what it is supposed to affect and how.
From what I understand, since Windows 8, display drivers no longer *have* 16-bit desktop modes. As such, you cannot enumerate them.
This would imply that you also cannot select a 16-bit videomode.
So what I suspect this setting does is report 16-bit modes to certain APIs; since those modes do not actually exist, it probably inserts a conversion step to 32-bit (or whatever your desktop actually uses) at some point during the rendering process.

Which leaves the question: which API is enumerating what modes?

I did a quick test with Direct3D9 under Windows 11, and I found that I can only set the compatibility mode for reduced color on a 32-bit EXE, not on a 64-bit EXE.
But when I built my code for 32-bit, I could enumerate 16-bit modes even without specifically setting the option.
The same modes are not enumerated in the 64-bit build.

Repeating the same test under Direct3D9Ex was interesting: this time it does not enumerate any 16-bit modes, even though I used the exact same code, aside from replacing the Direct3DCreate9() call with a Direct3DCreate9Ex() call.
So apparently it depends on the API you use: D3D9 works, D3D9Ex does not (I suppose D3D9Ex has never supported 16-bit modes in the first place, and neither have DX10 and higher).

Thanks for the tests. I performed my tests with vmodes8/9, which are present in the DxWnd source code.

Generally, DirectDraw applications would enumerate 8/16/32-bit modes on first run, but after that those modes would go missing (some shim removes them). The compatibility tab settings could bring them back, and so could DxWnd's basic Video tab color modes.
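
For anyone who wants to reproduce the DirectDraw side, the enumeration is roughly this (a minimal sketch; the interesting part is which bit depths actually show up):

// Minimal sketch of DirectDraw display-mode enumeration.
// Link against ddraw.lib and dxguid.lib.
#include <ddraw.h>
#include <cstdio>

static HRESULT WINAPI ModeCallback(LPDDSURFACEDESC2 desc, LPVOID /*context*/)
{
    printf("%lux%lu @ %lu bpp\n",
           desc->dwWidth, desc->dwHeight, desc->ddpfPixelFormat.dwRGBBitCount);
    return DDENUMRET_OK;   // keep enumerating
}

int main()
{
    IDirectDraw7* dd = NULL;
    if (FAILED(DirectDrawCreateEx(NULL, (void**)&dd, IID_IDirectDraw7, NULL)))
        return 1;

    // With no flags and no filter, this lists every supported display mode.
    dd->EnumDisplayModes(0, NULL, NULL, ModeCallback);
    dd->Release();
    return 0;
}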

Now for the D3D8 and D3D9 tests. As far as I could see, the vmodes9 D3D9 source code only used Direct3D9Ex; on running, it would refuse to list the RGB565 modes at all, even with the compatibility setting.

However, DxWnd itself could bring the 16-bit RGB565 modes back with its Direct3D/Enumerate 16-bit modes setting.

There are many possibilities here: did Windows not implement 16-bit modes for emulation (either directly or via the compatibility mode)? Or did DxWnd invent something new that, beyond D3D9Ex, would even bring back 16-bit color modes on D3D10 (I am not sure if it even supports RGB565)?
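
One way to check the D3D10+ side would be to ask DXGI how many B5G6R5 display modes it lists; a minimal sketch (I have not verified how this behaves with the compatibility setting):

// Minimal sketch: ask DXGI (the D3D10/11 path) whether it lists any
// 16-bit (B5G6R5) display modes for the primary output. Link against dxgi.lib.
#include <dxgi.h>
#include <cstdio>

int main()
{
    IDXGIFactory* factory = NULL;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = NULL;
    IDXGIOutput* output = NULL;
    UINT count = 0;
    if (SUCCEEDED(factory->EnumAdapters(0, &adapter)) &&
        SUCCEEDED(adapter->EnumOutputs(0, &output)))
    {
        // Passing NULL for the mode array just returns the count.
        output->GetDisplayModeList(DXGI_FORMAT_B5G6R5_UNORM, 0, &count, NULL);
        printf("B5G6R5 display modes reported by DXGI: %u\n", count);
        output->Release();
    }
    if (adapter) adapter->Release();
    factory->Release();
    return 0;
}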

There were R5G6B5 modes for D3D8 on first run, at least.

16-bit color is "no longer" converted to 32-bit color (A8R8G8B8?); it used to be converted back in 2021, when I talked about it with the dev of DDrawCompat, and even today, while playing Lego Island, I could see clear color banding everywhere. Back then DDrawCompat would do 16-bit color and native DirectDraw would do the 32-bit conversion, and now the exact opposite happens.

Update: I checked that vmodes9 d3d9ex application on Win7 and it lists 16-bit modes... So I guess MS removed 16-bit color for D3D9Ex on Win10/11?

previously known as Discrete_BOB_058