That makes no sense; scaling shouldn't have much to do with input. It does fix it, though 😜
If you know what's going on, it would be interesting to me.
Well, I don't have a clue... 😀
I just wanted to introduce the ability to choose all the scaling modes available in DXGI, bearing in mind that 'centered' and 'scaled' might drive the display driver wrong, and 'unspecified' might mean whatever default scaling mode is set in the control panel.
But, I found interesting info on scaling mode hell on an Intel forum:
Note that when DX10+ games perform mode set during full screen mode, the game itself actually explicitly specifies the scaling option as part of the mode set that it does...
DXGI_MODE_DESC structure (Windows)
this includes a DXGI_MODE_SCALING enum value... (options are "unspecified", "centered", "stretched" - no enum values to distinguish between stretched full screen vs stretched maintain aspect ratio...)...
Many DX10/11 games build their mode list by asking the graphics driver what modes the game supports (IDXGIOutput::GetDisplayModeList)... which returns an array of these DXGI_MODE_DESC data structures... this could wind up listing a particular resolution (say 1366x768) multiple times - with different refresh rates, color buffer formats, scaling options. The game then puts bits and pieces of that list into the game settings menu.... maybe only resolution gets displayed in a drop down. Typically scaling isn't listed at all. The user selects a resolution from that list and the game choose ONE of the matching DXGI_MODE_DESCs for that resolution to send to the set display mode call - who knows which (maybe the one with scaling "unspecified"... the scaling option specified in that option could be any of the above.. or the app constructs a new DXGI_MODE_DESC from scratch based on the user's resoluton selection - possibly leaving the scaling value as "unspecified"
What was happening in the case with DX10/11 games on Win8.1 was that that apps were sending "unspecified" scaling and the OS is converting that to "centered" during the mode set. On earlier OS (and on DX9), this didn't happen - "unspecified" is passed through to the display driver unmodified.
https://communities.intel.com/message/263334
So, if I get it right, it's a Windows problem and the only way to achieve 'stretched with aspect ratio' is forcing it through the driver control panel. 😵
And, additional info: 'unspecified' is always converted to 'stretched' for me on Win7, so it's not even a Win8-only issue. But as the cited post states, it works fine for pre-DX10 games.
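For anyone curious, here is a minimal sketch of the mechanism the quoted post describes (my own illustration, not dgVoodoo's or any game's actual code): enumerating modes with IDXGIOutput::GetDisplayModeList and building a DXGI_MODE_DESC whose Scaling member is left as 'unspecified'. Adapter/output index 0 and the R8G8B8A8 format are just assumptions for the example.

```cpp
// Sketch: how a DX10+ game typically builds its mode list and how the
// DXGI_MODE_SCALING value travels with each DXGI_MODE_DESC.
#include <dxgi.h>
#include <vector>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main()
{
    IDXGIFactory* factory = nullptr;
    if (FAILED(CreateDXGIFactory(__uuidof(IDXGIFactory), (void**)&factory)))
        return 1;

    IDXGIAdapter* adapter = nullptr;
    IDXGIOutput*  output  = nullptr;
    if (FAILED(factory->EnumAdapters(0, &adapter)) ||
        FAILED(adapter->EnumOutputs(0, &output)))
        return 1;

    // Ask the driver how many modes it exposes for this format, then fetch them.
    UINT count = 0;
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, nullptr);
    std::vector<DXGI_MODE_DESC> modes(count);
    output->GetDisplayModeList(DXGI_FORMAT_R8G8B8A8_UNORM, 0, &count, modes.data());

    // The same resolution can show up several times, differing only in refresh
    // rate, format, or the Scaling member (Unspecified / Centered / Stretched).
    for (const DXGI_MODE_DESC& m : modes)
    {
        std::printf("%ux%u @ %u/%u Hz, scaling=%d\n",
                    m.Width, m.Height,
                    m.RefreshRate.Numerator, m.RefreshRate.Denominator,
                    (int)m.Scaling);
    }

    // A game that builds its own DXGI_MODE_DESC from a menu selection often
    // leaves Scaling as DXGI_MODE_SCALING_UNSPECIFIED - exactly the value the
    // OS may rewrite before it reaches the display driver.
    DXGI_MODE_DESC requested = {};
    requested.Width   = 1366;
    requested.Height  = 768;
    requested.Format  = DXGI_FORMAT_R8G8B8A8_UNORM;
    requested.Scaling = DXGI_MODE_SCALING_UNSPECIFIED;
    // ...this desc would then typically go to IDXGISwapChain::ResizeTarget().

    output->Release();
    adapter->Release();
    factory->Release();
    return 0;
}
```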
Septerra Core and Shogo both fail with graphics glitches. Shogo might be interesting as there seem to be a few things wrong with it. Not important games, since they both run nicely on modern OSes (well, not out of the box).
I've already fixed Shogo and Blood2 except for 2 issues: a permanent crash when losing window focus and something wrong with the crosshair. The first one could only be solved by some low-level hacking in the wrapper, but once I've fixed those, I think I will release a new version to have an official one. (Also, Incoming is working fine now.)
I would like to fix others after that.
It's the decade-old 8-bit paletted texture problem 😁
Only Intel GPUs today still have support for that old standard.
I think it's not: the render target buffer itself is 16 bit, so using 8- or 32-bit textures does not make too much difference in that case (maybe it does if multitexturing is in use).
Also, paletted-texture support even on old hardware was scant: AFAIK ATI didn't support it at all, and nVidia ditched it with the end of the GeForce FX series (I don't know anything about Intel, though).
I remember, when I was developing dgVoodoo 1, my FX5700 supported paletted textures through DX7, but when I asked for the same through DX9 it lied that it was unsupported. 😀
It was like an attempt to push developers toward ditching paletted textures by force. It was the exact same situation when asking for W-buffering support: it worked through DX7 but not through DX9. ATI didn't support that at all either.
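Just to illustrate what I mean (a rough sketch only, not dgVoodoo code), this is more or less how you ask a D3D9 driver about both features; what it reports back is entirely up to the driver:

```cpp
// Sketch: querying D3D9 for 8-bit paletted texture (D3DFMT_P8) support and
// for W-buffering. On GeForce FX-era and later drivers, both typically come
// back negative through DX9 even where the DX7 path still worked.
#include <d3d9.h>
#include <cstdio>
#pragma comment(lib, "d3d9.lib")

int main()
{
    IDirect3D9* d3d = Direct3DCreate9(D3D_SDK_VERSION);
    if (!d3d)
        return 1;

    // Paletted texture support: is D3DFMT_P8 usable as a texture format?
    HRESULT hrP8 = d3d->CheckDeviceFormat(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                          D3DFMT_X8R8G8B8, 0,
                                          D3DRTYPE_TEXTURE, D3DFMT_P8);
    std::printf("P8 paletted textures: %s\n",
                SUCCEEDED(hrP8) ? "supported" : "unsupported");

    // W-buffer support is reported as a raster cap.
    D3DCAPS9 caps = {};
    if (SUCCEEDED(d3d->GetDeviceCaps(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, &caps)))
    {
        std::printf("W-buffering: %s\n",
                    (caps.RasterCaps & D3DPRASTERCAPS_WBUFFER) ? "supported"
                                                               : "unsupported");
    }

    d3d->Release();
    return 0;
}
```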
Resolution changes in Glide or DX don't take effect in-game (it stays at the native resolution)
It's unsupported for DX but should work for Glide. I may have broken something; I'll check it.
though it is pretty well known that DX10-generation hardware ditched 16bpp dithering.
Yes, exactly. I get an ugly color-banded appearance with native DX in games using 16-bit modes (like Shogo and Blood2).
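Just for reference, this is roughly what that lost feature did, sketched in plain C++ (my own illustration, nothing from dgVoodoo): an ordered dither applied while quantizing colors down to R5G6B5, which is what used to hide this kind of banding.

```cpp
// Sketch: ordered (Bayer) dithering while converting 8-bit-per-channel color
// to a 16bpp R5G6B5 value, so the quantization error becomes a fine pattern
// instead of visible bands.
#include <cstdint>

// Classic 4x4 Bayer threshold matrix, values 0..15.
static const int kBayer4x4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

// Convert a pixel at screen position (x, y) to R5G6B5, adding a per-pixel
// threshold before truncation.
static uint16_t DitherTo565(uint8_t r, uint8_t g, uint8_t b, int x, int y)
{
    const int t = kBayer4x4[y & 3][x & 3];       // 0..15

    auto quant = [t](int v, int bits) -> int {
        int step     = 256 >> bits;              // 8 for 5-bit, 4 for 6-bit channels
        int dithered = v + (t * step) / 16;      // bias within one quantization step
        if (dithered > 255) dithered = 255;
        return dithered >> (8 - bits);           // keep the top 'bits' bits
    };

    return (uint16_t)((quant(r, 5) << 11) | (quant(g, 6) << 5) | quant(b, 5));
}
```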