Reply 20 of 22, by spiroyster
wrote:If it was there, then wasn't, then was again
Once again - you can't get dithering on Fermi, Kepler and Maxwell GPUs with fresh drivers. Period. So it seems like anything after G80 and before the Pascal series just can't render 16-bit modes.
Also, you still get dithering on any card before G80 with a driver that supports it.
Doesn't surprise me o.0
wrote:I can't imagine the rationale behind re-implementing this feature in these kinds of modern architectures.
They can render 16-bit modes just fine. It's not the input (16-bit colours specified by the program) that is important; it's the colour depth/buffer size of the display device (gfx card) and the display's capabilities that matter, and those will probably be fixed precision (but not fixed at 16-bit). Older cards with 16-bit buffers driving CRT screens (which can display many more colours) had to do something to mitigate the inevitable banding, so they implemented hardware dithering (which, while related to 16-bit, is not a by-product of rendering 16-bit).
A developer could still turn off dithering and perform the dithering on the buffer via the CPU as they desired, but why bother if accelerated hardware dithering is available to you... or the hardware ignores your request to turn off dithering (not so much conforming to the standards back then)... or you decide that banding is the way (cel shading used it to good effect, through fragment shaders though o.0) and turn off dithering and degrade everything to the 16-bit gamut, maybe even 8-bit posterized! Textures themselves can also be dithered in advance by an artist, as well as dithering happening as part of the graphics pipe.
Personally, it's not something that I have had to worry about other than... "ah bollocks, I can't seem to create a 24-bit context, back to the white book (would be red book these days)."
Different types of dithering:
http://www.tannerhelland.com/4660/dithering-e … ms-source-code/
Agree with you though; it would appear they have removed 'default' dithering from later cards/drivers. Yes, this is a problem when playing old games, since the default position of the display driver is now to not dither an output that the original developers perhaps anticipated would be dithered without them doing anything themselves. At the same time, given the 'general processing' capabilities, there would be little reason to implement dedicated dithering in the hardware; rather, use the massive capabilities of the GPU to do it (some dithering techniques are embarrassingly parallel, since all that is needed is the fragment colour and the x,y location in the frame).
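To make the "embarrassingly parallel" point concrete, here is a minimal sketch of classic ordered (Bayer) dithering in Python. The 4x4 matrix and the 5-bit target channel are my illustrative choices, not anything specific to a real driver or GPU; each pixel needs only its own colour and its (x, y) position, so every pixel could be processed independently in a fragment shader.

```python
# Sketch: 4x4 ordered (Bayer) dithering when quantising an 8-bit
# channel down to 5 bits (one channel of a 16-bit 565 format).
# Illustrative only -- real hardware dithering varied by vendor.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value_8bit, x, y, bits=5):
    """Quantise an 8-bit channel to `bits` bits, using the pixel's
    (x, y) position to pick a per-pixel threshold offset."""
    levels = (1 << bits) - 1                            # e.g. 31 for 5 bits
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0  # 0..1
    scaled = value_8bit / 255.0 * levels
    return min(levels, int(scaled + threshold))

# A mid-grey row alternates between adjacent levels instead of banding:
row = [dither_channel(128, x, 0) for x in range(8)]
```

Neighbouring pixels of the same input colour land on different quantised levels, which is exactly the checkerboard-ish pattern visible in old 16-bit screenshots.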
@swaaye
</pure speculation>
From the driver's point of view, the simplest thing to do is simply accept the colour values being pushed through the API as 16-bit colours, convert each channel to 8-bit (or the card/GPU's native precision), and then simply raster with the full 24-bit gamut that would no doubt be supported (if not native) by the frame buffer. This would present a 24-bit frame; it's just that the colours, when first defined, were limited to the 16-bit gamut. However, given the now larger range of colours, textures would still be visibly lower quality, while geometry and lighting calculations would be higher precision, giving a rather strange result, and god knows how any AA then applied would look o.0. The win of using a 24-bit gamut in this case is lost since, while texture filtering may eradicate some banding, the lower 16-bit precision of the texture gamut would still be visible, and more noticeable when certain conditions are met.
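The per-channel conversion in this first option can be sketched as follows. This assumes the common RGB565 packing and the usual bit-replication trick for expanding to 8 bits per channel; what drivers actually do is pure speculation, as above.

```python
# Sketch: expand a packed RGB565 pixel to 8 bits per channel by
# replicating the top bits into the low bits, so that full-scale
# 5-bit (31) and 6-bit (63) values map to 255 rather than 248/252.

def rgb565_to_rgb888(pixel):
    r5 = (pixel >> 11) & 0x1F
    g6 = (pixel >> 5) & 0x3F
    b5 = pixel & 0x1F
    r8 = (r5 << 3) | (r5 >> 2)
    g8 = (g6 << 2) | (g6 >> 4)
    b8 = (b5 << 3) | (b5 >> 2)
    return (r8, g8, b8)
```

Note the result is a 24-bit frame, but only 65536 of the ~16.7 million representable colours can ever appear, which is why the textures still look 16-bit.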
The second simplest thing to do would be to simply interpolate and downgrade the entire frame's colour gamut to 16-bit depth. Values could span a possible 32-bit gamut/range, but no colour value would be present in the frame that cannot be represented in the 16-bit gamut. This would present the game as close as possible to the original... without dithering... which is what your screenshots seem to suggest 😀
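That second option can be sketched as a per-pixel snap onto the nearest RGB565-representable colour while keeping the frame itself 24-bit. Again this is my illustration of the speculation above, not a known driver implementation.

```python
# Sketch: clamp a 24-bit colour onto the 16-bit (565) gamut by
# rounding each channel to its nearest 5- or 6-bit level, then
# expanding back to 8 bits so the frame buffer stays 24-bit.
# Result: visible banding, exactly as in an undithered 16-bit frame.

def quantise_to_565_gamut(r8, g8, b8):
    r5 = round(r8 / 255 * 31)
    g6 = round(g8 / 255 * 63)
    b5 = round(b8 / 255 * 31)
    return (round(r5 / 31 * 255), round(g6 / 63 * 255), round(b5 / 31 * 255))
```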
Both cases assume that a 32-bit frame buffer WILL be used. This is now the case for all Windows 8.1/10 windowed contexts o.0. Can't confirm any behaviour with full-screen though, which may have more flexibility in choosing pixel formats.
Dithering is simple to implement, but the market for it on x86 workstations/desktops isn't huge (probably pretty much vogon-esque only; I can't think of another market for it on our systems, other than printing (but that requires more flexibility, dithering the image data itself rather than relying on display dithering), and maybe some image analysis may benefit from its signal reduction???), so I can understand why it's not here anymore. And certainly ReShade could do it, and it should look pretty similar (if not the same) using the same dithering algorithm... however, something to maybe note, as mentioned in this Image Quality of various old video cards (Quake 3 comparison): VGA may present a softer image, which means higher-resolution dithering will be a lot more effective on an old CRT through VGA than with DVI/flat-panel crispness.
</pure speculation>
All in all it does present some rather interesting project ideas, which I personally have no time to do. 😊
Can't speak for others, but personally, any time image creation/generation is involved I work in deep colour. This allows multiple avenues for export with a wide range of gamuts/formats, including stuff like HDR. While obviously texture memory footprint/colour depth should be considered, it's never presented me with any problems IRL on desktops. The biggest problems have always been working with non-power-of-two (!power2) textures and limited texture buffer sizes, and even those have been solved in both hardware and software for a while. I'm OpenGL though; DirectX may be different.
wrote:Another interesting thing to think about is Android devices. Some Android games use 16-bit color depth because it helps slow GPUs. I have seen interesting differences between hardware. For example, Intel's Atom Baytrail GPU seems to dither 16-bit color depth. I also have Tegra 2, 4 and K1 devices and the K1 is quite banded with 16-bit color depth. Don't remember what Tegra 2 or 4 look like...
Agree. Given the relative size of the screens and the resolutions involved, 16-bit banding is not so obvious, so you can get away with not worrying about it (unless you're a purist of course o.0), and dithering would be extremely beneficial. Plus, if you know you have a 16-bit limitation on the display output, there is no need for >16-bit textures, and now you can process/load/copy twice as much in the same time while consuming the same amount of power. Battery conservation is important 😀
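The bandwidth/memory argument is just arithmetic; a quick sketch with illustrative numbers (the 1024x1024 texture size is my assumption, not from the thread):

```python
# Back-of-envelope: halving the texture colour depth halves the bytes
# stored, copied, and read -- which is the mobile-GPU win described above.

def texture_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

tex_16 = texture_bytes(1024, 1024, 16)  # 16-bit texture: 2 MiB
tex_32 = texture_bytes(1024, 1024, 32)  # 32-bit texture: 4 MiB
```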